ERIC Educational Resources Information Center
Verger, Antoni; Hermo, Javier Pablo
2010-01-01
The article analyses two processes of higher education regionalisation, MERCOSUR-Educativo in Latin America and the Bologna Process in Europe, from a comparative perspective. The comparative analysis is centered on the content and governance of both processes and, specifically, on the reasons for their uneven evolution and implementation. We…
Criteria for Comparing Domain Analysis Approaches Version 01.00.00
1991-12-01
Contents excerpts: the Down/Bottom-Up Domain Analysis Process (1990 version) and FODA's Domain Analysis Process. FODA uses the Design Approach for Real-Time Systems (DARTS) design method (Gomaa 1984). From the introduction: domain analysis is still immature. From the overview of domain analysis approaches: the FODA report illustrates the process using a window-management example.
A comparative analysis of protected area planning and management frameworks
Per Nilsen; Grant Tayler
1997-01-01
A comparative analysis of the Recreation Opportunity Spectrum (ROS), Limits of Acceptable Change (LAC), a Process for Visitor Impact Management (VIM), Visitor Experience and Resource Protection (VERP), and the Management Process for Visitor Activities (known as VAMP) decision frameworks examines their origins; methodology; use of factors, indicators, and standards;...
FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES
2017-06-01
FINDING A METHOD FOR THE MADNESS: A COMPARATIVE ANALYSIS OF STRATEGIC DESIGN METHODOLOGIES BY AMANDA DONNELLY A THESIS...work develops a comparative model for strategic design methodologies, focusing on the primary elements of vision, time, process, communication and...collaboration, and risk assessment. My analysis dissects and compares three potential design methodologies including, net assessment, scenarios and
Binnicker, M. J.; Jespersen, D. J.; Harring, J. A.; Rollins, L. O.; Bryant, S. C.; Beito, E. M.
2008-01-01
The diagnosis of Lyme borreliosis (LB) is commonly made by serologic testing with Western blot (WB) analysis serving as an important supplemental assay. Although specific, the interpretation of WBs for diagnosis of LB (i.e., Lyme WBs) is subjective, with considerable variability in results. In addition, the processing, reading, and interpretation of Lyme WBs are laborious and time-consuming procedures. With the need for rapid processing and more objective interpretation of Lyme WBs, we evaluated the performances of two automated interpretive systems, TrinBlot/BLOTrix (Trinity Biotech, Carlsbad, CA) and BeeBlot/ViraScan (Viramed Biotech AG, Munich, Germany), using 518 serum specimens submitted to our laboratory for Lyme WB analysis. The results of routine testing with visual interpretation were compared to those obtained by BLOTrix analysis of MarBlot immunoglobulin M (IgM) and IgG and by ViraScan analysis of ViraBlot and ViraStripe IgM and IgG assays. BLOTrix analysis demonstrated an agreement of 84.7% for IgM and 87.3% for IgG compared to visual reading and interpretation. ViraScan analysis of the ViraBlot assays demonstrated agreements of 85.7% for IgM and 94.2% for IgG, while ViraScan analysis of the ViraStripe IgM and IgG assays showed agreements of 87.1 and 93.1%, respectively. Testing by the automated systems yielded an average time savings of 64 min/run compared to processing, reading, and interpretation by our current procedure. Our findings demonstrated that automated processing and interpretive systems yield results comparable to those of visual interpretation, while reducing the subjectivity and time required for Lyme WB analysis. PMID:18463211
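As a minimal illustration of the agreement statistic this abstract reports, the Python sketch below computes percent agreement between visual and automated Western blot interpretations. The specimen labels are hypothetical, not the study's data.

```python
# Minimal sketch: percent agreement between visual and automated
# Western blot interpretations. The example labels are hypothetical.

def percent_agreement(visual, automated):
    """Fraction of specimens with identical interpretations, as a percentage."""
    if len(visual) != len(automated):
        raise ValueError("result lists must be the same length")
    matches = sum(v == a for v, a in zip(visual, automated))
    return 100.0 * matches / len(visual)

# Hypothetical IgM interpretations for five specimens
visual_igm = ["positive", "negative", "negative", "positive", "negative"]
automated_igm = ["positive", "negative", "positive", "positive", "negative"]

print(f"IgM agreement: {percent_agreement(visual_igm, automated_igm):.1f}%")  # 80.0%
```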
ERIC Educational Resources Information Center
Hladka, Halyna
2014-01-01
A comparative analysis of Western and domestic practice in introducing active and interactive methods of study into the teaching of social science disciplines has been carried out. Features, realities, prospects and limitations in the application of interactive teaching methods in the process of implementing social-political science…
Standardization of pitch-range settings in voice acoustic analysis.
Vogel, Adam P; Maruff, Paul; Snyder, Peter J; Mundt, James C
2009-05-01
Voice acoustic analysis is typically a labor-intensive, time-consuming process that requires the application of idiosyncratic parameters tailored to individual aspects of the speech signal. Such processes limit the efficiency and utility of voice analysis in clinical practice as well as in applied research and development. In the present study, we analyzed 1,120 voice files, using standard techniques (case-by-case hand analysis), taking roughly 10 work weeks of personnel time to complete. The results were compared with the analytic output of several automated analysis scripts that made use of preset pitch-range parameters. After pitch windows were selected to appropriately account for sex differences, the automated analysis scripts reduced processing time of the 1,120 speech samples to less than 2.5 h and produced results comparable to those obtained with hand analysis. However, caution should be exercised when applying the suggested preset values to pathological voice populations.
NASA Astrophysics Data System (ADS)
Huang, J. C.; Wright, W. V.
1982-04-01
The Defense Waste Processing Facility (DWPF) for immobilizing nuclear high level waste (HLW) is scheduled to be built. High level waste is produced when reactor components are subjected to chemical separation operations. Two candidates for immobilizing this HLW are borosilicate glass and crystalline ceramic, either being contained in weld sealed stainless steel canisters. A number of technical analyses are being conducted to support a selection between these two waste forms. The risks associated with the manufacture and interim storage of these two forms in the DWPF are compared. Process information used in the risk analysis was taken primarily from a DWPF processibility analysis. The DWPF environmental analysis provided much of the necessary environmental information.
NASA Astrophysics Data System (ADS)
Venkata Subbaiah, K.; Raju, Ch.; Suresh, Ch.
2017-08-01
The present study aims to compare conventional cutting inserts with wiper cutting inserts during the hard turning of AISI 4340 steel at different workpiece hardness levels. Type of insert, hardness, cutting speed, feed, and depth of cut are taken as process parameters. Taguchi's L18 orthogonal array was used to conduct the experimental tests. A parametric analysis was carried out to determine the influence of each process parameter on three important surface roughness characteristics (Ra, Rz, and Rt) and on the material removal rate. Taguchi-based Grey Relational Analysis (GRA) was used to optimize the process parameters for individual-response and multi-response outputs. Additionally, analysis of variance (ANOVA) was applied to identify the most significant factor.
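The grey relational analysis mentioned here follows a standard recipe: normalize each response toward an ideal value, convert deviations into grey relational coefficients, and average them into a grade per run. The sketch below implements that recipe; the response values are hypothetical, not the paper's measurements.

```python
import numpy as np

# Minimal sketch of grey relational analysis (GRA) for multi-response
# optimization. Ra/Rz/Rt are smaller-the-better; MRR is larger-the-better.
# The run data below are hypothetical.

def normalize(col, larger_better):
    lo, hi = col.min(), col.max()
    return (col - lo) / (hi - lo) if larger_better else (hi - col) / (hi - lo)

def grey_relational_grade(responses, larger_better, zeta=0.5):
    # Normalize each response to [0, 1], where 1 is the ideal value
    norm = np.column_stack([normalize(responses[:, j], lb)
                            for j, lb in enumerate(larger_better)])
    delta = 1.0 - norm                       # deviation from the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    return coeff.mean(axis=1)                # equal weights across responses

# rows = experimental runs, columns = (Ra, Rz, Rt, MRR)
runs = np.array([[0.8, 4.1, 5.0, 120.0],
                 [0.6, 3.5, 4.2, 150.0],
                 [1.1, 5.0, 6.1, 180.0]])
grades = grey_relational_grade(runs, larger_better=[False, False, False, True])
print("best run:", int(grades.argmax()) + 1, grades.round(3))
```

The distinguishing coefficient zeta = 0.5 is the conventional default; the run with the highest grade is the multi-response optimum.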
A Comparative Analysis of Extract, Transformation and Loading (ETL) Process
NASA Astrophysics Data System (ADS)
Runtuwene, J. P. A.; Tangkawarow, I. R. H. T.; Manoppo, C. T. M.; Salaki, R. J.
2018-02-01
Data and information are now growing rapidly in volume and across media types, and this growth eventually produces the very large datasets known as Big Data. Business Intelligence (BI) draws on large volumes of data and information for analysis so that important information can be obtained and used to support decision-making. In practice, a process that integrates existing data and information into a data warehouse is needed. This data integration process is known as Extract, Transformation and Loading (ETL). Many applications have been developed to carry out the ETL process, but selecting the application that is most effective and efficient in terms of time, cost and effort can be a challenge. The objective of this study was therefore to provide a comparative analysis of the ETL process as implemented in Microsoft SQL Server Integration Services (SSIS) and in Pentaho Data Integration (PDI).
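For readers unfamiliar with the three ETL stages being compared, here is a minimal Python sketch of the core steps. The file and table names are hypothetical, and this is not how SSIS or PDI are invoked; real jobs wrap the same core in scheduling, logging and error handling.

```python
import csv
import sqlite3

# Minimal ETL sketch: extract rows from a CSV source, transform them,
# and load them into a warehouse table. Names are hypothetical.

def extract(path):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(rows):
    # Example transformation: normalize names, drop rows missing a figure
    return [{"region": r["region"].strip().title(),
             "sales": float(r["sales"])}
            for r in rows if r.get("sales")]

def load(rows, db_path="warehouse.db"):
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, sales REAL)")
    con.executemany("INSERT INTO sales VALUES (:region, :sales)", rows)
    con.commit()
    con.close()

load(transform(extract("source_data.csv")))
```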
Production cost comparisons of hydrogen from fossil and nuclear fuel and water decomposition
NASA Technical Reports Server (NTRS)
Ekman, K. R.
1981-01-01
The comparative costs entailed in producing hydrogen by major technologies that rely on petroleum, natural gas, coal, thermochemical cycles, and electrolysis are examined. Techniques were developed for comparing these processes by formulating the process data and economic assessments on a uniform and consistent basis. These data were normalized to permit a meaningful comparative analysis of product costs of these processes.
Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.
Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias
2016-01-01
To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
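The two approaches this abstract compares can be contrasted in a few lines of code: a Shewhart individuals chart flags post-intervention points beyond control limits, while segmented regression estimates level and slope changes directly. The sketch below uses synthetic monthly data and a hypothetical intervention point.

```python
import numpy as np

# Minimal sketch contrasting SPC and interrupted time series on a
# monthly quality indicator. Data and intervention month are synthetic.

rng = np.random.default_rng(0)
pre = rng.normal(50, 5, 24)             # 24 months before the intervention
post = rng.normal(44, 5, 24)            # 24 months after
y = np.concatenate([pre, post])
t = np.arange(len(y))

# --- Statistical process control: individuals (Shewhart) chart limits
# from the pre-intervention baseline; post points beyond 3 sigma signal
# a special cause.
center, sigma = pre.mean(), pre.std(ddof=1)
signals = np.where(np.abs(post - center) > 3 * sigma)[0] + len(pre)

# --- Interrupted time series: segmented regression with a level-change
# term (post indicator) and a slope-change term.
post_ind = (t >= len(pre)).astype(float)
time_after = np.maximum(0, t - len(pre))
X = np.column_stack([np.ones_like(t), t, post_ind, time_after])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

print("SPC signals at months:", signals)
print("estimated level change:", round(beta[2], 2))
```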
Research on the raw data processing method of the hydropower construction project
NASA Astrophysics Data System (ADS)
Tian, Zhichao
2018-01-01
In this paper, based on the characteristics of fixed quota data, various mathematical-statistical analysis methods are compared and the improved Grubbs criterion is chosen to analyze the data; through this data processing, unsuitable data are identified and eliminated. It is shown that the method can be applied to the processing of fixed raw data, providing a reference for reasonably determining effective quota analysis data.
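The screening step described above can be illustrated with an iterated Grubbs test: repeatedly test the point farthest from the mean against the Grubbs critical value and remove it if it fails. The sample values and significance level below are illustrative, not the paper's data.

```python
import numpy as np
from scipy import stats

# Minimal sketch of an iterated Grubbs test for screening raw quota data.
# Sample values and alpha are illustrative.

def grubbs_screen(data, alpha=0.05):
    x = np.asarray(data, dtype=float)
    outliers = []
    while x.size > 2:
        mean, sd = x.mean(), x.std(ddof=1)
        idx = np.abs(x - mean).argmax()      # most extreme observation
        g = abs(x[idx] - mean) / sd
        n = x.size
        t = stats.t.ppf(1 - alpha / (2 * n), n - 2)
        g_crit = (n - 1) / np.sqrt(n) * np.sqrt(t**2 / (n - 2 + t**2))
        if g <= g_crit:
            break                            # no remaining outliers
        outliers.append(x[idx])
        x = np.delete(x, idx)
    return x, outliers

clean, rejected = grubbs_screen([10.1, 9.8, 10.3, 10.0, 14.9, 9.9])
print("rejected:", rejected)   # the 14.9 reading is flagged
```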
The digital storytelling process: A comparative analysis from various experts
NASA Astrophysics Data System (ADS)
Hussain, Hashiroh; Shiratuddin, Norshuhada
2016-08-01
Digital Storytelling (DST) is a method of delivering information to an audience that combines narrative with digital media content infused with multimedia elements. To help educators (i.e., the designers) create a compelling digital story, experts have introduced sets of processes; however, the processes they suggest vary, and some overlap. The main aim of this study is to propose a single guiding process for the creation of DST. A comparative analysis is employed in which ten DST models from various experts are analysed. The resulting process can also be applied to other multimedia materials that use the concept of DST.
NASA Technical Reports Server (NTRS)
Mausel, P. W.; Todd, W. J.; Baumgardner, M. F.
1976-01-01
A successful application of state-of-the-art remote sensing technology to classifying an urban area into its broad land use classes is reported. This research demonstrates that numerous urban features are amenable to classification using ERTS multispectral data automatically processed by computer. Furthermore, such automatic data processing (ADP) techniques permit areal analysis on an unprecedented scale with a minimum expenditure of time, and classification results obtained using ADP procedures are consistent, comparable, and replicable. The classification results are compared with the proposed U.S.G.S. land use classification system in order to determine the level of classification that is feasible to obtain through ERTS analysis of metropolitan areas.
Comparing the High School English Curriculum in Turkey through Multi-Analysis
ERIC Educational Resources Information Center
Batdi, Veli
2017-01-01
This study aimed to compare the High School English Curriculum (HSEC) in accordance with Stufflebeam's context, input, process and product (CIPP) model through multi-analysis. The research includes both quantitative and qualitative aspects. A descriptive analysis was operated through Rasch Measurement Model; SPSS program for the quantitative…
Analysis of energy recovery potential using innovative technologies of waste gasification.
Lombardi, Lidia; Carnevale, Ennio; Corti, Andrea
2012-04-01
In this paper, two alternative thermo-chemical processes for waste treatment were analysed: high-temperature gasification and gasification combined with a plasma process. The two processes were analysed from the thermodynamic point of view by constructing two simplified models, using appropriate simulation tools and supporting data from existing/planned plants, able to predict the energy recovery performance of each process. To carry out a comparative analysis, the same waste stream was considered as input to the two models and the generated results were compared. The performances were also compared with those obtainable from conventional combustion with energy recovery by means of a steam turbine cycle. Results are reported in terms of energy recovery performance indicators: overall energy efficiency, specific energy production per unit mass of entering waste, primary energy source savings, and specific carbon dioxide production. Copyright © 2011 Elsevier Ltd. All rights reserved.
Strategic and Market Analysis | Bioenergy | NREL
Describes recent efforts in comparative techno-economic analysis, considering a wide range of conversion intermediates. NREL has developed first-of-its-kind process models and economic assessments of co-processing; this work strives to understand the economic incentives, technical risks, and key data gaps that need to be addressed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Johnson, Kenneth I.; Khaleel, Mohammad A.
2003-04-01
This paper employs an inverse approach (IA) formulation for the analysis of tubes under free hydroforming conditions. The IA formulation is derived from that of Guo et al. established for flat sheet hydroforming analysis using constant-strain triangular membrane elements. At first, an incremental analysis of free hydroforming for a hot-dip galvanized (HG/Z140) DP600 tube is performed using the finite element Marc code. The deformed geometry obtained at the last converged increment is then used as the final configuration in the inverse analysis. This comparative study allows us to assess the predicting capability of the inverse analysis. The results will be compared with the experimental values determined by Asnafi and Skogsgardh. After that, a procedure based on a forming limit diagram (FLD) is proposed to adjust the process parameters such as the axial feed and internal pressure. Finally, the adjustment process is illustrated through a re-analysis of the same tube using the inverse approach.
NASA Technical Reports Server (NTRS)
Wolf, M.
1979-01-01
To facilitate the task of objectively comparing competing process options, a methodology was needed for the quantitative evaluation of their relative cost effectiveness. Such a methodology was developed and is described, together with three examples for its application. The criterion for the evaluation is the cost of the energy produced by the system. The method permits the evaluation of competing design options for subsystems, based on the differences in cost and efficiency of the subsystems, assuming comparable reliability and service life, or of competing manufacturing process options for such subsystems, which include solar cells or modules. This process option analysis is based on differences in cost, yield, and conversion efficiency contribution of the process steps considered.
A comparative analysis of business process modelling techniques
NASA Astrophysics Data System (ADS)
Tangkawarow, I. R. H. T.; Waworuntu, J.
2016-04-01
Many business process modelling techniques are now in use. This article examines the differences among them, explaining the definition and structure of each technique. It presents a comparative analysis of some popular business process modelling techniques based on two criteria: the notation, and how each technique works when implemented in Somerleyton Animal Park. Each technique's description ends with its advantages and disadvantages. The conclusion recommends business process modelling techniques that are easy to use and serves as the basis for evaluating further modelling techniques.
A comparative analysis of selected wastewater pretreatment processes in food industry
NASA Astrophysics Data System (ADS)
Jaszczyszyn, Katarzyna; Góra, Wojciech; Dymaczewski, Zbysław; Borowiak, Robert
2018-02-01
The article presents a comparative analysis of the classical coagulation with the iron sulphate and adsorption on bentonite for the pretreatment of wastewater in the food industry. As a result of the studies, chemical oxygen demand (COD) and total nitrogen (TN) reduction were found to be comparable in both technologies, and a 29% higher total phosphorus removal efficiency by the coagulation was observed. After the coagulation and adsorption processes, a significant difference between mineral and organic fraction in the sludge was found (49% and 51% for bentonite and 28% and 72% for iron sulphate, respectively).
Waste water processing technology for Space Station Freedom - Comparative test data analysis
NASA Technical Reports Server (NTRS)
Miernik, Janie H.; Shah, Burt H.; Mcgriff, Cindy F.
1991-01-01
Comparative tests were conducted to choose the optimum technology for waste water processing on SSF. A thermoelectric integrated membrane evaporation (TIMES) subsystem and a vapor compression distillation subsystem (VCD) were built and tested to compare urine processing capability. Water quality, performance, and specific energy were compared for conceptual designs intended to function as part of the water recovery and management system of SSF. The VCD is considered the most mature and efficient technology and was selected to replace the TIMES as the baseline urine processor for SSF.
Derakhshanrad, Seyed Alireza; Piven, Emily; Ghoochani, Bahareh Zeynalzadeh
2017-10-01
Walter J. Freeman pioneered the neurodynamic model of brain activity when he described the brain dynamics for cognitive information transfer as the process of circular causality at intention, meaning, and perception (IMP) levels. This view contributed substantially to the establishment of the Intention, Meaning, and Perception Model of Neuro-occupation in occupational therapy. As described by the model, IMP levels are three components of the brain dynamics system, with nonlinear connections that enable cognitive function to be processed in a circular causality fashion, known as the Cognitive Process of Circular Causality (CPCC). Although considerable research has been devoted to studying brain dynamics with sophisticated computerized imaging techniques, less attention has been paid to studying it through the adaptation process of thoughts and behaviors. To explore how CPCC manifested thinking and behavioral patterns, a qualitative case study was conducted on two matched female participants with strokes, who were of comparable ages, affected sides, and other characteristics, except for their resilience and motivational behaviors. CPCC was compared by matrix analysis between the two participants, using content analysis with pre-determined categories. Different patterns of thinking and behavior may have occurred due to disparate regulation of CPCC between the two participants.
Real-time fMRI processing with physiological noise correction - Comparison with off-line analysis.
Misaki, Masaya; Barzigar, Nafise; Zotev, Vadim; Phillips, Raquel; Cheng, Samuel; Bodurka, Jerzy
2015-12-30
While applications of real-time functional magnetic resonance imaging (rtfMRI) are growing rapidly, there are still limitations in real-time data processing compared to off-line analysis. We developed a proof-of-concept real-time fMRI processing (rtfMRIp) system utilizing a personal computer (PC) with a dedicated graphic processing unit (GPU) to demonstrate that it is now possible to perform intensive whole-brain fMRI data processing in real-time. The rtfMRIp performs slice-timing correction, motion correction, spatial smoothing, signal scaling, and general linear model (GLM) analysis with multiple noise regressors including physiological noise modeled with cardiac (RETROICOR) and respiration volume per time (RVT). The whole-brain data analysis with more than 100,000 voxels and more than 250 volumes is completed in less than 300 ms, much faster than the time required to acquire the fMRI volume. Real-time processing implementation cannot be identical to off-line analysis when time-course information is used, such as in slice-timing correction, signal scaling, and GLM. We verified that reduced slice-timing correction for real-time analysis had comparable output with off-line analysis. The real-time GLM analysis, however, showed over-fitting when the number of sampled volumes was small. Our system implemented real-time RETROICOR and RVT physiological noise corrections for the first time and it is capable of processing these steps on all available data at a given time, without need for recursive algorithms. Comprehensive data processing in rtfMRI is possible with a PC, while the number of samples should be considered in real-time GLM. Copyright © 2015 Elsevier B.V. All rights reserved.
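The GLM step described above amounts to regressing each voxel's time course on a task regressor plus nuisance regressors. The sketch below shows that core by ordinary least squares; all regressors are synthetic stand-ins, since real RETROICOR/RVT regressors are built from recorded cardiac and respiratory traces.

```python
import numpy as np

# Minimal sketch of a GLM with nuisance regressors, solved by OLS.
# Task, motion and physiological regressors are synthetic stand-ins.

rng = np.random.default_rng(1)
n_vol, n_vox = 250, 1000
task = np.tile([0.0] * 10 + [1.0] * 10, n_vol // 20 + 1)[:n_vol]  # block design
motion = rng.normal(size=(n_vol, 6))          # 6 rigid-body parameters
physio = rng.normal(size=(n_vol, 4))          # stand-in RETROICOR/RVT terms

X = np.column_stack([np.ones(n_vol), task, motion, physio])
Y = rng.normal(size=(n_vol, n_vox))           # voxel time courses
Y += 0.5 * task[:, None]                      # inject a known task effect

beta, *_ = np.linalg.lstsq(X, Y, rcond=None)  # (regressors x voxels)
print("mean task beta:", beta[1].mean().round(2))   # recovers ~0.5
```

With few volumes the design matrix has almost as many columns as rows, which is exactly the over-fitting regime the abstract warns about.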
Hatz, F; Hardmeier, M; Bousleiman, H; Rüegg, S; Schindler, C; Fuhr, P
2015-02-01
To compare the reliability of a newly developed Matlab® toolbox for the fully automated pre- and post-processing of resting-state EEG (automated analysis, AA) with the reliability of analysis involving visually controlled pre- and post-processing (VA). 34 healthy volunteers (median age 38.2 years, range 20-49; 82% female) had three consecutive 256-channel resting-state EEGs at one-year intervals. Results of frequency analysis by AA and VA were compared with Pearson correlation coefficients, and reliability over time was assessed with intraclass correlation coefficients (ICC). The mean correlation coefficient between AA and VA was 0.94±0.07; the mean ICC was 0.83±0.05 for AA and 0.84±0.07 for VA. AA and VA yield very similar results for spectral EEG analysis and are equally reliable. AA is less time-consuming, completely standardized, and independent of raters and their training. Automated processing of EEG facilitates workflow in quantitative EEG analysis. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
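The two statistics used here can be computed directly: Pearson r for agreement between pipelines, and a two-way random-effects ICC(2,1) for stability across the yearly sessions. The sketch below uses synthetic spectral-power values, not the study's data.

```python
import numpy as np

# Minimal sketch: Pearson r between two pipelines, and ICC(2,1) across
# three yearly sessions. All values are synthetic.

def icc_2_1(m):
    """Two-way random-effects, absolute-agreement, single-measure ICC."""
    n, k = m.shape                        # subjects x sessions
    grand = m.mean()
    msr = k * ((m.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # rows
    msc = n * ((m.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # columns
    sse = ((m - m.mean(axis=1, keepdims=True)
              - m.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

rng = np.random.default_rng(2)
subject_level = rng.normal(10, 2, size=(34, 1))          # stable trait
aa = subject_level + rng.normal(0, 0.8, size=(34, 3))    # 3 yearly EEGs, AA
va = aa + rng.normal(0, 0.3, size=(34, 3))               # VA tracks AA closely

r = np.corrcoef(aa.ravel(), va.ravel())[0, 1]
print(f"AA vs VA Pearson r = {r:.2f}, ICC(AA) = {icc_2_1(aa):.2f}")
```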
Applications of satellite image processing to the analysis of Amazonian cultural ecology
NASA Technical Reports Server (NTRS)
Behrens, Clifford A.
1991-01-01
This paper examines the application of satellite image processing towards identifying and comparing resource exploitation among indigenous Amazonian peoples. The use of statistical and heuristic procedures for developing land cover/land use classifications from Thematic Mapper satellite imagery will be discussed along with actual results from studies of relatively small (100 - 200 people) settlements. Preliminary research indicates that analysis of satellite imagery holds great potential for measuring agricultural intensification, comparing rates of tropical deforestation, and detecting changes in resource utilization patterns over time.
Comparative study of submerged and surface culture acetification process for orange vinegar.
Cejudo-Bastante, Cristina; Durán-Guerrero, Enrique; García-Barroso, Carmelo; Castro-Mejías, Remedios
2018-02-01
The two main acetification methodologies generally employed in the production of vinegar (surface and submerged cultures) were studied and compared for the production of orange vinegar. Polyphenols (UPLC/DAD) and volatile compounds (SBSE-GC/MS) were considered as the main variables in the comparative study, and the sensory characteristics of the obtained vinegars were also evaluated. Seventeen polyphenols and 24 volatile compounds were determined in the samples during both acetification processes. For phenolic compounds, analysis of variance showed significantly higher concentrations when surface culture acetification was employed. However, for the majority of volatile compounds, higher contents were observed for the submerged culture acetification process; this was also reflected in the sensory analysis, with higher scores for the different descriptors. Multivariate statistical analysis, such as principal component analysis, demonstrated the possibility of discriminating the samples according to the type of acetification process. Polyphenols such as an apigenin derivative or ferulic acid and volatile compounds such as 4-vinylguaiacol, decanoic acid, nootkatone, trans-geraniol, β-citronellol or α-terpineol, among others, were the compounds that contributed most to the discrimination of the samples. The acetification process employed in the production of orange vinegar was demonstrated to be very significant for the final characteristics of the vinegar obtained, so it must be carefully controlled to obtain high-quality products. © 2017 Society of Chemical Industry.
Illeghems, Koen; De Vuyst, Luc; Weckx, Stefan
2015-10-12
Lactobacillus fermentum 222 and Lactobacillus plantarum 80, isolates from a spontaneous Ghanaian cocoa bean fermentation process, proved to be interesting functional starter culture strains for cocoa bean fermentations. Lactobacillus fermentum 222 is a thermotolerant strain, able to dominate the fermentation process, thereby converting citrate and producing mannitol. Lactobacillus plantarum 80 is an acid-tolerant and facultative heterofermentative strain that is competitive during cocoa bean fermentation processes. In this study, whole-genome sequencing and comparative genome analysis were used to investigate the mechanisms by which these strains dominate the cocoa bean fermentation process. Through functional annotation and analysis of the high-coverage contigs obtained through 454 pyrosequencing, plantaricin production was predicted for L. plantarum 80. For L. fermentum 222, genes encoding a complete arginine deiminase pathway were attributed. Further, in-depth functional analysis revealed the capacities of these strains associated with carbohydrate and amino acid metabolism, such as the ability to use alternative external electron acceptors, the presence of an extended pyruvate metabolism, and the occurrence of several amino acid conversion pathways. A comparative genome sequence analysis using publicly available genome sequences of strains of the species L. plantarum and L. fermentum revealed unique features of both strains studied. Indeed, L. fermentum 222 possessed genes encoding additional citrate transporters and enzymes involved in amino acid conversions, whereas L. plantarum 80 is the only member of this species that harboured a gene cluster involved in uptake and consumption of fructose and/or sorbose. In-depth genome sequence analysis of the candidate functional starter culture strains L. fermentum 222 and L. plantarum 80 revealed their metabolic capacities, niche adaptations and functionalities that enable them to dominate the cocoa bean fermentation process. Further, these results offered insights into the cocoa bean fermentation ecosystem as a whole and will facilitate the selection of appropriate starter culture strains for controlled cocoa bean fermentation processes.
NASA Astrophysics Data System (ADS)
Hassan, Said A.; Abdel-Gawad, Sherif A.
2018-02-01
Two signal processing methods, Continuous Wavelet Transform (CWT) and Discrete Fourier Transform (DFT), were introduced as alternatives to classical Derivative Spectrophotometry (DS) in the analysis of binary mixtures. To show the advantages of these methods, a comparative study was performed on a binary mixture of Naltrexone (NTX) and Bupropion (BUP). The methods were compared by analyzing laboratory-prepared mixtures of the two drugs. The comparison showed that the CWT and DFT methods are more efficient and advantageous than DS in the analysis of mixtures with overlapped spectra. The three signal processing methods were adopted for the quantification of NTX and BUP in pure and tablet forms, and were validated according to the ICH guideline, where accuracy, precision and specificity were found to be within appropriate limits.
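A rough illustration of the DFT idea used for overlapped spectra: Fourier-transform the absorbance spectrum, keep only the low-frequency coefficients, and reconstruct a smoothed signal before quantification. The two Gaussian bands below are synthetic, not NTX/BUP spectra, and the cutoff index is arbitrary.

```python
import numpy as np

# Minimal sketch of DFT-based processing of an overlapped spectrum:
# low-pass filtering in the Fourier domain. Bands are synthetic.

wl = np.linspace(200, 400, 512)                       # wavelength grid, nm
band = lambda c, w: np.exp(-((wl - c) / w) ** 2)      # Gaussian absorbance band
spectrum = 0.7 * band(282, 18) + 0.4 * band(298, 20)  # two overlapped bands
noisy = spectrum + np.random.default_rng(3).normal(0, 0.01, wl.size)

coeffs = np.fft.rfft(noisy)
coeffs[40:] = 0                                       # discard high frequencies
smoothed = np.fft.irfft(coeffs, n=wl.size)            # reconstructed spectrum

print("max reconstruction error:", float(np.abs(smoothed - spectrum).max()))
```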
Participatory Data Analysis Alongside Co-Researchers Who Have Down Syndrome
ERIC Educational Resources Information Center
Stevenson, Miriam
2014-01-01
Background: There is comparatively little published research that transparently charts the contribution of people with an intellectual disability to a collaborative research process. This paper illustrates the process of data analysis in a project located within the Emancipatory Disability Research (EDR) paradigm. Materials and Methods: Textual…
ERIC Educational Resources Information Center
Liu, Shujie; Xu, Xianxuan; Grant, Leslie; Strong, James; Fang, Zheng
2017-01-01
This article presents the results of an interpretive policy analysis of China's Ministry of Education Standards (2013) for the professional practice of principals. In addition to revealing the evolution of the evaluation of principals in China and the processes by which this policy is formulated, a comparative analysis was conducted to compare it…
Zeng, Rui; Fu, Juan; Wu, La-Bin; Huang, Lin-Fang
2013-07-01
Components of Citrus reticulata and salt-processed C. reticulata were analyzed by ultra-performance liquid chromatography coupled with quadrupole time-of-flight mass spectrometry (UPLC-Q-TOF/MS), and the changes in components before and after salt processing were compared. Principal component analysis (PCA) and orthogonal partial least squares discriminant analysis (OPLS-DA) were adopted to analyze the differences in fingerprints between crude and processed C. reticulata, showing increased contents of eriocitrin, limonin, nomilin and obacunone in salt-processed C. reticulata. Potential chemical markers were identified as limonin, obacunone and nomilin, which could be used as index components for distinguishing crude from processed C. reticulata.
Job optimization in ATLAS TAG-based distributed analysis
NASA Astrophysics Data System (ADS)
Mambelli, M.; Cranshaw, J.; Gardner, R.; Maeno, T.; Malon, D.; Novak, M.
2010-04-01
The ATLAS experiment is projected to collect over one billion events/year during the first few years of operation. The efficient selection of events for various physics analyses across all appropriate samples presents a significant technical challenge. ATLAS computing infrastructure leverages the Grid to tackle the analysis across large samples by organizing data into a hierarchical structure and exploiting distributed computing to churn through the computations. This includes events at different stages of processing: RAW, ESD (Event Summary Data), AOD (Analysis Object Data), DPD (Derived Physics Data). Event Level Metadata Tags (TAGs) contain information about each event stored using multiple technologies accessible by POOL and various web services. This allows users to apply selection cuts on quantities of interest across the entire sample to compile a subset of events that are appropriate for their analysis. This paper describes new methods for organizing jobs using the TAGs criteria to analyze ATLAS data. It further compares different access patterns to the event data and explores ways to partition the workload for event selection and analysis. Here analysis is defined as a broader set of event processing tasks including event selection and reduction operations ("skimming", "slimming" and "thinning") as well as DPD making. Specifically it compares analysis with direct access to the events (AOD and ESD data) to access mediated by different TAG-based event selections. We then compare different ways of splitting the processing to maximize performance.
A comparative analysis of frequency modulation threshold extension techniques
NASA Technical Reports Server (NTRS)
Arndt, G. D.; Loch, F. J.
1970-01-01
FM threshold extension techniques for improving system performance are compared, including impulse noise elimination, correlation detection, and delta modulation signal processing techniques implemented at the demodulator output.
Achieving Methodological Alignment When Combining QCA and Process Tracing in Practice
ERIC Educational Resources Information Center
Beach, Derek
2018-01-01
This article explores the practical challenges one faces when combining qualitative comparative analysis (QCA) and process tracing (PT) in a manner that is consistent with their underlying assumptions about the nature of causal relationships. While PT builds on a mechanism-based understanding of causation, QCA as a comparative method makes claims…
ERIC Educational Resources Information Center
Morozov, Andrew; Kilgore, Deborah; Atman, Cynthia
2007-01-01
In this study, the authors used two methods for analyzing expert data: verbal protocol analysis (VPA) and narrative analysis. VPA has been effectively used to describe the design processes employed by engineering students, expert designers, and expert-novice comparative research. VPA involves asking participants to "think aloud" while…
Consolidation of materials by pulse-discharge processes
NASA Astrophysics Data System (ADS)
Strizhakov, E. L.; Nescoromniy, S. V.
2017-07-01
The article presents research on and analysis of pulse-discharge consolidation processes: capacitor-discharge (CD) stud welding, capacitor discharge percussion welding (CDPW), high-voltage capacitor welding with an inductive-dynamic drive (HVCW with IDD), and pulse electric current sintering (PECS) of powders. A comparative analysis of the impact parameters is presented.
Comparative Analysis of InSAR Digital Surface Models for Test Area Bucharest
NASA Astrophysics Data System (ADS)
Dana, Iulia; Poncos, Valentin; Teleaga, Delia
2010-03-01
This paper presents the results of the interferometric processing of ERS Tandem, ENVISAT and TerraSAR-X data for digital surface model (DSM) generation. The selected test site is Bucharest (Romania), a built-up area characterized by the usual complex urban pattern: a mixture of buildings with different height levels, paved roads, vegetation, and water bodies. First, the DSMs were generated following the standard interferometric processing chain. Then, the accuracy of the DSMs was analyzed against the SPOT HRS model (30 m resolution at the equator). A DSM derived by optical stereoscopic processing of SPOT 5 HRG data and also the SRTM (3 arc seconds resolution at the equator) DSM have been included in the comparative analysis.
Process fault detection and nonlinear time series analysis for anomaly detection in safeguards
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burr, T.L.; Mullen, M.F.; Wangen, L.E.
In this paper we discuss two advanced techniques, process fault detection and nonlinear time series analysis, and apply them to the analysis of vector-valued and single-valued time-series data. We investigate model-based process fault detection methods for analyzing simulated, multivariate, time-series data from a three-tank system. The model predictions are compared with simulated measurements of the same variables to form residual vectors that are tested for the presence of faults (possible diversions in safeguards terminology). We evaluate two methods, testing all individual residuals with a univariate z-score and testing all variables simultaneously with the Mahalanobis distance, for their ability to detect loss of material in two different leak scenarios from the three-tank system: a leak without and with replacement of the lost volume. Nonlinear time-series analysis tools were compared with the linear methods popularized by Box and Jenkins. We compare prediction results using three nonlinear and two linear modeling methods on each of six simulated time series: two nonlinear and four linear. The nonlinear methods performed better at predicting the nonlinear time series and did as well as the linear methods at predicting the linear values.
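The difference between the two residual tests is easy to show in code: when residuals are correlated, a fault can look unremarkable variable-by-variable yet be far from the normal cloud as a vector. The residuals below are synthetic, not the three-tank simulation.

```python
import numpy as np
from scipy.stats import chi2

# Minimal sketch: univariate z-scores versus Mahalanobis distance on
# residual vectors. Residuals are synthetic and strongly correlated.

rng = np.random.default_rng(4)
base = rng.normal(size=500)
# two strongly correlated tank-level residuals under fault-free operation
R = np.column_stack([base, base + 0.1 * rng.normal(size=500)])
mu, sd = R.mean(axis=0), R.std(axis=0, ddof=1)
cov_inv = np.linalg.inv(np.cov(R, rowvar=False))

r_fault = np.array([1.0, -1.0])        # breaks the correlation structure
z = np.abs((r_fault - mu) / sd)        # each variable looks ordinary
d = r_fault - mu
d2 = d @ cov_inv @ d                   # squared Mahalanobis distance

print("max |z| =", z.max().round(2), "-> no univariate alarm at 3 sigma")
print("Mahalanobis d^2 =", d2.round(1),
      "vs chi-square cutoff", chi2.ppf(0.999, df=2).round(1))
```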
Li, Wenhua; Yang, Bin; Zhou, Dongmei; Xu, Jun; Ke, Zhi; Suen, Wen-Chen
2016-07-01
Liquid chromatography mass spectrometry (LC-MS) is the most commonly used technique for the characterization of antibody variants. MAb-X and mAb-Y are two approved IgG1 subtype monoclonal antibody drugs recombinantly produced in Chinese hamster ovary (CHO) cells. We report here that two unexpected and rare antibody variants have been discovered during cell culture process development of biosimilars for these two approved drugs through intact mass analysis. We then used comprehensive mass spectrometry-based comparative analysis including reduced light, heavy chains, and domain-specific mass as well as peptide mapping analysis to fully characterize the observed antibody variants. The "middle-up" mass comparative analysis demonstrated that the antibody variant from mAb-X biosimilar candidate was caused by mass variation of antibody crystalline fragment (Fc), whereas a different variant with mass variation in antibody antigen-binding fragment (Fab) from mAb-Y biosimilar candidate was identified. Endoproteinase Lys-C digested peptide mapping and tandem mass spectrometry analysis further revealed that a leucine to glutamine change in N-terminal 402 site of heavy chain was responsible for the generation of mAb-X antibody variant. Lys-C and trypsin coupled non-reduced and reduced peptide mapping comparative analysis showed that the formation of the light-heavy interchain trisulfide bond resulted in the mAb-Y antibody variant. These two cases confirmed that mass spectrometry-based comparative analysis plays a critical role for the characterization of monoclonal antibody variants, and biosimilar developers should start with a comprehensive structural assessment and comparative analysis to decrease the risk of the process development for biosimilars. Copyright © 2016 Elsevier B.V. All rights reserved.
Noise removal using factor analysis of dynamic structures: application to cardiac gated studies.
Bruyant, P P; Sau, J; Mallet, J J
1999-10-01
Factor analysis of dynamic structures (FADS) facilitates the extraction of relevant data, usually with physiologic meaning, from a dynamic set of images. The result of this process is a set of factor images and curves plus some residual activity. The set of factor images and curves can be used to retrieve the original data with reduced noise using an inverse factor analysis process (iFADS). This improvement in image quality is expected because the inverse process does not use the residual activity, which is assumed to consist of noise. The goal of this work is to quantitate and assess the efficiency of this method on gated cardiac images. A computer simulation of a planar cardiac gated study was performed. The simulated images were combined with noise and processed by the FADS-iFADS program. The signal-to-noise ratios (SNRs) were compared between original and processed data. Planar gated cardiac studies from 10 patients were tested. The data processed by FADS-iFADS were subtracted from the original data, and the result of the subtraction was examined to evaluate its noisy nature. The SNR is about five times greater after the FADS-iFADS process. The difference between original and processed data is noise only, i.e., processed data equal original data minus some white noise. The FADS-iFADS process succeeds in removing an important part of the noise and is therefore a tool to improve the image quality of cardiac images. This tool does not decrease the spatial resolution (compared with smoothing filters) and does not lose details (compared with frequential filters). Once the number of factors is chosen, this method is not operator dependent.
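A hedged analogue of the FADS-iFADS round trip: represent the dynamic series as a pixels-by-frames matrix, keep a few factors, and rebuild the series without the residual. True FADS adds non-negativity and rotation constraints; plain truncated SVD is used below only to illustrate why discarding the residual raises SNR. The data are synthetic.

```python
import numpy as np

# Low-rank reconstruction as a stand-in for FADS -> iFADS denoising.
# Two synthetic "physiologic" time curves mixed into random factor images.

rng = np.random.default_rng(5)
frames, pixels = 16, 64 * 64
t = np.linspace(0, 2 * np.pi, frames)
f1, f2 = 1 + np.sin(t), 1 + np.cos(t)              # two factor curves
a1, a2 = rng.random(pixels), rng.random(pixels)    # two factor images
clean = np.outer(a1, f1) + np.outer(a2, f2)        # (pixels x frames)
noisy = clean + rng.normal(0, 0.3, clean.shape)

U, s, Vt = np.linalg.svd(noisy, full_matrices=False)
k = 2                                              # number of factors kept
denoised = (U[:, :k] * s[:k]) @ Vt[:k]             # residual discarded

snr = lambda x: clean.std() / (x - clean).std()
print(f"SNR noisy: {snr(noisy):.1f}, SNR denoised: {snr(denoised):.1f}")
```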
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miebach, Barbara; McDuffie, Dwayne; Spiry, Irina
The objective of this project is to design and build a bench-scale process for a novel phase-changing CO2 capture solvent. The project will establish scalability and technical and economic feasibility of using a phase-changing CO2 capture absorbent for post-combustion capture of CO2 from coal-fired power plants with 90% capture efficiency and 95% CO2 purity at a cost of $40/tonne of CO2 captured by 2025 and a cost of <$10/tonne of CO2 captured by 2035. This report presents system and economic analysis for a process that uses a phase-changing aminosilicone solvent to remove CO2 from pulverized coal (PC) power plant flue gas. The aminosilicone solvent is pure 1,3-bis(3-aminopropyl)-1,1,3,3-tetramethyldisiloxane (GAP-0). Performance of the phase-changing aminosilicone technology is compared to that of a conventional carbon capture system using aqueous monoethanolamine (MEA). This analysis demonstrates that the aminosilicone process has significant advantages relative to an MEA-based system. The first-year CO2 removal cost for the phase-changing CO2 capture process is $52.1/tonne, compared to $66.4/tonne for the aqueous amine process. The phase-changing CO2 capture process is less costly than MEA because of advantageous solvent properties that include higher working capacity, lower corrosivity, lower vapor pressure, and lower heat capacity. The phase-changing aminosilicone process has approximately 32% lower equipment capital cost compared to that of the aqueous amine process. However, this solvent is susceptible to thermal degradation at CSTR desorber operating temperatures, which could add as much as $88/tonne to the CO2 capture cost associated with solvent makeup. Future work is focused on mitigating this critical risk by developing an advanced low-temperature desorber that can deliver comparable desorption performance and a significantly reduced thermal degradation rate.
Emotional loneliness in sexual murderers: a qualitative analysis.
Milsom, Jacci; Beech, Anthony R; Webster, Stephen D
2003-10-01
This study compared levels of emotional loneliness between sexual murderers and rapists who had not gone on to kill their victim/s. All participants were life-sentenced prisoners in the United Kingdom. Assessment consisted of a semistructured interview and was subjected to grounded theory analysis. This approach is defined as the breaking down, naming, comparing, and categorizing of data; as such, it is distinguished from other qualitative methods by the process of constant comparison. This continual sifting and comparing of elements assists in promoting conceptual and theoretical development. The results of this process showed that sexual murderers, compared to rapists, reported significantly higher levels of grievance towards females in childhood, significantly higher levels of peer-group loneliness in adolescence, and significantly higher levels of self-as-victim in adulthood.
ERIC Educational Resources Information Center
Ma, Dongmei; Yu, Xiaoru; Zhang, Haomin
2017-01-01
The present study aimed to investigate second language (L2) word-level and sentence-level automatic processing among English as a foreign language students through a comparative analysis of students with different proficiency levels. As a multidimensional and dynamic construct, automaticity is conceptualized as processing speed, stability, and…
Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin
2014-03-01
A promising process analytical technology (PAT) tool has been introduced for batch processes monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The obtained multivariate data were analyzed by a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R(2) and Q(2). Accuracy and diagnosis capability of the batch model were then validated by the remaining batches. Assisted with high performance liquid chromatography (HPLC) determination, process faults were explained by corresponding variable contributions. Furthermore, a batch level model was developed to compare and assess the model performance. The present study has demonstrated that DART-MS is very promising in process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers a particular account on effective compositions and can be potentially used to improve batch quality and process consistency of samples in complex matrices. Copyright © 2014 Elsevier B.V. All rights reserved.
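A simplified sketch of the batch-monitoring idea follows: unfold a (batch x variable x time) array batch-wise and derive control limits from normal batches, with PCA and the Q (squared prediction error) statistic standing in for the paper's multi-way PLS. All intensities are synthetic, not DART-MS data.

```python
import numpy as np

# Batch-wise unfolding plus PCA-based monitoring (a stand-in for MPLS).
# Normal batches define the model; a perturbed batch should exceed the
# Q-statistic control limit. Data are synthetic random-walk trajectories.

rng = np.random.default_rng(6)
n_batches, n_vars, n_times = 8, 50, 30
normal = rng.normal(0, 1, (n_batches, n_vars, n_times)).cumsum(axis=2)
X = normal.reshape(n_batches, -1)                  # batch-wise unfolding
mu, sd = X.mean(axis=0), X.std(axis=0, ddof=1) + 1e-9
Xs = (X - mu) / sd                                 # autoscale

U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
P = Vt[:3]                                         # 3 latent components

def q_statistic(batch):
    x = (batch.ravel() - mu) / sd
    residual = x - P.T @ (P @ x)                   # part outside the model
    return float(residual @ residual)

q_normal = [q_statistic(b) for b in normal]
limit = np.percentile(q_normal, 95)                # crude empirical limit
faulty = normal[0] + rng.normal(0, 3, (n_vars, n_times))
print("Q of faulty batch exceeds limit:", q_statistic(faulty) > limit)
```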
Liu, Qiong; Liu, Jun; Wang, Pengqian; Zhang, Yingying; Li, Bing; Yu, Yanan; Dang, Haixia; Li, Haixia; Zhang, Xiaoxu; Wang, Zhong
2017-07-01
This study aimed to investigate the pure pharmacological mechanisms of baicalin/baicalein (BA) in the targeted network of mouse cerebral ischemia using a poly-dimensional network comparative analysis. Eighty mice with induced focal cerebral ischemia were randomly divided into four groups: BA, Concha Margaritifera (CM), vehicle and sham group. A poly-dimensional comparative analysis of the expression levels of 374 stroke-related genes in each of the four groups was performed using MetaCore. BA significantly reduced the ischemic infarct volume (P<0.05), whereas CM was ineffective. Two processes and 10 network nodes were shared between "BA vs CM" and vehicle, but there were no overlapping pathways. Two pathways, three processes and 12 network nodes overlapped in "BA vs CM" and BA. The pure pharmacological mechanism of BA resulted in targeting of pathways related to development, G-protein signaling, apoptosis, signal transduction and immunity. The biological processes affected by BA were primarily found to correlate with apoptotic, anti-apoptotic and neurophysiological processes. Three network nodes changed from up-regulation to down-regulation, while mitogen-activated protein kinase kinase 6 (MAP2K6, also known as MEK6) changed from down-regulation to up-regulation in "BA vs CM" and vehicle. The changed nodes were all related to cell death and development. The pure pharmacological mechanism of BA is related to immunity, apoptosis, development, cytoskeletal remodeling, transduction and neurophysiology, as ascertained using a poly-dimensional network comparative analysis. Copyright © 2017. Published by Elsevier B.V.
ERIC Educational Resources Information Center
Indefrey, Peter
2006-01-01
This article presents the results of a meta-analysis of 30 hemodynamic experiments comparing first language (L1) and second language (L2) processing in a range of tasks. The results suggest that reliably stronger activation during L2 processing is found (a) only for task-specific subgroups of L2 speakers and (b) within some, but not all regions…
Global analysis of bacterial transcription factors to predict cellular target processes.
Doerks, Tobias; Andrade, Miguel A; Lathe, Warren; von Mering, Christian; Bork, Peer
2004-03-01
Whole-genome sequences are now available for >100 bacterial species, giving unprecedented power to comparative genomics approaches. We have applied genome-context methods to predict target processes that are regulated by transcription factors (TFs). Of 128 orthologous groups of proteins annotated as TFs, to date, 36 are functionally uncharacterized; in our analysis we predict a probable cellular target process or biochemical pathway for half of these functionally uncharacterized TFs.
NASA Technical Reports Server (NTRS)
Goldman, H.; Wolf, M.
1978-01-01
Several experimental and projected Czochralski crystal growing process methods were studied and compared to available operations and cost-data of recent production Cz-pulling, in order to elucidate the role of the dominant cost contributing factors. From this analysis, it becomes apparent that the specific add-on costs of the Cz-process can be expected to be reduced by about a factor of three by 1982, and about a factor of five by 1986. A format to guide in the accumulation of the data needed for thorough techno-economic analysis of solar cell production processes was developed.
Highly comparative time-series analysis: the empirical structure of time series and their methods.
Fulcher, Ben D; Little, Max A; Jones, Nick S
2013-06-06
The process of collecting and organizing sets of observations represents a common theme throughout the history of science. However, despite the ubiquity of scientists measuring, recording and analysing the dynamics of different processes, an extensive organization of scientific time-series data and analysis methods has never been performed. Addressing this, annotated collections of over 35 000 real-world and model-generated time series, and over 9000 time-series analysis algorithms are analysed in this work. We introduce reduced representations of both time series, in terms of their properties measured by diverse scientific methods, and of time-series analysis methods, in terms of their behaviour on empirical time series, and use them to organize these interdisciplinary resources. This new approach to comparing across diverse scientific data and methods allows us to organize time-series datasets automatically according to their properties, retrieve alternatives to particular analysis methods developed in other scientific disciplines and automate the selection of useful methods for time-series classification and regression tasks. The broad scientific utility of these tools is demonstrated on datasets of electroencephalograms, self-affine time series, heartbeat intervals, speech signals and others, in each case contributing novel analysis techniques to the existing literature. Highly comparative techniques that compare across an interdisciplinary literature can thus be used to guide more focused research in time-series analysis for applications across the scientific disciplines.
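The core move of this "highly comparative" approach is to reduce every time series to a vector of features so that heterogeneous series and methods can be compared in one space. The sketch below uses a tiny illustrative feature subset, not the paper's library of over 9000 operations.

```python
import numpy as np

# Minimal sketch: represent each time series by a small feature vector,
# so different dynamical classes become comparable points in one space.

def features(x):
    x = (x - x.mean()) / x.std()                     # z-normalize
    lag1 = np.corrcoef(x[:-1], x[1:])[0, 1]          # lag-1 autocorrelation
    above = (x > 0).mean()                           # time above the mean
    diff_sd = np.diff(x).std()                       # roughness of increments
    return np.array([lag1, above, diff_sd])

rng = np.random.default_rng(7)
noise = rng.normal(size=500)                         # white noise
walk = rng.normal(size=500).cumsum()                 # random walk
sine = np.sin(np.linspace(0, 20 * np.pi, 500))       # periodic signal

mat = np.vstack([features(s) for s in (noise, walk, sine)])
print(np.round(mat, 2))     # the rows separate the three dynamical classes
```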
Lab-on-a-chip based total-phosphorus analysis device utilizing a photocatalytic reaction
NASA Astrophysics Data System (ADS)
Jung, Dong Geon; Jung, Daewoong; Kong, Seong Ho
2018-02-01
A lab-on-a-chip (LOC) device for total phosphorus (TP) analysis was fabricated for water quality monitoring. Many commercially available TP analysis systems used to estimate water quality have good sensitivity and accuracy. However, these systems also have many disadvantages such as bulky size, complex pretreatment processes, and high cost, which limit their application. In particular, conventional TP analysis systems require an indispensable pretreatment step, in which the fluidic analyte is heated to 120 °C for 30 min to release the dissolved phosphate, because many phosphates are soluble in water at a standard temperature and pressure. In addition, this pretreatment process requires elevated pressures of up to 1.1 kg cm-2 in order to prevent the evaporation of the heated analyte. Because of these limiting conditions required by the pretreatment processes used in conventional systems, it is difficult to miniaturize TP analysis systems. In this study, we employed a photocatalytic reaction in the pretreatment process. The reaction was carried out by illuminating a photocatalytic titanium dioxide (TiO2) surface formed in a microfluidic channel with ultraviolet (UV) light. This pretreatment process does not require elevated temperatures and pressures. By applying this simplified, photocatalytic-reaction-based pretreatment process to a TP analysis system, greater degrees of freedom are conferred to the design and fabrication of LOC devices for TP monitoring. The fabricated LOC device presented in this paper was characterized by measuring the TP concentration of an unknown sample, and comparing the results with those measured by a conventional TP analysis system. The TP concentrations of the unknown sample measured by the proposed LOC device and the conventional TP analysis system were 0.018 mgP/25 mL and 0.019 mgP/25 mL, respectively. The experimental results revealed that the proposed LOC device had a performance comparable to the conventional bulky TP analysis system. Therefore, our device could be directly employed in water quality monitoring as an alternative to conventional TP analysis systems.
NASA Astrophysics Data System (ADS)
Manrubia, S. C.; Prieto Ballesteros, O.; González Kessler, C.; Fernández Remolar, D.; Córdoba-Jabonero, C.; Selsis, F.; Bérczi, S.; Gánti, T.; Horváth, A.; Sik, A.; Szathmáry, E.
2004-03-01
We carry out a comparative analysis of the morphological and seasonal features of two regions in the Martian southern polar region: Inca City (82S 65W) and the Pityusa Patera zone (66S 37E). These two sites are representative of a large number of areas subjected to dynamic, seasonal processes that deeply modify local conditions. Owing to variations in sunlight, seasonal CO2 ice accumulates during autumn and winter and starts defrosting in spring; by mid-summer the seasonal ice has disappeared. Despite a number of relevant differences in the morphology of the seasonal features observed, they seem to result from similar processes.
ERIC Educational Resources Information Center
Chao, Roger Y., Jr.
2014-01-01
The author argues that historical regional developments in Europe and East Asia greatly influence the formation of an East Asian Higher Education Area. As such, this article compares European and East Asian regionalization and higher education regionalization processes to show this path dependency in East Asian regionalization of higher education…
NASA Astrophysics Data System (ADS)
Cerchiari, G.; Croccolo, F.; Cardinaux, F.; Scheffold, F.
2012-10-01
We present an implementation of the analysis of dynamic near field scattering (NFS) data using a graphics processing unit. We introduce an optimized data management scheme thereby limiting the number of operations required. Overall, we reduce the processing time from hours to minutes, for typical experimental conditions. Previously the limiting step in such experiments, the processing time is now comparable to the data acquisition time. Our approach is applicable to various dynamic NFS methods, including shadowgraph, Schlieren and differential dynamic microscopy.
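For context, here is a plain CPU sketch of the core computation such GPU implementations accelerate, the image structure function of differential dynamic microscopy, one of the NFS methods named above; precomputing the frame FFTs once and reusing them per lag mirrors the kind of operation-count reduction described (a minimal sketch, not the authors' code):

# D(q, dt) = < |FFT2(I(t+dt)) - FFT2(I(t))|^2 >, averaged over start times t.
import numpy as np

def structure_function(stack, lags):
    """stack: (T, H, W) image series; returns {dt: 2-D map over wave vectors}."""
    T = stack.shape[0]
    fts = np.fft.fft2(stack, axes=(1, 2))   # compute FFTs once, reuse per lag
    out = {}
    for dt in lags:
        diff = fts[dt:] - fts[:T - dt]
        out[dt] = np.mean(np.abs(diff) ** 2, axis=0)
    return out

rng = np.random.default_rng(1)
movie = rng.random((64, 128, 128))           # stand-in for camera frames
D = structure_function(movie, lags=[1, 2, 4, 8])
print(D[1].shape)                            # (128, 128) map over wave vectors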
Secretome profiles of immortalized dental follicle cells using iTRAQ-based proteomic analysis.
Dou, Lei; Wu, Yan; Yan, Qifang; Wang, Jinhua; Zhang, Yan; Ji, Ping
2017-08-04
Secretomes produced by mesenchymal stromal cells (MSCs) are considered to have therapeutic potential. However, harvesting enough primary MSCs from tissue is time-consuming and costly, which has impeded the application of MSC secretomes. This study aimed to immortalize MSCs and compare the secretome profiles of immortalized and original MSCs. Human dental follicle cells (DFCs) were isolated and immortalized using pMPH86. The secretome profile of immortalized DFCs (iDFCs) was investigated and compared using iTRAQ labeling combined with mass spectrometry (MS) quantitative proteomics. The MS data were analyzed using ProteinPilot software, and bioinformatic analysis of the identified proteins was then performed. A total of 2092 secreted proteins were detected in conditioned media of iDFCs. Compared with primary DFCs, 253 differentially expressed proteins were found in the iDFC secretome (142 up-regulated and 111 down-regulated). Intensive bioinformatic analysis revealed that the majority of secreted proteins were involved in cellular process, metabolic process, biological regulation, cellular component organization or biogenesis, immune system process, developmental process, response to stimulus and signaling. The proteomic profile of the cell secretome was not largely affected after immortalization by this piggyBac immortalization system. The secretome of iDFCs may therefore be a good substitute for that of primary DFCs in regenerative medicine.
All about the Human Genome Project (HGP)
Analysis of curing process and thermal properties of phenol-urea-formaldehyde cocondensed resins
Bunchiro Tomita; Masahiko Ohyama; Atsushi Itoh; Kiyoto Doi; Chung-Yun Hse
1994-01-01
The curing processes of resols, urea-formaldehyde (UF) resins, their mechanical blends, and phenol-urea cocondensed resins, as well as the reaction of 2,4,6-trimethylolphenol with urea, were investigated with the torsional braid analysis method. The thermal stabilities of these resins after curing also were compared. The results were as follows: (1) In the curing...
ERIC Educational Resources Information Center
Mazza, Monica; Mariano, Melania; Peretti, Sara; Masedu, Francesco; Pino, Maria Chiara; Valenti, Marco
2017-01-01
Individuals with autism spectrum disorders (ASD) show significant impairments in social skills and theory of mind (ToM). The aim of this study was to evaluate ToM and social information processing abilities in 52 children with ASD compared to 55 typically developing (TD) children. A mediation analysis evaluated whether social information…
Cost analysis of advanced turbine blade manufacturing processes
NASA Technical Reports Server (NTRS)
Barth, C. F.; Blake, D. E.; Stelson, T. S.
1977-01-01
A rigorous analysis was conducted to estimate relative manufacturing costs for high technology gas turbine blades prepared by three candidate materials process systems. The manufacturing costs for the same turbine blade configuration of directionally solidified eutectic alloy, an oxide dispersion strengthened superalloy, and a fiber reinforced superalloy were compared on a relative basis to the costs of the same blade currently in production utilizing the directional solidification process. An analytical process cost model was developed to quantitatively perform the cost comparisons. The impact of individual process yield factors on costs was also assessed as well as effects of process parameters, raw materials, labor rates and consumable items.
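A hypothetical miniature of such a cost model (illustrative numbers only): per-step costs accumulate and are inflated by each step's yield, so that surviving blades absorb the cost of scrap:

# Hypothetical roll-up of per-step costs with yield factors.
def cost_per_good_blade(steps):
    """steps: list of (cost_added_per_unit, yield_fraction) in process order."""
    unit_cost = 0.0
    for cost_added, y in steps:
        unit_cost = (unit_cost + cost_added) / y   # survivors absorb scrap cost
    return unit_cost

ds_eutectic = [(40.0, 0.95), (120.0, 0.70), (35.0, 0.90)]  # invented numbers
baseline    = [(40.0, 0.95), (90.0, 0.85), (35.0, 0.90)]
print(cost_per_good_blade(ds_eutectic) / cost_per_good_blade(baseline))  # relative cost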
Evaluation of Apache Hadoop for parallel data analysis with ROOT
NASA Astrophysics Data System (ADS)
Lehrack, S.; Duckeck, G.; Ebke, J.
2014-06-01
The Apache Hadoop software is a Java based framework for distributed processing of large data sets across clusters of computers, using the Hadoop file system (HDFS) for data storage and backup and MapReduce as a processing platform. Hadoop is primarily designed for processing large textual data sets which can be processed in arbitrary chunks, and must be adapted to the use case of processing binary data files which cannot be split automatically. However, Hadoop offers attractive features in terms of fault tolerance, task supervision and control, multi-user functionality and job management. For this reason, we evaluated Apache Hadoop as an alternative approach to PROOF for ROOT data analysis. Two alternatives for distributing analysis data were discussed: either the data was stored in HDFS and processed with MapReduce, or the data was accessed via a standard Grid storage system (dCache Tier-2) and MapReduce was used only as an execution back-end. The focus of the measurements was, on the one hand, to safely store analysis data on HDFS with reasonable data rates and, on the other hand, to process data fast and reliably with MapReduce. In the evaluation of HDFS, read/write data rates from the local Hadoop cluster were measured and compared to standard data rates from the local NFS installation. In the evaluation of MapReduce, realistic ROOT analyses were used and event rates were compared to PROOF.
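A common workaround for non-splittable binary inputs, sketched below under the Hadoop Streaming interface, is to make the job input a text list of file paths so each ROOT file is handed whole to one mapper; analyze_root_file is a placeholder, not a real API:

#!/usr/bin/env python
# mapper.py -- sketch of a map-only Hadoop Streaming job for non-splittable
# binary files: the job input is a text list of file paths (one per line),
# often paired with NLineInputFormat so each mapper receives one path.
import sys

def analyze_root_file(path):
    # Placeholder for the actual ROOT analysis (e.g. an event-loop executable
    # invoked via subprocess); returns the number of processed events.
    return 0

for line in sys.stdin:
    path = line.strip()
    if path:
        events = analyze_root_file(path)
        print("%s\t%d" % (path, events))

# Submitted with something like (standard streaming flags):
#   hadoop jar hadoop-streaming.jar -input root_file_list.txt \
#       -output results -mapper mapper.py -numReduceTasks 0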
NASA Technical Reports Server (NTRS)
Baron, S.; Levison, W. H.
1977-01-01
Application of the optimal control model of the human operator to problems in display analysis is discussed. Those aspects of the model pertaining to the operator-display interface and to operator information processing are reviewed and discussed. The techniques are then applied to the analysis of advanced display/control systems for a Terminal Configured Vehicle. Model results are compared with those obtained in a large, fixed-base simulation.
Ristivojević, Petar; Trifković, Jelena; Vovk, Irena; Milojković-Opsenica, Dušanka
2017-01-01
Given the introduction of phytochemical fingerprint analysis as a method of screening complex natural products for the presence of the most bioactive compounds, the use of chemometric classification methods, the application of powerful scanning, image-capturing and processing devices and algorithms, and advances in the development of novel stationary phases as well as various separation modalities, high-performance thin-layer chromatography (HPTLC) fingerprinting is becoming an attractive and fruitful field of separation science. Multivariate image analysis is crucial for proper data acquisition. In the current study, different image processing procedures were studied and compared in detail on the example of HPTLC chromatograms of plant resins. Variables such as gray intensities of pixels along the solvent front, peak areas and mean peak values were used as input data and compared to obtain the best classification models. Important steps in image analysis, namely baseline removal, denoising, target peak alignment and normalization, were pointed out. A numerical data set based on the mean values of selected bands and the intensities of pixels along the solvent front proved to be the most convenient for planar-chromatographic profiling, although it requires at least basic knowledge of image processing methodology, and could be proposed for further investigation in HPTLC fingerprinting. Copyright © 2016 Elsevier B.V. All rights reserved.
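One path through such a workflow can be sketched as follows (window sizes and the normalization are illustrative assumptions, not the study's exact procedure): average a lane's gray intensities into a profile, subtract a morphological baseline, and normalize:

# Sketch: extract a lane profile from a grayscale HPTLC scan, remove a
# baseline, and normalize. Filter window sizes are illustrative.
import numpy as np
from scipy.ndimage import minimum_filter1d, uniform_filter1d

def lane_profile(img, col_start, col_stop):
    """Mean gray intensity across the lane width, per migration position."""
    profile = img[:, col_start:col_stop].mean(axis=1)
    baseline = uniform_filter1d(minimum_filter1d(profile, size=51), size=51)
    signal = profile - baseline                       # baseline removal
    return signal / (np.abs(signal).sum() + 1e-12)    # normalization

img = np.random.default_rng(2).random((400, 300))     # stand-in for a scan
print(lane_profile(img, 40, 60).shape)                # (400,) densitogram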
Comparing digital data processing techniques for surface mine and reclamation monitoring
NASA Technical Reports Server (NTRS)
Witt, R. G.; Bly, B. G.; Campbell, W. J.; Bloemer, H. H. L.; Brumfield, J. O.
1982-01-01
The results of three techniques used for processing Landsat digital data are compared for their utility in delineating areas of surface mining and subsequent reclamation. An unsupervised clustering algorithm (ISOCLS), a maximum-likelihood classifier (CLASFY), and a hybrid approach utilizing canonical analysis (ISOCLS/KLTRANS/ISOCLS) were compared by means of a detailed accuracy assessment with aerial photography at NASA's Goddard Space Flight Center. Results show that the hybrid approach was superior to the traditional techniques in distinguishing strip mined and reclaimed areas.
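In modern terms, the unsupervised and hybrid pipelines can be approximated as below (a sketch only: k-means stands in for ISOCLS and PCA for the canonical transform; the original software is not reproduced here):

import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
pixels = rng.random((10_000, 4))                 # rows: pixels, cols: bands

# Unsupervised clustering (analogue of ISOCLS).
unsupervised = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(pixels)

# Hybrid: transform to decorrelated axes, then cluster again.
z = PCA(n_components=3).fit_transform(pixels)
hybrid = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(z)
print(np.bincount(hybrid))                       # class sizes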
Kashiha, Mohammad Amin; Green, Angela R; Sales, Tatiana Glogerley; Bahr, Claudia; Berckmans, Daniel; Gates, Richard S
2014-10-01
Image processing systems have been widely used in monitoring livestock for many applications, including identification, tracking, behavior analysis, occupancy rates, and activity calculations. The primary goal of this work was to quantify image processing performance when monitoring laying hens by comparing length of stay in each compartment as detected by the image processing system with the actual occurrences registered by human observations. In this work, an image processing system was implemented and evaluated for use in an environmental animal preference chamber to detect hen navigation between 4 compartments of the chamber. One camera was installed above each compartment to produce top-view images of the whole compartment. An ellipse-fitting model was applied to captured images to detect whether the hen was present in a compartment. During a choice-test study, mean ± SD success detection rates of 95.9 ± 2.6% were achieved when considering total duration of compartment occupancy. These results suggest that the image processing system is currently suitable for determining the response measures for assessing environmental choices. Moreover, the image processing system offered a comprehensive analysis of occupancy while substantially reducing data processing time compared with the time-intensive alternative of manual video analysis. The above technique was used to monitor ammonia aversion in the chamber. As a preliminary pilot study, different levels of ammonia were applied to different compartments while hens were allowed to navigate between compartments. Using the automated monitor tool to assess occupancy, a negative trend of compartment occupancy with ammonia level was revealed, though further examination is needed. ©2014 Poultry Science Association Inc.
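The occupancy test can be sketched with standard OpenCV calls (thresholds and the synthetic frame are illustrative; the published system's parameters are not reproduced here):

# A compartment counts as occupied if a sufficiently large contour in its
# top-view image admits an ellipse fit.
import cv2
import numpy as np

def hen_present(gray, min_area=2000):
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) >= min_area and len(c) >= 5:  # fitEllipse needs 5 pts
            (cx, cy), (w, h), angle = cv2.fitEllipse(c)
            return True, (cx, cy)
    return False, None

frame = np.zeros((480, 640), np.uint8)
cv2.ellipse(frame, (320, 240), (80, 40), 30, 0, 360, 255, -1)  # synthetic hen
print(hen_present(frame))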
Comparative Proteomic Analysis of Light-Induced Mycelial Brown Film Formation in Lentinula edodes.
Tang, Li Hua; Tan, Qi; Bao, Da Peng; Zhang, Xue Hong; Jian, Hua Hua; Li, Yan; Yang, Rui Heng; Wang, Ying
2016-01-01
Light-induced brown film (BF) formation by the vegetative mycelium of Lentinula edodes is important for ensuring the quantity and quality of this edible mushroom. Nevertheless, the molecular mechanism underlying this phenotype is still unclear. In this study, a comparative proteomic analysis of mycelial BF formation in L. edodes was performed. Seventy-three protein spots with at least a twofold difference in abundance on two-dimensional electrophoresis (2DE) maps were observed, and 52 of them were successfully identified by matrix-assisted laser desorption/ionization tandem time-of-flight mass spectrometry (MALDI-TOF/TOF/MS). These proteins were classified into the following functional categories: small molecule metabolic processes (39%), response to oxidative stress (5%), and organic substance catabolic processes (5%), followed by oxidation-reduction processes (3%), single-organism catabolic processes (3%), positive regulation of protein complex assembly (3%), and protein metabolic processes (3%). Interestingly, four of the proteins that were upregulated in response to light exposure were nucleoside diphosphate kinases. To our knowledge, this is the first proteomic analysis of the mechanism of BF formation in L. edodes. Our data will provide a foundation for future detailed investigations of the proteins linked to BF formation.
NASA Astrophysics Data System (ADS)
Gopi, K. R.; Nayaka, H. Shivananda; Sahu, Sandeep
2016-09-01
Magnesium alloy Mg-Al-Mn (AM70) was processed by equal channel angular pressing (ECAP) at 275 °C for up to 4 passes in order to produce an ultrafine-grained microstructure and improve its mechanical properties. ECAP-processed samples were characterized for microstructural analysis using optical microscopy, scanning electron microscopy, and transmission electron microscopy. Microstructural analysis showed that, with an increase in the number of ECAP passes, grains refined and grain size reduced from an average of 45 to 1 µm. Electron backscatter diffraction analysis showed the transition from low angle grain boundaries to high angle grain boundaries in the ECAP 4 pass sample as compared to the as-cast sample. The strength and hardness values showed an increasing trend for the initial 2 passes of ECAP processing and then started decreasing with further increase in the number of ECAP passes, even though the grain size continued to decrease in all the successive ECAP passes. However, the strength and hardness values still remained quite high when compared to the initial condition. This behavior was found to be correlated with texture modification in the material as a result of ECAP processing.
Comparative Analysis of the Measurement of Total Instructional Alignment
ERIC Educational Resources Information Center
Kick, Laura C.
2013-01-01
In 2007, Lisa Carter created the Total Instructional Alignment system--a process that aligns standards, curriculum, assessment, and instruction. Employed in several hundred school systems, the TIA process is a successful professional development program. The researcher developed an instrument to measure the success of the TIA process with the…
ERIC Educational Resources Information Center
Greene, Carolyn J.; Morland, Leslie A.; Macdonald, Alexandra; Frueh, B. Christopher; Grubbs, Kathleen M.; Rosen, Craig S.
2010-01-01
Objective: Video teleconferencing (VTC) is used for mental health treatment delivery to geographically remote, underserved populations. However, few studies have examined how VTC affects individual or group psychotherapy processes. This study compares process variables such as therapeutic alliance and attrition among participants receiving anger…
Graeber, Kai; Linkies, Ada; Wood, Andrew T.A.; Leubner-Metzger, Gerhard
2011-01-01
Comparative biology includes the comparison of transcriptome and quantitative real-time RT-PCR (qRT-PCR) data sets in a range of species to detect evolutionarily conserved and divergent processes. Transcript abundance analysis of target genes by qRT-PCR requires a highly accurate and robust workflow. This includes reference genes with high expression stability (i.e., low intersample transcript abundance variation) for correct target gene normalization. Cross-species qRT-PCR for proper comparative transcript quantification requires reference genes suitable for different species. We addressed this issue using tissue-specific transcriptome data sets of germinating Lepidium sativum seeds to identify new candidate reference genes. We investigated their expression stability in germinating seeds of L. sativum and Arabidopsis thaliana by qRT-PCR, combined with in silico analysis of Arabidopsis and Brassica napus microarray data sets. This revealed that reference gene expression stability is higher for a given developmental process between distinct species than for distinct developmental processes within a given single species. The identified superior cross-species reference genes may be used for family-wide comparative qRT-PCR analysis of Brassicaceae seed germination. Furthermore, using germinating seeds, we exemplify optimization of the qRT-PCR workflow for challenging tissues regarding RNA quality, transcript stability, and tissue abundance. Our work therefore can serve as a guideline for moving beyond Arabidopsis by establishing high-quality cross-species qRT-PCR. PMID:21666000
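A toy version of the stability screen is sketched below: candidate reference genes are ranked by the spread of their quantification cycles across samples (dedicated measures such as geNorm's M are used in practice; the data here are invented):

import numpy as np

def stability(cq):
    """cq: (samples,) qRT-PCR quantification cycles for one gene.
    Spread in Cq units approximates log2 fold variation in abundance."""
    return np.std(cq)

candidates = {
    "geneA": np.array([21.1, 21.3, 21.0, 21.2, 21.1]),
    "geneB": np.array([24.9, 26.3, 24.1, 25.8, 27.0]),
}
ranked = sorted(candidates, key=lambda g: stability(candidates[g]))
print(ranked)  # geneA is the more stable (better) reference here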
Sobhani, R; McVicker, R; Spangenberg, C; Rosso, D
2012-01-01
In regions characterized by water scarcity, such as coastal Southern California, groundwater containing chromophoric dissolved organic matter is a viable source of water supply. In the coastal aquifer of Orange County in California, seawater intrusion driven by coastal groundwater pumping increased the concentration of bromide in extracted groundwater from 0.4 mg l⁻¹ in 2000 to over 0.8 mg l⁻¹ in 2004. Bromide, a precursor to bromate formation, is regulated by USEPA and the California Department of Health as a potential carcinogen and therefore must be reduced to a level below 10 μg l⁻¹. This paper compares two processes for treatment of highly coloured groundwater: nanofiltration and ozone injection coupled with biologically activated carbon. The requirement for bromate removal decreased the water production in the ozonation process to compensate for increased maintenance requirements, and required the adoption of catalytic carbon with associated increase in capital and operating costs per unit volume. However, due to the absence of oxidant addition in nanofiltration processes, this process is not affected by bromide. We performed a process analysis and a comparative economic analysis of capital and operating costs for both technologies. Our results show that for the case studied in coastal Southern California, nanofiltration has higher throughput and lower specific capital and operating cost, when compared to ozone injection with biologically activated carbon. Ozone injection with biologically activated carbon, compared to nanofiltration, has 14% higher capital cost and 12% higher operating costs per unit water produced while operating at the initial throughput. Due to reduced ozone concentration required to accommodate for bromate reduction, the ozonation process throughput is reduced and the actual cost increase (per unit water produced) is 68% higher for capital cost and 30% higher for operations. Copyright © 2011 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Shintani, Natsuko
2015-01-01
This article reports a meta-analysis of 42 experiments in 33 published studies involving processing instruction (PI) and production-based instruction (PB) used in the PI studies. The comparative effectiveness of PI and PB showed that although PI was more effective than PB for developing receptive knowledge, PB was just as effective as PI for…
Engine Icing Data - An Analytics Approach
NASA Technical Reports Server (NTRS)
Fitzgerald, Brooke A.; Flegel, Ashlie B.
2017-01-01
Engine icing researchers at the NASA Glenn Research Center use the Escort data acquisition system in the Propulsion Systems Laboratory (PSL) to generate and collect a tremendous amount of data every day. Currently these researchers spend countless hours processing and formatting their data, selecting important variables, and plotting relationships between variables, all by hand, generally analyzing data in a spreadsheet-style program (such as Microsoft Excel). Though spreadsheet-style analysis is familiar and intuitive to many, processing data in spreadsheets is often unreproducible and small mistakes are easily overlooked. Spreadsheet-style analysis is also time inefficient. The same formatting, processing, and plotting procedure has to be repeated for every dataset, which leads to researchers performing the same tedious data munging process over and over instead of making discoveries within their data. This paper documents a data analysis tool written in Python hosted in a Jupyter notebook that vastly simplifies the analysis process. From the file path of any folder containing time series datasets, this tool batch loads every dataset in the folder, processes the datasets in parallel, and ingests them into a widget where users can search for and interactively plot subsets of columns in a number of ways with a click of a button, easily and intuitively comparing their data and discovering interesting dynamics. Furthermore, comparing variables across data sets and integrating video data (while extremely difficult with spreadsheet-style programs) is quite simplified in this tool. This tool has also gathered interest outside the engine icing branch, and will be used by researchers across NASA Glenn Research Center. This project exemplifies the enormous benefit of automating data processing, analysis, and visualization, and will help researchers move from raw data to insight in a much smaller time frame.
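The batch-load-and-process pattern described above might look like the following sketch (folder name, file format, and the cleaning step are assumptions, not the NASA tool itself):

from concurrent.futures import ProcessPoolExecutor
from pathlib import Path
import pandas as pd

def load_and_clean(path):
    df = pd.read_csv(path)
    df.columns = [c.strip().lower() for c in df.columns]  # uniform column names
    return path.name, df

def load_folder(folder):
    """Batch-load every time-series CSV in a folder, in parallel."""
    files = sorted(Path(folder).glob("*.csv"))
    with ProcessPoolExecutor() as pool:
        return dict(pool.map(load_and_clean, files))

if __name__ == "__main__":
    runs = load_folder("psl_runs")          # hypothetical data folder
    for name, df in runs.items():
        print(name, df.shape)               # ready for interactive plotting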
Fetterhoff, Dustin; Opris, Ioan; Simpson, Sean L.; Deadwyler, Sam A.; Hampson, Robert E.; Kraft, Robert A.
2014-01-01
Background: Multifractal analysis quantifies the time-scale-invariant properties in data by describing the structure of variability over time. By applying this analysis to hippocampal interspike interval sequences recorded during performance of a working memory task, a measure of long-range temporal correlations and multifractal dynamics can reveal single neuron correlates of information processing. New method: Wavelet leaders-based multifractal analysis (WLMA) was applied to hippocampal interspike intervals recorded during a working memory task. WLMA can be used to identify neurons likely to exhibit information processing relevant to operation of brain–computer interfaces and nonlinear neuronal models. Results: Neurons involved in memory processing (“Functional Cell Types” or FCTs) showed a greater degree of multifractal firing properties than neurons without task-relevant firing characteristics. In addition, previously unidentified FCTs were revealed because multifractal analysis suggested further functional classification. The cannabinoid-type 1 receptor partial agonist, tetrahydrocannabinol (THC), selectively reduced multifractal dynamics in FCT neurons compared to non-FCT neurons. Comparison with existing methods: WLMA is an objective tool for quantifying the memory-correlated complexity represented by FCTs that reveals additional information compared to classification of FCTs using traditional z-scores to identify neuronal correlates of behavioral events. Conclusion: z-Score-based FCT classification provides limited information about the dynamical range of neuronal activity characterized by WLMA. Increased complexity, as measured with multifractal analysis, may be a marker of functional involvement in memory processing. The level of multifractal attributes can be used to differentially emphasize neural signals to improve computational models and algorithms underlying brain–computer interfaces. PMID:25086297
In-Situ Molecular Vapor Composition Measurements During Lyophilization.
Liechty, Evan T; Strongrich, Andrew D; Moussa, Ehab M; Topp, Elizabeth; Alexeenko, Alina A
2018-04-11
Monitoring process conditions during lyophilization is essential to ensuring product quality for lyophilized pharmaceutical products. Residual gas analysis has been applied previously in lyophilization applications for leak detection, determination of endpoint in primary and secondary drying, monitoring sterilization processes, and measuring complex solvents. The purpose of this study is to investigate the temporal evolution of the process gas for various formulations during lyophilization to better understand the relative extraction rates of various molecular compounds over the course of primary drying. In this study, residual gas analysis is used to monitor the molecular composition of gases in the product chamber during lyophilization of aqueous formulations typical for pharmaceuticals. Residual gas analysis is also used in the determination of the primary drying endpoint and compared to the results obtained using the comparative pressure measurement technique. The dynamics of solvent vapors, those species dissolved therein, and the ballast gas (the gas supplied to maintain a set-point pressure in the product chamber) are observed throughout the course of lyophilization. In addition to water vapor and nitrogen, the two most abundant gases for all considered aqueous formulations are oxygen and carbon dioxide. In particular, it is observed that the relative concentrations of carbon dioxide and oxygen vary depending on the formulation, an observation which stems from the varying solubility of these species. This result has implications on product shelf life and stability during the lyophilization process. Chamber process gas composition during lyophilization is quantified for several representative formulations using residual gas analysis. The advantages of the technique lie in its ability to measure the relative concentration of various species during the lyophilization process. This feature gives residual gas analysis utility in a host of applications from endpoint determination to quality assurance. In contrast to other methods, residual gas analysis is able to determine oxygen and water vapor content in the process gas. These compounds have been shown to directly influence product shelf life. With these results, the residual gas analysis technique presents a potential new method for real-time lyophilization process control and improved understanding of formulation and processing effects for lyophilized pharmaceutical products.
Prospects for energy recovery during hydrothermal and biological processing of waste biomass.
Gerber Van Doren, Léda; Posmanik, Roy; Bicalho, Felipe A; Tester, Jefferson W; Sills, Deborah L
2017-02-01
Thermochemical and biological processes represent promising technologies for converting wet biomasses, such as animal manure, organic waste, or algae, to energy. To convert biomass to energy and bio-chemicals in an economical manner, internal energy recovery should be maximized to reduce the use of external heat and power. In this study, two conversion pathways that couple hydrothermal liquefaction with anaerobic digestion or catalytic hydrothermal gasification were compared. Each of these platforms is followed by two alternative processes for gas utilization: 1) combined heat and power; and 2) combustion in a boiler. Pinch analysis was applied to integrate thermal streams among unit processes and improve the overall system efficiency. A techno-economic analysis was conducted to compare the feasibility of the four modeled scenarios under different market conditions. Our results show that a systems approach designed to recover internal heat and power can reduce external energy demands and increase the overall process sustainability. Copyright © 2016 Elsevier Ltd. All rights reserved.
Fan, Xiaoguang; Cheng, Gang; Zhang, Hongjia; Li, Menghua; Wang, Shizeng; Yuan, Qipeng
2014-12-19
Corncob residue is a cellulose-rich byproduct obtained from industrial xylose production via dilute acid hydrolysis processes. Enzymatic hydrolysis of cellulose in acid hydrolysis residue of corncob (AHRC) is often less efficient without further pretreatment. In this work, the process characteristics of acid impregnated steam explosion were studied in conjunction with a dilute acid process, and their effects on physiochemical changes and enzymatic saccharification of corncob residue were compared. With the acid impregnated steam explosion process, both higher xylose recovery and higher cellulose conversion were obtained. The maximum conversion of cellulose in acid impregnated steam explosion residue of corncob (ASERC) reached 85.3%, which was 1.6 times higher than that of AHRC. Biomass compositional analysis showed similar cellulose and lignin content in ASERC and AHRC. XRD analysis demonstrated comparable crystallinity of ASERC and AHRC. The improved enzymatic hydrolysis efficiency was attributed to higher porosity in ASERC, measured by mercury porosimetry. Copyright © 2014 Elsevier Ltd. All rights reserved.
Air impacts from three alternatives for producing JP-8 jet fuel.
Kositkanawuth, Ketwalee; Gangupomu, Roja Haritha; Sattler, Melanie L; Dennis, Brian H; MacDonnell, Frederick M; Billo, Richard; Priest, John W
2012-10-01
To increase U.S. petroleum energy independence, the University of Texas at Arlington (UT Arlington) has developed a direct coal liquefaction process which uses a hydrogenated solvent and a proprietary catalyst to convert lignite coal to crude oil. This sweet crude can be refined to form JP-8 military jet fuel, as well as other end products like gasoline and diesel. This paper presents an analysis of air pollutants resulting from using UT Arlington's liquefaction process to produce crude and then JP-8, compared with 2 alternative processes: conventional crude extraction and refining (CCER), and the Fischer-Tropsch process. For each of the 3 processes, air pollutant emissions through production of JP-8 fuel were considered, including emissions from upstream extraction/ production, transportation, and conversion/refining. Air pollutants from the direct liquefaction process were measured using a LandTEC GEM2000 Plus, Draeger color detector tubes, OhioLumex RA-915 Light Hg Analyzer, and SRI 8610 gas chromatograph with thermal conductivity detector. According to the screening analysis presented here, producing jet fuel from UT Arlington crude results in lower levels of pollutants compared to international conventional crude extraction/refining. Compared to US domestic CCER, the UTA process emits lower levels of CO2-e, NO(x), and Hg, and higher levels of CO and SO2. Emissions from the UT Arlington process for producing JP-8 are estimated to be lower than for the Fischer-Tropsch process for all pollutants, with the exception of CO2-e, which were high for the UT Arlington process due to nitrous oxide emissions from crude refining. When comparing emissions from conventional lignite combustion to produce electricity, versus UT Arlington coal liquefaction to make JP-8 and subsequent JP-8 transport, emissions from the UT Arlington process are estimated to be lower for all air pollutants, per MJ of power delivered to the end user. The United States currently imports two-thirds of its crude oil, leaving its transportation system especially vulnerable to disruptions in international crude supplies. At current use rates, U.S. coal reserves (262 billion short tons, including 23 billion short tons lignite) would last 236 years. Accordingly, the University of Texas at Arlington (UT Arlington) has developed a process that converts lignite to crude oil, at about half the cost of regular crude. According to the screening analysis presented here, producing jet fuel from UT Arlington crude generates lower levels of pollutants compared to international conventional crude extraction/refining (CCER).
Selection of Sustainable Processes using Sustainability ...
Chemical products can be obtained by process pathways involving varying amounts and types of resources, utilities, and byproduct formation. When competing process options, such as the six processes for making methanol considered in this study, are available, it is necessary to identify the most sustainable option. Sustainability of a chemical process is generally evaluated with indicators that require process and chemical property data. These indicators individually reflect the impacts of the process on areas of sustainability, such as the environment or society. In order to choose among several alternative processes, an overall comparative analysis is essential. Generally, net profit will show the most economic process. A mixed integer optimization problem can also be solved to identify the most economic among competing processes. This method uses economic optimization and leaves aside the environmental and societal impacts. To make a decision on the most sustainable process, the method presented here rationally aggregates the sustainability indicators into a single index called the sustainability footprint (De). Process flow and economic data were used to compute the indicator values. Results from the sustainability footprint (De) are compared with those from solving a mixed integer optimization problem. In order to identify the rank order of importance of the indicators, a multivariate analysis is performed using partial least squares variable importance in projection (PLS-VIP).
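A generic weighted aggregation of normalized indicators, sketched below, conveys the idea of collapsing several sustainability dimensions into one comparable index (the actual De formula and weights are not reproduced; values are invented):

import numpy as np

def footprint(indicators, weights):
    """indicators: per-process values scaled to [0, 1], higher = worse impact
    (an assumed convention, not the paper's definition)."""
    w = np.asarray(weights, float)
    return float(np.asarray(indicators, float) @ (w / w.sum()))

processes = {
    "methanol_A": [0.30, 0.55, 0.20],   # e.g. environmental, economic, societal
    "methanol_B": [0.45, 0.25, 0.35],
}
weights = [0.5, 0.3, 0.2]               # e.g. from a PLS-VIP importance ranking
best = min(processes, key=lambda k: footprint(processes[k], weights))
print(best)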
Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0
Huck, Kevin A.; Malony, Allen D.; Shende, Sameer; ...
2008-01-01
The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to manage and process the information. Simply characterizing the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.
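The flavor of analysis such scripts automate can be sketched in a few lines (data layout assumed: one row per thread, one column per timed event; this is not the PerfExplorer API):

import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
profile = rng.random((1024, 50))                # 1024 threads x 50 events

z = PCA(n_components=5).fit_transform(profile)  # dimension reduction
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(z)

# Correlate each event's cost with total runtime across threads.
total = profile.sum(axis=1)
corr = [np.corrcoef(profile[:, j], total)[0, 1] for j in range(profile.shape[1])]
print(groups[:10], int(np.argmax(corr)))        # cluster ids; dominant event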
Adults with reading disabilities: converting a meta-analysis to practice.
Swanson, H Lee
2012-01-01
This article reviews the results of a meta-analysis of the published experimental literature that compares the academic, cognitive, and behavioral performance of adults with reading disabilities (RD) with that of average-achieving adult readers. The meta-analysis shows that deficits independent of the classification measures emerged for adults with RD on measures of vocabulary, math, spelling, and specific cognitive processes related to naming speed, phonological processing, and verbal memory. The results also showed that studies of adults with high verbal IQs (scores > 100) but low word recognition standard scores (< 90) yielded greater deficits relative to their average-reading counterparts than studies that included adults with RD whose verbal IQ and reading scores were in the same low range. Implications of the findings for assessment and intervention are discussed.
Kim, Heung-Kyu; Lee, Seong Hyeon; Choi, Hyunjoo
2015-01-01
Using an inverse analysis technique, the heat transfer coefficient on the die-workpiece contact surface of a hot stamping process was evaluated as a power law function of contact pressure. This evaluation was to determine whether the heat transfer coefficient on the contact surface could be used for finite element analysis of the entire hot stamping process. By comparing results of the finite element analysis and experimental measurements of the phase transformation, an evaluation was performed to determine whether the obtained heat transfer coefficient function could provide reasonable finite element prediction for workpiece properties affected by the hot stamping process. PMID:28788046
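Recovering such a power-law coefficient from calibration pairs reduces to a log-log least-squares fit, as in this sketch with synthetic numbers:

# Fit h(p) = a * p**b from (pressure, heat transfer coefficient) pairs.
import numpy as np

p = np.array([5., 10., 20., 40.])            # contact pressure, MPa
h = np.array([1.3e3, 2.1e3, 3.4e3, 5.6e3])   # W m^-2 K^-1 (invented values)

b, log_a = np.polyfit(np.log(p), np.log(h), 1)
a = np.exp(log_a)
print(f"h(p) ~ {a:.0f} * p^{b:.2f}")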
Schrem, Harald; Schneider, Valentin; Kurok, Marlene; Goldis, Alon; Dreier, Maren; Kaltenborn, Alexander; Gwinner, Wilfried; Barthold, Marc; Liebeneiner, Jan; Winny, Markus; Klempnauer, Jürgen; Kleine, Moritz
2016-01-01
The aim of this study is to identify independent pre-transplant cancer risk factors after kidney transplantation and to assess the utility of G-chart analysis for clinical process control. This may contribute to the improvement of cancer surveillance processes in individual transplant centers. 1655 patients after kidney transplantation at our institution with a total of 9,425 person-years of follow-up were compared retrospectively to the general German population using site-specific standardized-incidence-ratios (SIRs) of observed malignancies. Risk-adjusted multivariable Cox regression was used to identify independent pre-transplant cancer risk factors. G-chart analysis was applied to determine relevant differences in the frequency of cancer occurrences. Cancer incidence rates were almost three times higher as compared to the matched general population (SIR = 2.75; 95%-CI: 2.33-3.21). Significantly increased SIRs were observed for renal cell carcinoma (SIR = 22.46), post-transplant lymphoproliferative disorder (SIR = 8.36), prostate cancer (SIR = 2.22), bladder cancer (SIR = 3.24), thyroid cancer (SIR = 10.13) and melanoma (SIR = 3.08). Independent pre-transplant risk factors for cancer-free survival were age <52.3 years (p = 0.007, Hazard ratio (HR): 0.82), age >62.6 years (p = 0.001, HR: 1.29), polycystic kidney disease other than autosomal dominant polycystic kidney disease (ADPKD) (p = 0.001, HR: 0.68), high body mass index in kg/m2 (p<0.001, HR: 1.04), ADPKD (p = 0.008, HR: 1.26) and diabetic nephropathy (p = 0.004, HR = 1.51). G-chart analysis identified relevant changes in the detection rates of cancer during aftercare with no significant relation to identified risk factors for cancer-free survival (p<0.05). Risk-adapted cancer surveillance combined with prospective G-chart analysis likely improves cancer surveillance schemes by adapting processes to identified risk factors and by using G-chart alarm signals to trigger Kaizen events and audits for root-cause analysis of relevant detection rate changes. Further, comparative G-chart analysis would enable benchmarking of cancer surveillance processes between centers.
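The two quantities named above can be sketched as follows (conventions assumed: an exact Poisson interval for the SIR and one common geometric parameterization for the G-chart limit; numbers are invented):

import numpy as np
from scipy.stats import chi2

def sir(observed, expected):
    """Standardized incidence ratio with an exact 95% Poisson interval."""
    lo = chi2.ppf(0.025, 2 * observed) / (2 * expected) if observed else 0.0
    hi = chi2.ppf(0.975, 2 * (observed + 1)) / (2 * expected)
    return observed / expected, (lo, hi)

def g_chart_ucl(counts_between_events):
    """Upper limit for a geometric-type chart (one common parameterization)."""
    mean = np.mean(counts_between_events)
    return mean + 3.0 * np.sqrt(mean * (mean + 1.0))

print(sir(31, 11.3))                 # e.g. 31 observed vs 11.3 expected cases
print(g_chart_ucl([4, 9, 2, 7, 5, 12]))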
T-Group and Therapy Group Communication: An Interaction Analysis of the Group Process.
ERIC Educational Resources Information Center
Fisher, B. Aubrey
1979-01-01
Provides an insight into the group process of therapy and compares and contrasts the T-group process with therapy group process. The here-and-now orientation was present in T-group and therapy-group interaction. Greater relational conflict was present in the T-group. Members of the therapy group were much more defensive than members of the…
A Substantive Process Analysis of Responses to Items from the Multistate Bar Examination
ERIC Educational Resources Information Center
Bonner, Sarah M.; D'Agostino, Jerome V.
2012-01-01
We investigated examinees' cognitive processes while they solved selected items from the Multistate Bar Exam (MBE), a high-stakes professional certification examination. We focused on ascertaining those mental processes most frequently used by examinees, and the most common types of errors in their thinking. We compared the relationships between…
The Architecture, Dynamics, and Development of Mental Processing: Greek, Chinese, or Universal?
ERIC Educational Resources Information Center
Demetriou, A.; Kui, Z.X.; Spanoudis, G.; Christou, C.; Kyriakides, L.; Platsidou, M.
2005-01-01
This study compared Greeks with Chinese, from 8 to 14 years of age, on measures of processing efficiency, working memory, and reasoning. All processes were addressed through three domains of relations: verbal/propositional, quantitative, and visuo/spatial. Structural equations modelling and rating scale analysis showed that the architecture and…
Comparative Analysis of Languages for Machine Processing. Interim Report.
ERIC Educational Resources Information Center
Zierer, Ernesto; And Others
This report gives the results obtained in the semantic and syntactic analysis of the Japanese particles "de,""ni,""e," and "wo" in comparison to their equivalents in English, German, and Spanish. The study is based on the so-called "Correlational Analysis" as proposed by Ernst von Glaserfeld. The…
Application of a High-Fidelity Icing Analysis Method to a Model-Scale Rotor in Forward Flight
NASA Technical Reports Server (NTRS)
Narducci, Robert; Orr, Stanley; Kreeger, Richard E.
2012-01-01
An icing analysis process involving the loose coupling of OVERFLOW-RCAS for rotor performance prediction with LEWICE3D for thermal analysis and ice accretion is applied to a model-scale rotor for validation. The process offers high-fidelity rotor analysis for the non-iced and iced rotor performance evaluation that accounts for the interaction of nonlinear aerodynamics with blade elastic deformations. Ice accumulation prediction also involves loosely coupled data exchanges between OVERFLOW and LEWICE3D to produce accurate ice shapes. Validation of the process uses data collected in the 1993 icing test involving Sikorsky's Powered Force Model. Non-iced and iced rotor performance predictions are compared to experimental measurements, as are predicted ice shapes.
Enabling Design for Affordability: An Epoch-Era Analysis Approach
2013-04-01
[Proceedings contents fragments; only paper titles and authors are recoverable:] "Analysis on the DoD Pre-Milestone B Acquisition Processes" (Danielle Worger and Teresa Wu, Arizona State University; Eugene Rex Jalao, Arizona State…); "…Management Best Practices" (Brandon Keller and J. Robert Wirthlin, Air Force Institute of Technology); "The RITE Approach to Agile Acquisition" (Timothy Boyce…); "…Change" (Kathryn Aten and John T. Dillard, Naval Postgraduate School); "A Comparative Assessment of the Navy's Future Naval Capabilities (FNC) Process"
Information theoretic analysis of edge detection in visual communication
NASA Astrophysics Data System (ADS)
Jiang, Bo; Rahman, Zia-ur
2010-08-01
Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the artifacts introduced into the process by the image gathering process. However, experiments show that the image gathering process profoundly impacts the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, where the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. In this paper, we perform an end-to-end information theory based system analysis to assess edge detection methods. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and the parameters, such as sampling, additive noise, etc., that define the image gathering system. The edge detection algorithm is regarded to have high performance only if the information rate from the scene to the edge approaches the maximum possible. This goal can be achieved only by jointly optimizing all processes. People generally use subjective judgment to compare different edge detection methods. There has not been a common tool that can be used to evaluate the performance of the different algorithms and to give people a guide for selecting the best algorithm for a given system or scene. Our information-theoretic assessment becomes this new tool, which allows us to compare the different edge detection operators in a common environment.
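As a concrete instance of an information measure for edge detection, the sketch below computes the mutual information between a reference edge map and a detector output on toy data (the paper's end-to-end channel model, which also covers acquisition and display, is not reproduced):

import numpy as np

def mutual_information(a, b, bins=2):
    """Mutual information (bits) between two discretized images."""
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(5)
truth = (rng.random((128, 128)) > 0.9).astype(int)           # reference edges
noisy = truth ^ (rng.random((128, 128)) > 0.95).astype(int)  # corrupted output
print(mutual_information(truth, noisy), "bits per pixel")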
Augmenting Qualitative Text Analysis with Natural Language Processing: Methodological Study.
Guetterman, Timothy C; Chang, Tammy; DeJonckheere, Melissa; Basu, Tanmay; Scruggs, Elizabeth; Vydiswaran, V G Vinod
2018-06-29
Qualitative research methods are increasingly being used across disciplines because of their ability to help investigators understand the perspectives of participants in their own words. However, qualitative analysis is a laborious and resource-intensive process. To achieve depth, researchers are limited to smaller sample sizes when analyzing text data. One potential method to address this concern is natural language processing (NLP). Qualitative text analysis involves researchers reading data, assigning code labels, and iteratively developing findings; NLP has the potential to automate part of this process. Unfortunately, little methodological research has been done to compare automatic coding using NLP techniques and qualitative coding, which is critical to establish the viability of NLP as a useful, rigorous analysis procedure. The purpose of this study was to compare the utility of a traditional qualitative text analysis, an NLP analysis, and an augmented approach that combines qualitative and NLP methods. We conducted a 2-arm cross-over experiment to compare qualitative and NLP approaches to analyze data generated through 2 text (short message service) message survey questions, one about prescription drugs and the other about police interactions, sent to youth aged 14-24 years. We randomly assigned a question to each of the 2 experienced qualitative analysis teams for independent coding and analysis before receiving NLP results. A third team separately conducted NLP analysis of the same 2 questions. We examined the results of our analyses to compare (1) the similarity of findings derived, (2) the quality of inferences generated, and (3) the time spent in analysis. The qualitative-only analysis for the drug question (n=58) yielded 4 major findings, whereas the NLP analysis yielded 3 findings that missed contextual elements. The qualitative and NLP-augmented analysis was the most comprehensive. For the police question (n=68), the qualitative-only analysis yielded 4 primary findings and the NLP-only analysis yielded 4 slightly different findings. Again, the augmented qualitative and NLP analysis was the most comprehensive and produced the highest quality inferences, increasing our depth of understanding (ie, details and frequencies). In terms of time, the NLP-only approach was quicker than the qualitative-only approach for the drug (120 vs 270 minutes) and police (40 vs 270 minutes) questions. An approach beginning with qualitative analysis followed by qualitative- or NLP-augmented analysis took longer than one beginning with NLP for both the drug (450 vs 240 minutes) and police (390 vs 220 minutes) questions. NLP provides both a foundation to code qualitatively more quickly and a method to validate qualitative findings. NLP methods were able to identify major themes found with traditional qualitative analysis but were not useful in identifying nuances. Traditional qualitative text analysis added important details and context. ©Timothy C Guetterman, Tammy Chang, Melissa DeJonckheere, Tanmay Basu, Elizabeth Scruggs, VG Vinod Vydiswaran. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 29.06.2018.
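A minimal sketch of the kind of NLP pass that can seed qualitative coding, here TF-IDF features with NMF topic extraction rather than the authors' pipeline, with invented responses:

from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import TfidfVectorizer

responses = [
    "took meds from a friend for my headache",
    "police stopped me for no reason near school",
    "pharmacist explained the prescription dosage",
    "officer was polite during the traffic stop",
]

tfidf = TfidfVectorizer(stop_words="english")
X = tfidf.fit_transform(responses)
nmf = NMF(n_components=2, init="nndsvda", random_state=0).fit(X)

terms = tfidf.get_feature_names_out()
for k, row in enumerate(nmf.components_):
    top = [terms[i] for i in row.argsort()[-4:][::-1]]
    print("candidate code", k, ":", top)   # themes a human coder then refines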
Summary and recommendations. [reduced gravitational effects on materials manufactured in space
NASA Technical Reports Server (NTRS)
1975-01-01
An economic analysis using econometric and cost benefit analysis techniques was performed to determine the feasibility of space processing of certain products. The overall objectives of the analysis were (1) to determine specific products or processes uniquely connected with space manufacturing, (2) to select a specific product or process from each of the areas of semiconductors, metals, and biochemicals, and (3) to determine the overall price/cost structure of each product or process considered. The economic elements of the analysis involved a generalized decision making format for analyzing space manufacturing, a comparative cost study of the selected processes in space vs. earth manufacturing, and a supply and demand study of the economic relationships of one of the manufacturing processes. Space processing concepts were explored. The first involved the use of the shuttle as the factory with all operations performed during individual flights. The second concept involved a permanent unmanned space factory which would be launched separately. The shuttle in this case would be used only for maintenance and refurbishment. Finally, some consideration was given to a permanent manned space factory.
A comparative study of the characterization of miR-155 in knockout mice
Zhang, Dong; Cui, Yongchun; Li, Bin; Luo, Xiaokang; Li, Bo; Tang, Yue
2017-01-01
miR-155 is one of the most important miRNAs and plays a very important role in numerous biological processes. However, few studies have characterized this miRNA in mice under normal physiological conditions. We aimed to characterize miR-155 in vivo by using a comparative analysis. In our study, we compared miR-155 knockout (KO) mice with C57BL/6 wild type (WT) mice in order to characterize miR-155 in mice under normal physiological conditions using many evaluation methods, including a reproductive performance analysis, growth curve, ultrasonic estimation, haematological examination, and histopathological analysis. These analyses showed no significant differences between groups in the main evaluation indices. The growth and development were nearly normal for all mice and did not differ between the control and model groups. Using a comparative analysis and a summary of related studies published in recent years, we found that miR-155 was not essential for normal physiological processes in 8-week-old mice. miR-155 deficiency did not affect the development and growth of naturally ageing mice during the 42 days after birth. Thus, studying the complex biological functions of miR-155 requires the further use of KO mouse models. PMID:28278287
Crew Interface Analysis: Selected Articles on Space Human Factors Research, 1987 - 1991
1993-07-01
[Fragmentary excerpt; surrounding text lost:] …(recognitions to that distractor) suggest that the perceptual type of the graph has a strong representation in memory. We found that both training with… processing strategy. If my goal were to compare the value of variables or (possibly) to compare a trend, I would select a perceptual strategy. If… be needed to determine specific processing models for different questions using the perceptual strategy. In addition, predictions about the memory…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shoaf, S.; APS Engineering Support Division
A real-time image analysis system was developed for beam imaging diagnostics. An Apple Power Mac G5 with an Active Silicon LFG frame grabber was used to capture video images that were processed and analyzed. Software routines were created to utilize vector-processing hardware to reduce the time to process images as compared to conventional methods. These improvements allow for more advanced image processing diagnostics to be performed in real time.
State machine analysis of sensor data from dynamic processes
Cook, William R.; Brabson, John M.; Deland, Sharon M.
2003-12-23
A state machine model analyzes sensor data from dynamic processes at a facility to identify the actual processes that were performed at the facility during a period of interest for the purpose of remote facility inspection. An inspector can further input the expected operations into the state machine model and compare the expected, or declared, processes to the actual processes to identify undeclared processes at the facility. The state machine analysis enables the generation of knowledge about the state of the facility at all levels, from location of physical objects to complex operational concepts. Therefore, the state machine method and apparatus may benefit any agency or business with sensored facilities that stores or manipulates expensive, dangerous, or controlled materials or information.
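A minimal sketch of the declared-versus-actual comparison (states and sensor rules are invented for illustration):

# Infer facility states from sensor readings, then flag operations that are
# absent from the declaration.
IDLE, TRANSFER, PROCESSING = "idle", "transfer", "processing"

def infer_state(reading):
    if reading["crane_active"]:
        return TRANSFER
    if reading["power_kw"] > 50:
        return PROCESSING
    return IDLE

def undeclared(readings, declared):
    actual, last = [], None
    for r in readings:
        s = infer_state(r)
        if s != last:                       # record state transitions only
            actual.append(s)
            last = s
    return [s for s in actual if s not in declared]

log = [{"crane_active": False, "power_kw": 10},
       {"crane_active": True,  "power_kw": 12},
       {"crane_active": False, "power_kw": 80}]
print(undeclared(log, declared={IDLE, TRANSFER}))   # -> ['processing']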
Hybrid life-cycle assessment of natural gas based fuel chains for transportation.
Strømman, Anders Hammer; Solli, Christian; Hertwich, Edgar G
2006-04-15
This research compares the use of natural gas, methanol, and hydrogen as transportation fuels. These three fuel chains start with the extraction and processing of natural gas in the Norwegian North Sea and end with final use in Central Europe. The end use is passenger transportation with a sub-compact car that has an internal combustion engine for the natural gas case and a fuel cell for the methanol and hydrogen cases. The life cycle assessment is performed by combining a process based life-cycle inventory with economic input-output data. The analysis shows that the potential climate impacts are lowest for the hydrogen fuel scenario with CO2 deposition. The hydrogen fuel chain scenario has no significant environmental disadvantage compared to the other fuel chains. Detailed analysis shows that the construction of the car contributes significantly to most impact categories. Finally, it is shown how the application of a hybrid inventory model ensures a more complete inventory description compared to standard process-based life-cycle assessment. This is particularly significant for car construction which would have been significantly underestimated in this study using standard process life-cycle assessment alone.
Perlovich, German L; Volkova, Tatyana V; Proshin, Alexey N; Sergeev, Dmitriy Yu; Bui, Cong Trinh; Petrova, Ludmila N; Bachurin, Sergey O
2010-09-01
Novel 1,2,4-thiadiazoles were synthesized. Crystal structures of these compounds were solved by X-ray diffraction experiments, and a comparative analysis of molecular conformational states, packing architecture, and hydrogen-bond networks was carried out. Thermodynamic aspects of the sublimation processes of the studied compounds were determined using temperature dependencies of vapor pressure. Thermophysical characteristics of the molecular crystals were obtained and compared with the sublimation and structural parameters. Solubility and solvation processes of the 1,2,4-thiadiazoles in buffer, n-hexane and n-octanol were studied over wide temperature intervals, and thermodynamic functions were calculated. Specific and nonspecific interactions of the molecules in crystals and solvents were estimated and compared. Distribution processes of the compounds in buffer/n-octanol and buffer/n-hexane systems (describing different types of membranes) were investigated. Analysis of the transfer processes of the studied molecules from the buffer to the n-octanol/n-hexane phases was carried out by the diagram method with evaluation of the enthalpic and entropic terms. This approach allows us to design drug molecules with optimal passive transport properties. Calcium-blocking properties of the substances were evaluated.
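Determining a sublimation enthalpy from vapor-pressure temperature dependencies reduces, in the Clausius-Clapeyron approximation, to a linear fit of ln p against 1/T, as in this sketch with synthetic numbers:

# ln p = A - (dH_sub / R) * (1/T); the slope of ln p vs 1/T gives dH_sub.
import numpy as np

R = 8.314            # J mol^-1 K^-1
T = np.array([310., 320., 330., 340.])            # K
p = np.array([0.012, 0.041, 0.128, 0.371])        # Pa (invented values)

slope, intercept = np.polyfit(1.0 / T, np.log(p), 1)
dH_sub = -slope * R
print(f"Delta H_sub ~ {dH_sub / 1000:.1f} kJ/mol")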
Emergy Analysis for the Sustainable Utilization of Biosolids ...
This contribution describes the application of an emergy-based methodology for comparing two management alternatives for biosolids produced in a wastewater treatment plant. The current management practice of using biosolids as soil fertilizers was evaluated and compared to another alternative, the recovery of energy from the biosolid gasification process. This emergy assessment and comparison approach identifies more sustainable processes which achieve economic and social benefits with a minimal environmental impact. In addition, emergy-based sustainability indicators and the GREENSCOPE methodology were used to compare the two biosolid management alternatives. According to the sustainability assessment results, the energy production from biosolid gasification is energetically profitable, economically viable, and environmentally suitable. Furthermore, it was found that the current use of biosolids as soil fertilizer does not generate any considerable environmental stress and has the potential to achieve more economic benefits, and that post-processing of biosolids prior to their use as soil fertilizer improves their sustainability performance. In conclusion, this emergy analysis provides a sustainability assessment of both alternatives of biosolid management and helps decision-makers to identify opportunities for improvement during the current process of biosolid management. This work aims to identify the best option for the use and management of biosolids generated in a wastewater treatment plant.
Home Environment, Social Status, and Mental Test Performance
ERIC Educational Resources Information Center
Bradley, Robert H.; And Others
1977-01-01
The ability of an environmental process measure and socioeconomic status (SES) measures to predict Stanford-Binet IQ at 3 years of age was compared in a separate analysis by sex and race. The environmental process measure predicted IQ as well as a combination of process and status measures, and was superior to SES measures alone. (Author/CP)
ERIC Educational Resources Information Center
Woldegiorgis, Emnet Tadesse; Jonck, Petronella; Goujon, Anne
2015-01-01
Europe's Bologna Process has been identified as a pioneering approach in regional cooperation with respect to the area of higher education. To address the challenges of African higher education, policymakers are recommending regional cooperation that uses the Bologna Process as a model. Based on these recommendations, the African Union Commission…
Thermal Analysis and Microhardness Mapping in Hybrid Laser Welds in a Structural Steel
2003-01-01
conditions. Via the keyhole, the laser beam brings about easier ignition of the arc, stabilization of the arc welding process, and penetration of the...with respect to the conventional GMAW or GTAW processes without the need for very close fit-up. This paper will compare an autogenous laser weld to a...
Data analysis for GOPEX image frames
NASA Technical Reports Server (NTRS)
Levine, B. M.; Shaik, K. S.; Yan, T.-Y.
1993-01-01
The data analysis based on the image frames received at the Solid State Imaging (SSI) camera of the Galileo Optical Experiment (GOPEX) demonstration conducted between 9-16 Dec. 1992 is described. Laser uplink was successfully established between the ground and the Galileo spacecraft during its second Earth-gravity-assist phase in December 1992. SSI camera frames were acquired which contained images of detected laser pulses transmitted from the Table Mountain Facility (TMF), Wrightwood, California, and the Starfire Optical Range (SOR), Albuquerque, New Mexico. Laser pulse data were processed using standard image-processing techniques at the Multimission Image Processing Laboratory (MIPL) for preliminary pulse identification and to produce public release images. Subsequent image analysis corrected for background noise to measure received pulse intensities. Data were plotted to obtain histograms on a daily basis and were then compared with theoretical results derived from applicable weak-turbulence and strong-turbulence considerations. Processing steps are described and the theories are compared with the experimental results. Quantitative agreement was found in both turbulence regimes, and better agreement would have been found given more received laser pulses. Future experiments should consider methods to reliably measure low-intensity pulses and should plan so that pulse positions can be located geometrically with greater certainty.
Integrated Multi-process Microfluidic Systems for Automating Analysis
Yang, Weichun; Woolley, Adam T.
2010-01-01
Microfluidic technologies have been applied extensively in rapid sample analysis. Some current challenges for standard microfluidic systems are relatively high detection limits, and reduced resolving power and peak capacity compared to conventional approaches. The integration of multiple functions and components onto a single platform can overcome these separation and detection limitations of microfluidics. Multiplexed systems can greatly increase peak capacity in multidimensional separations and can increase sample throughput by analyzing many samples simultaneously. On-chip sample preparation, including labeling, preconcentration, cleanup and amplification, can all serve to speed up and automate processes in integrated microfluidic systems. This paper summarizes advances in integrated multi-process microfluidic systems for automated analysis, their benefits and areas for needed improvement. PMID:20514343
Belmartino, Susana
2014-04-01
This article presents a comparative analysis of the processes leading to health care reform in Argentina and in the USA. The core of the analysis centers on the ideological references utilized by advocates of the reform and the decision-making processes that support or undercut such proposals. The analysis begins with a historical summary of the issue in each country. The political process that led to the sanction of the Obama reform is then described. The text defends the hypothesis that deficiencies in the institutional capacities of Argentina's decision-making bodies are a severe obstacle to attaining substantial changes in this area within the country.
Adaptation of video game UVW mapping to 3D visualization of gene expression patterns
NASA Astrophysics Data System (ADS)
Vize, Peter D.; Gerth, Victor E.
2007-01-01
Analysis of gene expression patterns within an organism plays a critical role in associating genes with biological processes in both health and disease. During embryonic development the analysis and comparison of different gene expression patterns allows biologists to identify candidate genes that may regulate the formation of normal tissues and organs and to search for genes associated with congenital diseases. No two individual embryos, or organs, are exactly the same shape or size so comparing spatial gene expression in one embryo to that in another is difficult. We will present our efforts in comparing gene expression data collected using both volumetric and projection approaches. Volumetric data is highly accurate but difficult to process and compare. Projection methods use UV mapping to align texture maps to standardized spatial frameworks. This approach is less accurate but is very rapid and requires very little processing. We have built a database of over 180 3D models depicting gene expression patterns mapped onto the surface of spline-based embryo models. Gene expression data in different models can easily be compared to determine common regions of activity. Visualization software, in both Java and OpenGL, optimized for viewing 3D gene expression data will also be demonstrated.
NASA Astrophysics Data System (ADS)
Liu, Jian; Bearden, Mark D.; Fernandez, Carlos A.; Fifield, Leonard S.; Nune, Satish K.; Motkuri, Radha K.; Koech, Philip K.; McGrail, B. Pete
2018-03-01
Magnesium (Mg) has many useful applications, especially in the form of various Mg alloys that can decrease weight while increasing strength compared with common steels. To increase affordability and minimize environmental consequences, a novel catalyzed organo-metathetical (COMET) process was proposed to extract Mg from seawater, aiming to achieve a significant reduction in total energy and production cost compared with the molten salt electrolysis method currently adopted by US Mg LLC. A process flow sheet for a reference COMET process was set up using Aspen Plus. The energy consumption, production cost, and CO2 emissions were estimated using the Aspen economic analyzer. Our results showed that it is possible to produce Mg from seawater with a production cost of $2.0/kg-Mg while consuming about 35.6 kWh/kg-Mg and releasing 7.7 kg CO2/kg-Mg. Under the simulated conditions, the reference COMET process maintains a comparable CO2 emission rate, saves about 40% in production cost, and saves about 15% in energy consumption compared with a simplified US Mg process.
Magsonic™ Carbothermal Technology Compared with the Electrolytic and Pidgeon Processes
NASA Astrophysics Data System (ADS)
Prentice, Leon H.; Haque, Nawshad
A broad technology comparison of carbothermal magnesium production with present technologies has not been previously presented. In this paper a comparative analysis of CSIRO's MagSonic™ process is made with the electrolytic and Pidgeon processes. The comparison covers energy intensity (GJ/tonne Mg), labor intensity (person-hours/tonne Mg), capital intensity (USD/tonne annual Mg installed capacity), and Global Warming Potential (GWP, tonnes CO2-equivalent/tonne Mg). Carbothermal technology is advantageous on all measures except capital intensity (where it is roughly twice the capital cost of a similarly-sized Pidgeon plant). Carbothermal and electrolytic production can have comparatively low environmental impacts, with typical emissions one-sixth those of the Pidgeon process. Despite recent progress, the Pidgeon process depends upon abundant energy and labor combined with few environmental constraints. Pressure is expected to increase on environmental constraints and labor and energy costs over the coming decade. Carbothermal reduction technology appears to be competitive for future production.
Hard Choices for Individual Situations.
ERIC Educational Resources Information Center
Landon, Bruce
This paper focuses on faculty use of a decision-making process for complex situations. The analysis part of the process describes and compares course management software focusing on: technical specifications, instructional design values, tools and features, ease of use, and standards compliance. The extensive comparisons provide faculty with…
A comparative analysis of whispered and normally phonated speech using an LPC-10 vocoder
NASA Astrophysics Data System (ADS)
Wilson, J. B.; Mosko, J. D.
1985-12-01
The determination of the performance of an LPC-10 vocoder in the processing of adult male and female whispered and normally phonated connected speech was the focus of this study. The LPC-10 vocoder's analysis of whispered speech compared quite favorably with similar studies which used sound spectrographic processing techniques. Shifting from phonated speech to whispered speech caused a substantial increase in the phonemic formant frequencies and formant bandwidths for both male and female speakers. The data from this study showed no evidence that the LPC-10 vocoder's ability to process voices with pitch extremes and quality extremes was limited in any significant manner. A comparison of the unprocessed natural vowel waveforms and qualities with the synthesized vowel waveforms and qualities revealed almost imperceptible differences. An LPC-10 vocoder's ability to process linguistic and dialectical suprasegmental features such as intonation, rate and stress at low bit rates should be a critical issue of concern for future research.
Tao, Ling; Aden, Andy; Elander, Richard T; Pallapolu, Venkata Ramesh; Lee, Y Y; Garlock, Rebecca J; Balan, Venkatesh; Dale, Bruce E; Kim, Youngmi; Mosier, Nathan S; Ladisch, Michael R; Falls, Matthew; Holtzapple, Mark T; Sierra, Rocio; Shi, Jian; Ebrik, Mirvat A; Redmond, Tim; Yang, Bin; Wyman, Charles E; Hames, Bonnie; Thomas, Steve; Warner, Ryan E
2011-12-01
Six biomass pretreatment processes to convert switchgrass to fermentable sugars and ultimately to cellulosic ethanol are compared on a consistent basis in this technoeconomic analysis. The six pretreatment processes are ammonia fiber expansion (AFEX), dilute acid (DA), lime, liquid hot water (LHW), soaking in aqueous ammonia (SAA), and sulfur dioxide-impregnated steam explosion (SO(2)). Each pretreatment process is modeled in the framework of an existing biochemical design model so that systematic variations of process-related changes are consistently captured. The pretreatment area process design and simulation are based on the research data generated within the Biomass Refining Consortium for Applied Fundamentals and Innovation (CAFI) 3 project. Overall ethanol production, total capital investment, and minimum ethanol selling price (MESP) are reported along with selected sensitivity analysis. The results show limited differentiation between the projected economic performances of the pretreatment options, except for processes that exhibit significantly lower monomer sugar and resulting ethanol yields. Copyright © 2011 Elsevier Ltd. All rights reserved.
Retinal imaging analysis based on vessel detection.
Jamal, Arshad; Hazim Alkawaz, Mohammed; Rehman, Amjad; Saba, Tanzila
2017-07-01
With the increasing advancement of digital imaging and computing power, computationally intelligent technologies are in high demand for use in ophthalmic care and treatment. In the current research, Retina Image Analysis (RIA) is developed for optometrists at the Eye Care Center in Management and Science University. This research aims to analyze the retina through vessel detection. The RIA assists in the analysis of retinal images, and specialists are served with various options like saving, processing and analyzing retinal images through its advanced interface layout. Additionally, RIA assists in the selection of vessel segments, processing these vessels by calculating their diameter, standard deviation, and length, and displaying detected vessels on the retina. The Agile Unified Process is adopted as the methodology in developing this research. To conclude, Retina Image Analysis might help the optometrist to get a better understanding when analyzing the patient's retina. Finally, the Retina Image Analysis procedure is developed using MATLAB (R2011b). Promising results are attained that are comparable with the state of the art. © 2017 Wiley Periodicals, Inc.
Pazesh, Samaneh; Lazorova, Lucia; Berggren, Jonas; Alderborn, Göran; Gråsjö, Johan
2016-09-10
The main purpose of the study was to evaluate various pre-processing and quantification approaches for Raman spectra in order to quantify low levels of amorphous content in milled lactose powder. To improve the quantification analysis, several spectral pre-processing methods were used to adjust for background effects. The effects of spectral noise on the variation of the determined amorphous content were also investigated theoretically by propagation-of-error analysis and were compared to the experimentally obtained values. Additionally, the applicability of a calibration method based on crystalline or amorphous domains to the estimation of amorphous content in milled lactose powder was discussed. Two straight-baseline pre-processing methods gave the best and almost equal performance. Among the succeeding quantification methods, PCA performed best, although classical least squares analysis (CLS) gave comparable results, while peak parameter analysis proved inferior. The standard deviations of the experimentally determined percentage amorphous content were 0.94% and 0.25% for pure crystalline and pure amorphous samples, respectively, which was very close to the standard deviation values from propagated spectral noise. The reasonable conformity between the milled-sample spectra and the synthesized spectra indicated the representativeness of physical mixtures with crystalline or amorphous domains for the estimation of apparent amorphous content in milled lactose. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.
COMPARATIVE EVALUATION OF GC/MS (GAS CHROMATOGRAPHY/MASS SPECTROMETRY) DATA ANALYSIS PROCESSING
Mass spectra obtained by fused silica capillary gas chromatography/mass spectrometry/data system (GC/MS/DS) analysis of mixtures of organic chemicals adsorbed on Tenax GC cartridges were subjected to manual and automated interpretative techniques. Synthetic mixtures (85 chemicals ...
Müller, Jana Annina; Wendt, Dorothea; Kollmeier, Birger; Brand, Thomas
2016-01-01
The aim of this study was to validate a procedure for performing the audio-visual paradigm introduced by Wendt et al. (2015) with reduced practical challenges. The original paradigm records eye fixations using an eye tracker and calculates the duration of sentence comprehension based on a bootstrap procedure. In order to reduce practical challenges, we first reduced the measurement time by evaluating a smaller measurement set with fewer trials. The results of 16 listeners showed effects comparable to those obtained when testing the original full measurement set on a different group of listeners. Secondly, we introduced electrooculography as an alternative technique for recording eye movements. The correlation between the results of the two recording techniques (eye tracker and electrooculography) was r = 0.97, indicating that both methods are suitable for estimating the processing duration of individual participants. Similar changes in processing duration arising from sentence complexity were found using the eye tracker and the electrooculography procedure. Thirdly, the time course of eye fixations was estimated with an alternative procedure, growth curve analysis, which is more commonly used in recent studies analyzing eye tracking data. The results of the growth curve analysis were compared with the results of the bootstrap procedure. Both analysis methods show similar processing durations. PMID:27764125
Novelli, M D; Barreto, E; Matos, D; Saad, S S; Borra, R C
1997-01-01
The authors present the experimental results of the computerized quantification of tissue structures involved in the reparative process of colonic anastomoses performed by manual suture and by biofragmentable ring. The quantified variables in this study were: oedema fluid, myofiber tissue, blood vessels, and cellular nuclei. Image processing software developed at the Laboratório de Informática Dedicado à Odontologia (LIDO) was used to quantify the pathognomonic alterations of the inflammatory process in colonic anastomoses performed in 14 dogs. The results were compared with those obtained through traditional diagnosis by two pathologists as a counterproof measure. The criteria for these diagnoses were defined in levels (absent, light, moderate, and intense), which were compared with the analysis performed by the computer. There was a significant statistical difference between the two techniques: the biofragmentable ring technique exhibited less oedema fluid, organized myofiber tissue, and a higher number of elongated cellular nuclei in relation to the manual suture technique. The analysis of histometric variables through computational image processing was considered efficient and powerful for quantifying the main tissue inflammatory and reparative changes.
Sorting processes with energy-constrained comparisons
NASA Astrophysics Data System (ADS)
Geissmann, Barbara; Penna, Paolo
2018-05-01
We study very simple sorting algorithms based on a probabilistic comparator model. In this model, errors in comparing two elements are due to (1) the energy or effort put in the comparison and (2) the difference between the compared elements. Such algorithms repeatedly compare and swap pairs of randomly chosen elements, and they correspond to natural Markovian processes. The study of these Markov chains reveals an interesting phenomenon. Namely, in several cases, the algorithm that repeatedly compares only adjacent elements is better than the one making arbitrary comparisons: in the long run, the former algorithm produces sequences that are "better sorted". The analysis of the underlying Markov chain poses interesting questions as the latter algorithm yields a nonreversible chain, and therefore its stationary distribution seems difficult to calculate explicitly. We nevertheless provide bounds on the stationary distributions and on the mixing time of these processes in several restrictions.
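A minimal simulation of the adjacent-comparison Markov chain might look as follows; the error model and its parameters are illustrative assumptions, not those of the paper:

```python
# Sketch of the probabilistic comparator model: comparing x and y errs with
# a probability that decreases with the invested energy and with |x - y|.
# Parameter choices are illustrative.
import math, random

def noisy_less(x, y, energy=1.0):
    p_err = 0.5 * math.exp(-energy * abs(x - y))  # error probability
    correct = x < y
    return (not correct) if random.random() < p_err else correct

def adjacent_sort_step(a, energy=1.0):
    """One chain step: compare a random adjacent pair, swap if the (noisy)
    comparator reports them out of order."""
    i = random.randrange(len(a) - 1)
    if noisy_less(a[i + 1], a[i], energy):
        a[i], a[i + 1] = a[i + 1], a[i]

def inversions(a):
    return sum(a[i] > a[j] for i in range(len(a)) for j in range(i + 1, len(a)))

random.seed(0)
a = list(range(20)); random.shuffle(a)
for _ in range(20000):          # run the chain toward its stationary regime
    adjacent_sort_step(a, energy=2.0)
print("remaining inversions in the long run:", inversions(a))
```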
NASA Astrophysics Data System (ADS)
Shulgina, T.; Genina, E.; Gordov, E.; Nikitchuk, K.
2009-04-01
At present, numerous data archives including meteorological observations as well as climate model data are available to Earth science specialists. Methods of mathematical statistics are widely used for their processing and analysis. In many cases they represent the only way to quantitatively assess meteorological and climatic information. A unified set of analysis methods allows us to compare climatic characteristics calculated on the basis of different datasets for the purpose of a more detailed analysis of climate dynamics at both regional and global levels. The report presents the results of a comparative analysis of atmospheric temperature behavior over the territory of Northern Eurasia for the period from 1979 to 2004, based on the NCEP/NCAR Reanalysis, NCEP/DOE Reanalysis AMIP II, JMA/CRIEPI JRA-25 Reanalysis, and ECMWF ERA-40 Reanalysis data and observational data obtained from meteorological stations of the former Soviet Union. Statistical processing of the atmospheric temperature data included analysis of the time-series homogeneity of climate indices approved by WMO, such as "Number of frost days", "Number of summer days", "Number of icing days", "Number of tropical nights", etc., by means of parametric methods of mathematical statistics (Fisher and Student tests). This allowed a comprehensive study of the spatio-temporal features of atmospheric temperature. Analysis of the atmospheric temperature dynamics revealed inhomogeneity of the data obtained for large observation intervals. In particular, the analysis for the period 1979-2004 showed a significant increase in the number of frost days and icing days of approximately 1 day per 2 years, and a decrease of roughly 1 day per 2 years in the number of summer days. It should also be mentioned that the growing-season mean temperature increased by 1.5-2 °C over the period considered. The use of different reanalysis datasets in conjunction with in-situ observations allowed comparison of climate index values calculated on the basis of different datasets, which improves the reliability of the results obtained. Partial support of SB RAS Basic Research Program 4.5.2 (Project 2) is acknowledged.
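As a sketch of how one such WMO index and its trend can be computed, the code below counts frost days (daily minimum temperature below 0 °C) per year; the input is synthetic, standing in for station or reanalysis series:

```python
# Sketch: the WMO "number of frost days" index per year and its linear
# trend. The daily-minimum-temperature array is synthetic.
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(1979, 2005)
# synthetic daily minimum temperatures (365 days/year) with a slight drift
tmin = rng.normal(loc=2.0, scale=10.0, size=(years.size, 365)) \
     - 0.04 * (years - years[0])[:, None]

frost_days = (tmin < 0.0).sum(axis=1)        # index value per year
trend = np.polyfit(years, frost_days, 1)[0]  # days per year
print(f"frost-day trend: {trend:+.2f} days/year")
```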
Dimeric spectra analysis in Microsoft Excel: a comparative study.
Gilani, A Ghanadzadeh; Moghadam, M; Zakerhamidi, M S
2011-11-01
The purpose of this work is to introduce the reader to an Add-in implementation, Decom. This implementation provides all of the processing required for the analysis of dimeric spectra. General linear and nonlinear decomposition algorithms were integrated as an Excel Add-in for easy installation and usage. In this work, the results of several sample investigations were compared to those obtained by Datan. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Goldman, H.; Wolf, M.
1978-01-01
Several experimental and projected Czochralski crystal growing process methods were studied and compared to available operations and cost data of recent production Cz-pulling, in order to elucidate the role of the dominant cost-contributing factors. From this analysis, it becomes apparent that substantial cost reductions can be realized from technical advancements which fall into four categories: an increase in furnace productivity; the reduction of crucible cost through use of the crucible for the equivalent of multiple state-of-the-art crystals; the combined effect of several smaller technical improvements; and a carry-over effect of the expected availability of semiconductor-grade polysilicon at greatly reduced prices. A format for techno-economic analysis of solar cell production processes was developed, called the University of Pennsylvania Process Characterization (UPPC) format. The accumulated Cz process data are presented.
Minor, K S; Willits, J A; Marggraf, M P; Jones, M N; Lysaker, P H
2018-04-25
Conveying information cohesively is an essential element of communication that is disrupted in schizophrenia. These disruptions are typically expressed through disorganized symptoms, which have been linked to neurocognitive, social cognitive, and metacognitive deficits. Automated analysis can objectively assess disorganization within sentences, between sentences, and across paragraphs by comparing explicit communication to a large text corpus. Little work in schizophrenia has tested: (1) links between disorganized symptoms measured via automated analysis and neurocognition, social cognition, or metacognition; and (2) if automated analysis explains incremental variance in cognitive processes beyond clinician-rated scales. Disorganization was measured in schizophrenia (n = 81) with Coh-Metrix 3.0, an automated program that calculates basic and complex language indices. Trained staff also assessed neurocognition, social cognition, metacognition, and clinician-rated disorganization. Findings showed that all three cognitive processes were significantly associated with at least one automated index of disorganization. When automated analysis was compared with a clinician-rated scale, it accounted for significant variance in neurocognition and metacognition beyond the clinician-rated measure. When combined, these two methods explained 28-31% of the variance in neurocognition, social cognition, and metacognition. This study illustrated how automated analysis can highlight the specific role of disorganization in neurocognition, social cognition, and metacognition. Generally, those with poor cognition also displayed more disorganization in their speech, making it difficult for listeners to process essential information needed to tie the speaker's ideas together. Our findings showcase how implementing a mixed-methods approach in schizophrenia can explain substantial variance in cognitive processes.
Conceptual design and structural analysis for an 8.4-m telescope
NASA Astrophysics Data System (ADS)
Mendoza, Manuel; Farah, Alejandro; Ruiz Schneider, Elfego
2004-09-01
This paper describes the conceptual design of the optics support structures of a telescope with a primary mirror of 8.4 m, the same size as the Large Binocular Telescope (LBT) primary mirror. The design goal is to achieve a structure that supports the primary and secondary mirrors and keeps them joined as rigidly as possible. For this purpose an optimization over several models was performed. This iterative design process includes specifications development, concept generation, and evaluation. The process included Finite Element Analysis (FEA) as well as other analytical calculations. A Quality Function Deployment (QFD) matrix was used to obtain telescope tube and spider specifications. Eight spider and eleven tube geometric concepts were proposed. They were compared in decision matrices using performance indicators and parameters. Tubes and spiders underwent an iterative optimization process. The best tube and spider concepts were assembled together. All assemblies were compared and ranked according to their performance.
Influence of plasma shock wave on the morphology of laser drilling in different environments
NASA Astrophysics Data System (ADS)
Zhai, Zhaoyang; Wang, Wenjun; Mei, Xuesong; Wang, Kedian; Yang, Huizhu
2017-05-01
A nanosecond pulsed laser was used to study drilling of a nickel-based alloy and to compare the resulting microholes in air and water environments. The comparison showed that the environmental medium has a marked influence on the morphology of laser-drilled holes. A high-speed camera captured the plasma morphology during the drilling process, theoretical formulas were used to calculate the plasma boundary dimensions and the shock wave velocity, and these parameters were then fed into computational fluid dynamics simulation software to obtain solutions. The simulation results intuitively explain, from the perspective of plasma shock waves, the different morphological features and their formation causes observed when drilling in air versus water. Comparing the simulation and experimental results helps clarify the formation mechanism of the microhole morphology, providing a basis for further optimization of laser drilling quality.
The Communication Model Perspective of Oral Interpretation.
ERIC Educational Resources Information Center
Peterson, Eric E.
Communication models suggest that oral interpretation is a communicative process, that this process may be represented by specification of implicit and explicit content and structure, and that the models themselves are useful. This paper examines these assumptions through a comparative analysis of communication models employed by oral…
ETO - ENGINEERING TRADE-OFFS (SYSTEMS ANALYSIS BRANCH, SUSTAINABLE TECHNOLOGY DIVISION, NRMRL)
The ETO - Engineering Trade-Offs program aims to develop a new, integrated decision-making approach to compare/contrast two or more states of being: a benchmark and an alternative, a change in a production process, alternative processes or products. ETO highlights the difference in...
Cappozzo, Jack C; Koutchma, Tatiana; Barnes, Gail
2015-08-01
As a result of growing interest in nonthermal processing of milk, the purpose of this study was to characterize the chemical changes in raw milk composition after exposure to a new nonthermal turbulent flow UV process, a conventional thermal pasteurization process (high-temperature, short-time; HTST), and their combinations, and to compare those changes with commercially UHT-treated milk. Raw milk was exposed to UV light in turbulent flow at a flow rate of 4,000 L/h and applied doses of 1,045 and 2,090 J/L, to HTST pasteurization, and to HTST in combination with UV (before or after the UV). Unprocessed raw milk, HTST-treated milk, and UHT-treated milk were the controls for the milk processed with the continuous turbulent flow UV treatment. The chemical characterization included component analysis, fatty acid composition (with emphasis on conjugated linoleic acid), and analysis for vitamins D and A and volatile components. Lipid oxidation, which is an indicator of oxidative rancidity, was evaluated by free fatty acid analysis, and the volatile components (extracted organic fraction) by gas chromatography-mass spectrometry to obtain mass spectral profiles. These analyses were done over a 14-d period (initially after treatment and at 7 and 14 d) because of the extended shelf-life requirement for milk. The effect of UV light on proteins (i.e., casein or lactalbumin) was evaluated qualitatively by sodium dodecyl sulfate-PAGE. The milk or liquid soluble fraction was analyzed by sodium dodecyl sulfate-PAGE for changes in the protein profile. From this study, it appears that continuous turbulent flow UV processing, whether used as a single process or in combination with HTST, did not cause any statistically significant chemical changes when compared with raw milk with regard to the proximate analysis (total fat, protein, moisture, or ash), the fatty acid profile, lipid oxidation with respect to volatile analysis, or the protein profile. A 56% loss of vitamin D and a 95% loss of vitamin A content were noted after 7 d of the continuous turbulent flow UV processing, but these losses were equally comparable to those found with traditional thermal processing, such as HTST and UHT. Chemical characterization of milk showed that turbulent flow UV light technology can be considered as an alternative nonthermal treatment of pasteurized milk and raw milk to extend shelf life. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
Trautmann-Lengsfeld, Sina Alexa; Domínguez-Borràs, Judith; Escera, Carles; Herrmann, Manfred; Fehr, Thorsten
2013-01-01
A recent functional magnetic resonance imaging (fMRI) study by our group demonstrated that dynamic emotional faces are more accurately recognized and evoke more widespread patterns of hemodynamic brain responses than static emotional faces. Based on this experimental design, the present study aimed at investigating the spatio-temporal processing of static and dynamic emotional facial expressions in 19 healthy women by means of multi-channel electroencephalography (EEG), event-related potentials (ERP) and fMRI-constrained regional source analyses. ERP analysis showed an increased amplitude of the LPP (late posterior positivity) over centro-parietal regions for static facial expressions of disgust compared to neutral faces. In addition, the LPP was more widespread and temporally prolonged for dynamic compared to static faces of disgust and happiness. fMRI-constrained source analysis of static emotional face stimuli indicated the spatio-temporal modulation of predominantly posterior regional brain activation related to the visual processing stream for both emotional valences when compared to the neutral condition in the fusiform gyrus. The spatio-temporal processing of dynamic stimuli yielded enhanced source activity for emotional compared to neutral conditions in temporal (e.g., fusiform gyrus) and frontal regions (e.g., ventromedial prefrontal cortex, medial and inferior frontal cortex) in early and again in later time windows. The present data support the view that dynamic facial displays trigger richer information processing, reflected in complex neural networks, in particular because their changing features potentially trigger sustained activation related to a continuing evaluation of those faces. A combined fMRI and EEG approach thus provides advanced insight into the spatio-temporal characteristics of emotional face processing, by also revealing additional neural generators not identifiable using an fMRI approach alone. PMID:23818974
Zhuo, Yang; Han, Yun; Qu, Qiliang; Cao, Yuqin; Peng, Dangcong; Li, Yuyou
2018-08-01
The feasibility of ammonia pre-separation during the thermal-alkaline pretreatment (TAP) of waste activated sludge was evaluated to mitigate ammonia inhibition during high solid anaerobic digestion (HSAD). The results showed that the TAP increased the organics hydrolysis rate by as much as 77% compared to the thermal hydrolysis pretreatment (THP). The production and separation of ammonia during the TAP exhibited a linear relationship with the hydrolysis of organics and followed the Emerson model. The pre-separation ratio of the free ammonia nitrogen exceeded 98.00% at a lime dosage exceeding 0.021 g CaO/g TS. However, the separation ratio of the total ammonia nitrogen (TAN) was hindered by its production ratio. Compared to the THP, the TAP increased the methane production rate at a similar production yield. A mass flow analysis indicated that the TAP-HSAD process reduced the volume of the digester compared to the THP-HSAD process, and the recirculated HSAD-TAP process recovered 45% of the nitrogen in the waste activated sludge. Copyright © 2018 Elsevier Ltd. All rights reserved.
(abstract) Generic Modeling of a Life Support System for Process Technology Comparisons
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support systems and process technology options for a Lunar Base and a Mars Exploration Mission.
NASA Technical Reports Server (NTRS)
An, S. H.; Yao, K.
1986-01-01
The lattice algorithm has been employed in numerous adaptive filtering applications such as speech analysis/synthesis, noise canceling, spectral analysis, and channel equalization. In this paper its application to adaptive-array processing is discussed. The advantages are a fast convergence rate as well as computational accuracy independent of the noise and interference conditions. The results produced by this technique are compared to those obtained by the direct matrix inverse method.
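The paper's lattice algorithm is not reproduced here; as a simpler stand-in, the sketch below uses a normalized LMS update to illustrate the adaptive noise-canceling setup that such algorithms address (all signals, the path, and the parameters are synthetic assumptions):

```python
# Normalized LMS noise canceller (a simpler stand-in for the lattice
# algorithm): reference noise passes through an unknown path into the
# primary channel; the adaptive filter learns to subtract it.
import numpy as np

rng = np.random.default_rng(0)
n, taps, mu = 5000, 8, 0.5
signal = np.sin(0.05 * np.arange(n))             # desired signal
noise = rng.normal(size=n)                       # reference noise input
path = np.array([0.6, -0.3, 0.1, 0.05, 0.0, 0.0, 0.0, 0.0])
primary = signal + np.convolve(noise, path)[:n]  # signal + filtered noise

w, out = np.zeros(taps), np.zeros(n)
for k in range(taps - 1, n):
    x = noise[k - taps + 1:k + 1][::-1]          # newest sample first
    e = primary[k] - w @ x                       # canceller output = error
    w += mu * e * x / (1e-8 + x @ x)             # normalized LMS update
    out[k] = e
print("residual power:", np.mean((out[1000:] - signal[1000:]) ** 2))
```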
ERIC Educational Resources Information Center
Gulaliyev, Mayis G.; Ok, Nuri I.; Musayeva, Fargana Q.; Efendiyev, Rufat J.; Musayeva, Jamila Q.; Agayeva, Samira R.
2016-01-01
The aim of the article is to study the nature of liberalization as a specific economic process, which is formed and developed under the influence of the changing conditions of the globalization and integration processes in the society, as well as to identify the characteristic differences in the processes of liberalization of Turkey and Azerbaijan…
ERIC Educational Resources Information Center
Rimoldi, Horacio J. A.
The study of problem solving is made through the analysis of the process that leads to the final answer. The type of information obtained through the study of the process is compared with the information obtained by studying the final answer. The experimental technique used permits identification of the sequence of questions (tactics) that subjects ask…
NASA Technical Reports Server (NTRS)
Calle, Luz Marina; Hintze, Paul E.; Parlier, Christopher R.; Coffman, Brekke E.; Kolody, Mark R.; Curran, Jerome P.; Trejo, David; Reinschmidt, Ken; Kim, Hyung-Jin
2009-01-01
A 20-year life cycle cost analysis was performed to compare the operational life cycle cost, processing/turnaround timelines, and operations manpower inspection/repair/refurbishment requirements for corrosion protection of the Kennedy Space Center launch pad flame deflector associated with the existing cast-in-place materials and a newer advanced refractory ceramic material. The analysis compared the estimated costs of (1) continuing to use the current refractory material without any changes; (2) completely reconstructing the flame trench using the current refractory material; and (3) completely reconstructing the flame trench with a new high-performance refractory material. Cost estimates were based on an analysis of the amount of damage that occurs after each launch and an estimate of the average repair cost. Alternative 3 was found to save $32M compared to alternative 1 and $17M compared to alternative 2 over a 20-year life cycle.
Trujillo, Carlos; Garcia-Sucerquia, Jorge
2015-06-01
A comparative analysis of the performance of the modified enclosed energy (MEE) method for self-focusing holograms recorded with digital lensless holographic microscopy is presented. Although the MEE analysis has been published previously, no extended analysis of its performance has been reported. We have tested the MEE in terms of the minimum axial distance allowed between the reconstructed holograms used to search for the focal plane, and the elapsed time to obtain the focused image. These parameters have been compared with those of some of the methods already reported in the literature. The MEE achieves better results in terms of self-focusing quality, but at a higher computational cost. Despite its longer processing time, the method remains within a time frame that is technologically attractive. Modeled and experimental holograms have been utilized in this work to perform the comparative study.
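The MEE criterion itself is not reproduced here; the sketch below only shows the generic scan-and-score structure that any such self-focusing method follows, with a common normalized-variance metric as a stand-in and the reconstruction left as a user-supplied callable (the spacing of the candidate distances corresponds to the axial step studied in the paper):

```python
# Generic self-focusing scaffold: reconstruct at candidate axial distances
# and keep the distance that optimizes a focus metric. Normalized variance
# is a common stand-in for the paper's MEE criterion.
import numpy as np

def normalized_variance(field):
    amp = np.abs(field)
    return amp.var() / (amp.mean() ** 2 + 1e-12)

def autofocus(hologram, reconstruct, z_candidates, metric=normalized_variance):
    """reconstruct(hologram, z) -> complex field at axial distance z."""
    scores = [metric(reconstruct(hologram, z)) for z in z_candidates]
    k = int(np.argmax(scores))
    return z_candidates[k], scores[k]

# usage (angular_spectrum is a hypothetical propagator supplied by the user):
# z_best, score = autofocus(h, angular_spectrum, np.arange(1e-3, 5e-3, 5e-5))
```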
Random Process Simulation for stochastic fatigue analysis. Ph.D. Thesis - Rice Univ., Houston, Tex.
NASA Technical Reports Server (NTRS)
Larsen, Curtis E.
1988-01-01
A simulation technique is described which directly synthesizes the extrema of a random process and is more efficient than the Gaussian simulation method. Such a technique is particularly useful in stochastic fatigue analysis because the required stress range moment, E(R^m), is a function only of the extrema of the random stress process. The family of autoregressive moving average (ARMA) models is reviewed and an autoregressive model is presented for modeling the extrema of any random process which has a unimodal power spectral density (psd). The proposed autoregressive technique is found to produce rainflow stress range moments which compare favorably with those computed by the Gaussian technique and to average 11.7 times faster than the Gaussian technique. The autoregressive technique is also adapted for processes having bimodal psd's. The adaptation involves using two autoregressive processes to simulate the extrema due to each mode and the superposition of these two extrema sequences. The proposed autoregressive superposition technique is 9 to 13 times faster than the Gaussian technique and produces comparable values of E(R^m) for bimodal psd's.
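A toy version of the extrema-synthesis idea follows: an AR(1) sequence with a negative coefficient (so that successive values alternate in sign, like maxima and minima) stands in for the extrema process, and range moments are estimated from successive extrema; the coefficient is illustrative, not a value fitted in the thesis:

```python
# Toy extrema synthesis: AR(1) with negative phi alternates, mimicking a
# max/min/max/... sequence; E(R^m) is estimated from successive extrema.
import numpy as np

rng = np.random.default_rng(42)
n, phi = 100_000, -0.7
x = np.zeros(n)
for k in range(1, n):                 # AR(1): x[k] = phi*x[k-1] + white noise
    x[k] = phi * x[k - 1] + rng.normal()

ranges = np.abs(np.diff(x))           # stress ranges between successive extrema
for m in (1, 2, 3):
    print(f"E(R^{m}) ~ {np.mean(ranges ** m):.2f}")
```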
Technical and economic analysis of solvent-based lithium-ion electrode drying with water and NMP
Wood, David L.; Quass, Jeffrey D.; Li, Jianlin; ...
2017-05-16
Processing lithium-ion battery (LIB) electrode dispersions with water as the solvent during primary drying offers many advantages over N-methylpyrrolidone (NMP). An in-depth analysis of the comparative drying costs of LIB electrodes is discussed for both NMP- and water-based dispersion processing in terms of battery pack $/kWh. Electrode coating manufacturing and capital equipment cost savings are compared for water vs. conventional NMP organic solvent processing. A major finding of this work is that the total electrode manufacturing costs, whether water- or NMP-based, contribute about 8–9% of the total pack cost. However, it was found that up to a 2× reduction in electrode processing (drying and solvent recovery) cost can be expected, along with a $3–6M savings in associated plant capital equipment (for a plant producing 100,000 10-kWh Plug-in Hybrid Electric Vehicle (PHEV) batteries), using water as the electrode solvent. This paper shows a different perspective in that the most important benefits of aqueous electrode processing actually revolve around capital equipment savings and environmental stewardship and not processing cost savings.
Inferring Group Processes from Computer-Mediated Affective Text Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schryver, Jack C; Begoli, Edmon; Jose, Ajith
2011-02-01
Political communications in the form of unstructured text convey rich connotative meaning that can reveal underlying group social processes. Previous research has focused on sentiment analysis at the document level, but we extend this analysis to sub-document levels through a detailed analysis of affective relationships between entities extracted from a document. Instead of pure sentiment analysis, which is just positive or negative, we explore nuances of affective meaning in 22 affect categories. Our affect propagation algorithm automatically calculates and displays extracted affective relationships among entities in graphical form in our prototype (TEAMSTER), starting with seed lists of affect terms. Several useful metrics are defined to infer underlying group processes by aggregating affective relationships discovered in a text. Our approach has been validated with annotated documents from the MPQA corpus, achieving a performance gain of 74% over comparable random guessers.
NASA Technical Reports Server (NTRS)
Ulbrich, N.; Volden, T.
2018-01-01
Analysis and use of temperature-dependent wind tunnel strain-gage balance calibration data are discussed in the paper. First, three different methods are presented and compared that may be used to process temperature-dependent strain-gage balance data. The first method uses an extended set of independent variables in order to process the data and predict balance loads. The second method applies an extended load iteration equation during the analysis of balance calibration data. The third method uses temperature-dependent sensitivities for the data analysis. Physical interpretations of the most important temperature-dependent regression model terms are provided that relate temperature compensation imperfections and the temperature-dependent nature of the gage factor to sets of regression model terms. Finally, balance calibration recommendations are listed so that temperature-dependent calibration data can be obtained and successfully processed using the reviewed analysis methods.
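As a sketch of the first method (temperature added to the set of independent variables), the code below fits loads to gage output, temperature, and their interaction by ordinary least squares; the data and coefficients are synthetic placeholders, not calibration values from the paper:

```python
# Sketch: regression with an extended set of independent variables, where
# temperature and a gage-temperature interaction join the model terms.
# All data are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(3)
n = 200
rG = rng.uniform(-1, 1, n)             # strain-gage output
T = rng.uniform(10, 40, n)             # sensor temperature, deg C
load = 100 * rG + 0.5 * T + 2.0 * rG * T + rng.normal(0, 0.1, n)

X = np.column_stack([np.ones(n), rG, T, rG * T])  # regression model terms
coef, *_ = np.linalg.lstsq(X, load, rcond=None)
print("fitted coefficients:", np.round(coef, 2))
```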
Some comments on Hurst exponent and the long memory processes on capital markets
NASA Astrophysics Data System (ADS)
Sánchez Granero, M. A.; Trinidad Segovia, J. E.; García Pérez, J.
2008-09-01
The analysis of long memory processes in capital markets has been one of the topics in finance, since the existence of market memory could imply the rejection of the efficient market hypothesis. The study of these processes in finance is carried out through the Hurst exponent, and the most classical method applied is R/S analysis. In this paper we discuss the efficiency of this methodology, as well as some of its more important modifications, for detecting long memory. We also propose the application of a classical geometrical method with slight modifications, and we compare both approaches.
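A compact version of the classical R/S estimate reads as follows; the window sizes and test series are illustrative, and for an i.i.d. series the estimated exponent should come out near 0.5:

```python
# Compact rescaled-range (R/S) estimate of the Hurst exponent.
import numpy as np

def rs(series, n):
    """Mean R/S statistic over non-overlapping windows of length n."""
    blocks = series[: len(series) // n * n].reshape(-1, n)
    dev = blocks - blocks.mean(axis=1, keepdims=True)
    z = np.cumsum(dev, axis=1)                       # mean-adjusted partial sums
    R = z.max(axis=1) - z.min(axis=1)                # range of the partial sums
    S = blocks.std(axis=1)                           # per-window std deviation
    return np.mean(R / S)

rng = np.random.default_rng(0)
returns = rng.normal(size=2 ** 14)                   # i.i.d. -> H near 0.5
sizes = np.array([2 ** k for k in range(4, 11)])
stats = np.array([rs(returns, n) for n in sizes])
H = np.polyfit(np.log(sizes), np.log(stats), 1)[0]   # slope of log R/S vs log n
print(f"estimated Hurst exponent: {H:.2f}")
```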
ERIC Educational Resources Information Center
Shelina, S. L.; Mitina, O. V.
2015-01-01
The article presents the results of an analysis of the moral value judgments of adults (parents, teachers, educators) that directly concern the socialization process of the young generation in the modern metropolis. This paper follows the model study by Jean Piaget that investigated the moral value judgments of children. A comparative analysis of…
Error minimization algorithm for comparative quantitative PCR analysis: Q-Anal.
O'Connor, William; Runquist, Elizabeth A
2008-07-01
Current methods for comparative quantitative polymerase chain reaction (qPCR) analysis, the threshold and extrapolation methods, either make assumptions about PCR efficiency that require an arbitrary threshold selection process or extrapolate to estimate relative levels of messenger RNA (mRNA) transcripts. Here we describe an algorithm, Q-Anal, that blends elements from current methods to bypass assumptions regarding PCR efficiency and improve the threshold selection process to minimize error in comparative qPCR analysis. This algorithm uses iterative linear regression to identify the exponential phase for both target and reference amplicons and then selects, by minimizing linear regression error, a fluorescence threshold where efficiencies for both amplicons have been defined. From this defined fluorescence threshold, cycle time (Ct) and the error for both amplicons are calculated and used to determine the expression ratio. Ratios in complementary DNA (cDNA) dilution assays from qPCR data were analyzed by the Q-Anal method and compared with the threshold method and an extrapolation method. Dilution ratios determined by the Q-Anal and threshold methods were 86 to 118% of the expected cDNA ratios, but relative errors for the Q-Anal method were 4 to 10% in comparison with 4 to 34% for the threshold method. In contrast, ratios determined by an extrapolation method were 32 to 242% of the expected cDNA ratios, with relative errors of 67 to 193%. Q-Anal will be a valuable and quick method for minimizing error in comparative qPCR analysis.
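The published Q-Anal code is not reproduced here; the sketch below only illustrates the underlying idea on synthetic curves: scan for each amplicon's best log-linear window, place a common fluorescence threshold inside both fitted exponential phases, and derive the ratio from the fitted Ct values and efficiencies:

```python
# Sketch of the idea (not the published Q-Anal code); curves are synthetic.
import numpy as np

def fit_exp_phase(fluor, width=6):
    """Best log-linear window: returns (slope, intercept, start_cycle)."""
    logf, cycles = np.log2(fluor), np.arange(len(fluor))
    best, best_err = None, np.inf
    for s in range(len(fluor) - width):
        c, y = cycles[s:s + width], logf[s:s + width]
        coef = np.polyfit(c, y, 1)
        err = np.sum((y - np.polyval(coef, c)) ** 2)
        if coef[0] > 0.1 and err < best_err:    # require real amplification
            best, best_err = (coef[0], coef[1], s), err
    return best

def curve(f0, eff, cycles=40, plateau=1.0):     # F = F0 (1+eff)^c, clipped
    return np.minimum(f0 * (1 + eff) ** np.arange(cycles), plateau) + 1e-6

target, ref = curve(1e-7, 0.92), curve(4e-7, 0.95)
(bt, at, _), (br, ar, _) = fit_exp_phase(target), fit_exp_phase(ref)
thr = -10.0                                     # log2 threshold in both phases
ct_t, ct_r = (thr - at) / bt, (thr - ar) / br   # cycles where fits cross thr
ratio = 2 ** (br * ct_r) / 2 ** (bt * ct_t)     # (1+E_r)^Ct_r / (1+E_t)^Ct_t
print(f"target/reference ratio: {ratio:.2f} (true 0.25)")
```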
Comparative analysis of gene regulatory networks: from network reconstruction to evolution.
Thompson, Dawn; Regev, Aviv; Roy, Sushmita
2015-01-01
Regulation of gene expression is central to many biological processes. Although reconstruction of regulatory circuits from genomic data alone is therefore desirable, this remains a major computational challenge. Comparative approaches that examine the conservation and divergence of circuits and their components across strains and species can help reconstruct circuits as well as provide insights into the evolution of gene regulatory processes and their adaptive contribution. In recent years, advances in genomic and computational tools have led to a wealth of methods for such analysis at the sequence, expression, pathway, module, and entire network level. Here, we review computational methods developed to study transcriptional regulatory networks using comparative genomics, from sequence to functional data. We highlight how these methods use evolutionary conservation and divergence to reliably detect regulatory components as well as estimate the extent and rate of divergence. Finally, we discuss the promise and open challenges in linking regulatory divergence to phenotypic divergence and adaptation.
Drop-on-Demand Single Cell Isolation and Total RNA Analysis
Moon, Sangjun; Kim, Yun-Gon; Dong, Lingsheng; Lombardi, Michael; Haeggstrom, Edward; Jensen, Roderick V.; Hsiao, Li-Li; Demirci, Utkan
2011-01-01
Technologies that rapidly isolate viable single cells from heterogeneous solutions have significantly contributed to the field of medical genomics. Challenges remain both to enable efficient extraction, isolation and patterning of single cells from heterogeneous solutions as well as to keep them alive during the process due to a limited degree of control over single cell manipulation. Here, we present a microdroplet-based method to isolate and pattern single cells from heterogeneous cell suspensions (10% target cell mixture), preserve viability of the extracted cells (97.0±0.8%), and obtain genomic information from isolated cells compared to the non-patterned controls. The cell encapsulation process is both experimentally and theoretically analyzed. Using the isolated cells, we identified 11 stem cell markers among 1000 genes and compared them to the controls. This automated platform enabling high-throughput cell manipulation for subsequent genomic analysis employs fewer handling steps compared to existing methods. PMID:21412416
Mammogram registration using the Cauchy-Navier spline
NASA Astrophysics Data System (ADS)
Wirth, Michael A.; Choi, Christopher
2001-07-01
The process of comparative analysis involves inspecting mammograms for characteristic signs of potential cancer by comparing various analogous mammograms. Factors such as the deformable behavior of the breast, changes in breast positioning, and the amount/geometry of compression may contribute to spatial differences between corresponding structures in corresponding mammograms, thereby significantly complicating comparative analysis. Mammogram registration is a process whereby spatial differences between mammograms can be reduced. Presented in this paper is a nonrigid approach to matching corresponding mammograms based on a physical registration model. Many of the earliest approaches to mammogram registration used spatial transformations which were innately rigid or affine in nature. More recently algorithms have incorporated radial basis functions such as the Thin-Plate Spline to match mammograms. The approach presented here focuses on the use of the Cauchy-Navier Spline, a deformable registration model which offers approximate nonrigid registration. The utility of the Cauchy-Navier Spline is illustrated by matching both temporal and bilateral mammograms.
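scipy has no Cauchy-Navier kernel, so the sketch below uses the thin-plate spline (the earlier approach mentioned above) merely to illustrate the landmark-based warping mechanics; all coordinates are synthetic:

```python
# Landmark-based nonrigid mapping with a thin-plate spline, as a stand-in
# for the Cauchy-Navier Spline; control points are synthetic placeholders.
import numpy as np
from scipy.interpolate import RBFInterpolator

# corresponding control points in the two mammograms (e.g. nipple, skin line)
src = np.array([[10., 10.], [80., 15.], [50., 60.], [20., 90.], [85., 85.]])
dst = src + np.array([[2., 1.], [1., 3.], [4., 2.], [0., 5.], [3., 1.]])

warp = RBFInterpolator(src, dst, kernel="thin_plate_spline")
print(warp(np.array([[40., 40.]])))  # where a point in image 1 lands in image 2
```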
Trautz, Florian; Dreßler, Jan; Stassart, Ruth; Müller, Wolf; Ondruschka, Benjamin
2018-01-03
Immunohistochemistry (IHC) has become an integral part of forensic histopathology over the last decades. However, the underlying methods for IHC vary greatly depending on the institution, creating a lack of comparability. The aim of this study was to assess the optimal approach for different technical aspects of IHC, in order to improve and standardize this procedure. Therefore, qualitative results from manual and automatic IHC staining of brain samples were compared, as well as potential differences in the suitability of common IHC glass slides. Further, possibilities for image digitization and related issues were investigated. In our study, automatic staining showed more consistent staining results compared to manual staining procedures. Digitization and digital post-processing considerably facilitated direct analysis and reproducibility analysis. No differences were found between commercially available microscope glass slides regarding suitability for IHC brain research, but a certain rate of tissue loss should be expected during the staining process.
Mazaro, José Vitor Quinelli; Gennari Filho, Humberto; Vedovatto, Eduardo; Amoroso, Andressa Paschoal; Pellizzer, Eduardo Piza; Zavanelli, Adriana Cristina
2011-09-01
The purpose of this study was to compare the dental movement that occurs during the processing of maxillary complete dentures with 3 different base thicknesses, using 2 investment methods and microwave polymerization. A sample of 42 denture models was randomly divided into 6 groups (n = 7), with base thicknesses of 1.25, 2.50, and 3.75 mm and gypsum or silicone flask investment. Points were demarcated on the distal surface of the second molars and on the back of the gypsum cast at the alveolar ridge level to allow linear and angular measurement using AutoCAD software. The data were subjected to two-factor analysis of variance and Tukey and Fisher post hoc tests. Angular analysis of the methods and their interactions showed a statistical difference (P = 0.023) when the magnitudes of molar inclination were compared. Tooth movement was greater for thin-base prostheses (1.25 mm; -0.234) than for thick ones (3.75 mm; 0.2395), with antagonistic behavior. Prosthesis investment with silicone (0.053) showed greater vertical change compared with the gypsum investment (0.032). There was a difference between the points of analysis, demonstrating that the changes were not symmetric. All groups evaluated showed changes in the position of the artificial teeth after processing. The complete denture with a thin base (1.25 mm) and silicone investment showed the worst results, whereas the intermediate thickness (2.50 mm) was demonstrated to be ideal for the denture base.
Gómez, Eduardo J
2014-05-20
This article proposes an approach to comparing and assessing the adaptive capacity of multilateral health agencies in meeting country and individual healthcare needs. Most studies comparing multilateral health agencies have failed to clearly propose a method for conducting agency comparisons. This study conducted a qualitative case study methodological approach, such that secondary and primary case study literature was used to conduct case study comparisons of multilateral health agencies. Through the proposed Sequential Comparative Analysis (SCA), the author found a more effective way to justify the selection of cases, compare and assess organizational transformative capacity, and to learn from agency success in policy sustainability processes. To more effectively understand and explain why some multilateral health agencies are more capable of adapting to country and individual healthcare needs, SCA provides a methodological approach that may help to better understand why these agencies are so different and what we can learn from successful reform processes. As funding challenges continue to hamper these agencies' adaptive capacity, learning from each other will become increasingly important.
Illeghems, Koen; De Vuyst, Luc; Weckx, Stefan
2013-08-01
Acetobacter pasteurianus 386B, an acetic acid bacterium originating from a spontaneous cocoa bean heap fermentation, proved to be an ideal functional starter culture for cocoa bean fermentations. It is able to dominate the fermentation process, thereby resisting high acetic acid concentrations and temperatures. However, the molecular mechanisms underlying its metabolic capabilities and niche adaptations are unknown. In this study, whole-genome sequencing and comparative genome analysis were used to investigate this strain's mechanisms to dominate the cocoa bean fermentation process. The genome sequence of A. pasteurianus 386B is composed of a 2.8-Mb chromosome and seven plasmids. The annotation of 2875 protein-coding sequences revealed important characteristics, including several metabolic pathways, the occurrence of strain-specific genes such as an endopolygalacturonase, and the presence of mechanisms involved in tolerance towards various stress conditions. Furthermore, the low number of transposases in the genome and the absence of complete phage genomes indicate that this strain might be more genetically stable compared with other A. pasteurianus strains, which is an important advantage for the use of this strain as a functional starter culture. Comparative genome analysis with other members of the Acetobacteraceae confirmed the functional properties of A. pasteurianus 386B, such as its thermotolerant nature and unique genetic composition. Genome analysis of A. pasteurianus 386B provided detailed insights into the underlying mechanisms of its metabolic features, niche adaptations, and tolerance towards stress conditions. Combination of these data with previous experimental knowledge enabled an integrated, global overview of the functional characteristics of this strain. This knowledge will enable improved fermentation strategies and selection of appropriate acetic acid bacteria strains as functional starter cultures for cocoa bean fermentation processes.
[Comparative analysis on industrial standardization degree of Chinese and Korean ginseng].
Chu, Qiao; Xi, Xing-Jun; Wang, He-Yan; Si, Ding-Hua; Tang, Fei; Lan, Tao
2017-05-01
Panax ginseng is a well-known medicinal plant throughout the world, with high nutritional and medicinal value. China and South Korea are the major countries for ginseng cultivation, production, and exportation. China's ginseng production accounts for more than half of the world total, but its output value is less than that of Korea, and the standardization process of the ginseng industry plays an important role in this gap. This paper analyzes the Chinese and Korean national ginseng standards and standardization processes in detail, comparing the categories, contents, index selection, ginseng age, and implementation and promotion status of the two countries' standards. The disadvantages hindering the standardization of the ginseng industry are identified, and we offer advice on the revision and implementation of China's ginseng industry standards, with the aim of enhancing the competitiveness of China's ginseng industry. Copyright© by the Chinese Pharmaceutical Association.
An integrated workflow for analysis of ChIP-chip data.
Weigelt, Karin; Moehle, Christoph; Stempfl, Thomas; Weber, Bernhard; Langmann, Thomas
2008-08-01
Although ChIP-chip is a powerful tool for genome-wide discovery of transcription factor target genes, the steps involving raw data analysis, identification of promoters, and correlation with binding sites are still laborious processes. Therefore, we report an integrated workflow for the analysis of promoter tiling arrays with the Genomatix ChipInspector system. We compare this tool with open-source software packages to identify PU.1 regulated genes in mouse macrophages. Our results suggest that ChipInspector data analysis, comparative genomics for binding site prediction, and pathway/network modeling significantly facilitate and enhance whole-genome promoter profiling to reveal in vivo sites of transcription factor-DNA interactions.
Design as Knowledge Construction: Constructing Knowledge of Design
ERIC Educational Resources Information Center
Cennamo, Katherine C.
2004-01-01
In this article, I present a model of instructional design that has evolved from analysis and reflection on the process of designing materials for constructivist learning environments. I observed how we addressed the critical questions for instructional design, comparing the process to traditional instructional design models and to my emerging…
Seeking Common Ground: Deliberative Democracy and Sustainable Communities.
ERIC Educational Resources Information Center
Hyman, Drew; Clinehens, Brad
Public deliberation, sometimes called deliberative democracy, offers alternatives to what are often adversarial governmental debates and hearings. This paper provides a case example of applying the deliberative democracy process to development issues and an analysis of data comparing the effectiveness of the process for creating a consensus for…
Encoding Orientation and the Remembering of Schizophrenic Young Adults
ERIC Educational Resources Information Center
Koh, Soon D.; Peterson, Rolf A.
1978-01-01
This research examines different types of encoding strategies, in addition to semantic and organizational encodings, and their effects on schizophrenics' remembering. Based on Craik and Lockhart (1972), i.e., memory performance is a function of depth of encoding processing, this analysis compares schizophrenics' encoding processing with that of…
USDA-ARS?s Scientific Manuscript database
In this study, analytical results were compared when using different approaches to bulk food sample comminution, consisting of a vertical chopper (Blixer) at room temperature and at dry ice cryogenic conditions, followed by further subsample processing (20 g) using liquid nitrogen cryogenic conditio...
Gender, the Labor Process and Dignity at Work
ERIC Educational Resources Information Center
Crowley, Martha
2013-01-01
This study brings together gender inequality and labor process research to investigate how divergent control structures generate inequality in work experiences for women and men. Content-coded data on 155 work groups are analyzed using Qualitative Comparative Analysis to identify combinations of control techniques encountered by female and male…
Global GNSS processing based on the raw observation approach
NASA Astrophysics Data System (ADS)
Strasser, Sebastian; Zehentner, Norbert; Mayer-Gürr, Torsten
2017-04-01
Many global navigation satellite system (GNSS) applications, e.g. Precise Point Positioning (PPP), require high-quality GNSS products, such as precise GNSS satellite orbits and clocks. These products are routinely determined by analysis centers of the International GNSS Service (IGS). The current processing methods of the analysis centers make use of the ionosphere-free linear combination to reduce the ionospheric influence. Some of the analysis centers also form observation differences, in general double-differences, to eliminate several additional error sources. The raw observation approach is a new GNSS processing approach that was developed at Graz University of Technology for kinematic orbit determination of low Earth orbit (LEO) satellites and subsequently adapted to global GNSS processing in general. This new approach offers some benefits compared to well-established approaches, such as a straightforward incorporation of new observables due to the avoidance of observation differences and linear combinations. This becomes especially important in view of the changing GNSS landscape with two new systems, the European system Galileo and the Chinese system BeiDou, currently in deployment. GNSS products generated at Graz University of Technology using the raw observation approach currently comprise precise GNSS satellite orbits and clocks, station positions and clocks, code and phase biases, and Earth rotation parameters. To evaluate the new approach, products generated using the Global Positioning System (GPS) constellation and observations from the global IGS station network are compared to those of the IGS analysis centers. The comparisons show that the products generated at Graz University of Technology are on a similar level of quality to the products determined by the IGS analysis centers. This confirms that the raw observation approach is applicable to global GNSS processing. Some areas requiring further work have been identified, enabling future improvements of the method.
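For contrast with the raw observation approach, the ionosphere-free linear combination that the established processing methods rely on can be written out directly; the sketch below uses placeholder pseudoranges rather than real observations.

```python
# The conventional ionosphere-free (IF) combination that the raw observation
# approach avoids: first-order ionospheric delay scales with 1/f^2, so a
# frequency-weighted difference of dual-frequency pseudoranges cancels it.
F_L1 = 1575.42e6  # GPS L1 carrier frequency [Hz]
F_L2 = 1227.60e6  # GPS L2 carrier frequency [Hz]

def ionosphere_free(p1: float, p2: float) -> float:
    """Return the IF combination of two pseudoranges [m]."""
    g1, g2 = F_L1 ** 2, F_L2 ** 2
    return (g1 * p1 - g2 * p2) / (g1 - g2)

# Placeholder dual-frequency pseudoranges [m]; real values come from RINEX files.
print(f"IF pseudorange: {ionosphere_free(22_000_003.2, 22_000_005.8):.3f} m")
```

A known cost of this combination, and one motivation for processing raw observations instead, is that it amplifies the observation noise by roughly a factor of three.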
Mazza, Monica; Mariano, Melania; Peretti, Sara; Masedu, Francesco; Pino, Maria Chiara; Valenti, Marco
2017-05-01
Individuals with autism spectrum disorders (ASD) show significant impairments in social skills and theory of mind (ToM). The aim of this study was to evaluate ToM and social information processing abilities in 52 children with ASD compared to 55 typically developing (TD) children. A mediation analysis evaluated whether social information processing abilities are mediated by ToM competences. In our results, children with autism showed a deficit in social skills and ToM components. The mediation analysis demonstrates that ToM plays a key role in the development of social abilities, and that the lack of ToM competences in children with autism impairs their competent social behavior.
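As a sketch of the mediation logic applied here, the indirect effect is the product of the path from group to ToM and the path from ToM to social skills, tested with a bootstrap confidence interval. All variables below are simulated stand-ins, not the study's data:

```python
# Minimal mediation sketch: X (group) -> M (ToM score) -> Y (social skills).
# a*b is the indirect effect; its bootstrap CI tests mediation.
# All data are simulated placeholders.
import numpy as np

rng = np.random.default_rng(1)
n = 107                                      # 52 ASD + 55 TD, as in the study
x = np.r_[np.ones(52), np.zeros(55)]         # group indicator (1 = ASD)
m = -1.0 * x + rng.normal(size=n)            # hypothetical ToM scores
y = 0.8 * m - 0.3 * x + rng.normal(size=n)   # hypothetical social skills

def path(dep, *preds):
    """OLS slope of dep on the first predictor, controlling for the rest."""
    X = np.column_stack([np.ones(len(dep)), *preds])
    beta = np.linalg.lstsq(X, dep, rcond=None)[0]
    return beta[1]

boot = []
for _ in range(2000):
    i = rng.integers(0, n, n)                # resample cases with replacement
    a = path(m[i], x[i])                     # X -> M
    b = path(y[i], m[i], x[i])               # M -> Y, controlling for X
    boot.append(a * b)

lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect a*b, 95% bootstrap CI: [{lo:.3f}, {hi:.3f}]")
```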
Comparative study of resist stabilization techniques for metal etch processing
NASA Astrophysics Data System (ADS)
Becker, Gerry; Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Livesay, William R.
1999-06-01
This study investigates resist stabilization techniques as they are applied to a metal etch application. The techniques that are compared are conventional deep-UV/thermal stabilization, or UV bake, and electron beam stabilization. The electron beam tool used in this study, an ElectronCure system from AlliedSignal Inc., Electron Vision Group, utilizes a flood electron source and a non-thermal process. These stabilization techniques are compared with respect to a metal etch process. In this study, two types of resist are considered for stabilization and etch: a g/i-line resist, Shipley SPR-3012, and an advanced i-line resist, Shipley SPR 955-Cm. For each of these resists, the effects of stabilization on resist features are evaluated by post-stabilization SEM analysis. Etch selectivity in all cases is evaluated by using a timed metal etch and measuring the resist remaining relative to the total metal thickness etched. Etch selectivity is presented as a function of stabilization condition, and analyses of the effects of the type of stabilization on this method of selectivity measurement are also presented. SEM analysis was also performed on the features after a complete etch process and is detailed as a function of stabilization condition. Post-etch cleaning is also an important factor impacted by pre-etch resist stabilization; results of post-etch cleaning are presented for both stabilization methods. SEM inspection is also detailed for the metal features after resist removal processing.
2014-12-01
example of maximizing or minimizing decision variables within a model. Carol Stoker and Stephen Mehay present a comparative analysis of marketing and advertising strategies...strategy development process; documenting various recruiting, marketing, and advertising initiatives in each nation; and examining efforts to
The r-Java 2.0 code: nuclear physics
NASA Astrophysics Data System (ADS)
Kostka, M.; Koning, N.; Shand, Z.; Ouyed, R.; Jaikumar, P.
2014-08-01
Aims: We present r-Java 2.0, a nucleosynthesis code for open use that performs r-process calculations, along with a suite of other analysis tools. Methods: Equipped with a straightforward graphical user interface, r-Java 2.0 is capable of simulating nuclear statistical equilibrium (NSE), calculating r-process abundances for a wide range of input parameters and astrophysical environments, computing the mass fragmentation from neutron-induced fission and studying individual nucleosynthesis processes. Results: In this paper we discuss enhancements to this version of r-Java, especially the ability to solve the full reaction network. The sophisticated fission methodology incorporated in r-Java 2.0, which includes three fission channels (beta-delayed, neutron-induced, and spontaneous fission) along with computation of the mass fragmentation, is compared to the upper limit on mass fission approximation. The effect of including beta-delayed neutron emission on r-process yield is studied. The role of Coulomb interactions in NSE abundances is shown to be significant, supporting previous findings. A comparative analysis was undertaken during the development of r-Java 2.0 whereby we reproduced the results found in the literature from three other r-process codes. This code is capable of simulating the physical environment of the high-entropy wind around a proto-neutron star, the ejecta from a neutron star merger, or the relativistic ejecta from a quark nova. Likewise, the users of r-Java 2.0 are given the freedom to define a custom environment. This software provides a platform for comparing proposed r-process sites.
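The full r-process network solved by r-Java 2.0 couples thousands of species, but the numerical core, stiff integration of coupled abundance equations, can be illustrated with a toy decay chain; the species and rates below are invented:

```python
# Toy abundance network: a three-species decay chain A -> B -> C integrated
# with a stiff (BDF) solver, as a stand-in for a full r-process reaction
# network. Decay rates are invented and deliberately disparate (stiff).
from scipy.integrate import solve_ivp

LAMBDA_A, LAMBDA_B = 50.0, 0.5   # decay rates [1/s]

def rhs(t, y):
    a, b, c = y
    return [-LAMBDA_A * a,
            LAMBDA_A * a - LAMBDA_B * b,
            LAMBDA_B * b]

sol = solve_ivp(rhs, (0.0, 10.0), [1.0, 0.0, 0.0],
                method="BDF", rtol=1e-8, atol=1e-12)
print("final abundances:", sol.y[:, -1])   # conserved: components sum to 1
```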
NASA Astrophysics Data System (ADS)
Pratiwi, V. N.
2018-03-01
Rice is a staple food and a useful carbohydrate source. In general, rice has a high glycaemic index (GI) and low colonic fermentation. People are aware of the alterations in blood glucose levels, or glycaemic index, after consuming rice. Resistant starch (RS) and amylose content play an important role in controlling GI, and GI and RS content have been established as important indicators of starch digestibility. The aim of this study was to determine whether a precooking process consisting of hydrothermal treatment (boiling at 80°C for 10 minutes) followed by low-temperature cooling (4°C, 1 h) can increase the RS content and decrease the glycaemic index of white rice. The research had two stages: 1) preparation of white rice with the precooking process; 2) analysis of the precooked white rice characteristics (resistant starch, amylose content, and estimated glycaemic index). Precooked white rice showed a higher RS content (1.11%) than untreated white rice (0.99%), but the difference was not statistically significant. The amylose content increased significantly after precooking (24.70% versus 20.89% in untreated white rice). The estimated glycaemic index (EGI) decreased after precooking (65.63%) but not significantly compared with untreated white rice (66.47%). From the present study it was concluded that the precooking process had no significant impact on increasing RS or decreasing EGI of white rice. This may be due to the relatively short cooling time (1 hour) at 4°C.
Preliminary Thermal-Mechanical Sizing of Metallic TPS: Process Development and Sensitivity Studies
NASA Technical Reports Server (NTRS)
Poteet, Carl C.; Abu-Khajeel, Hasan; Hsu, Su-Yuen
2002-01-01
The purpose of this research was to perform sensitivity studies and develop a process to perform thermal and structural analysis and sizing of the latest Metallic Thermal Protection System (TPS) developed at NASA LaRC (Langley Research Center). Metallic TPS is a key technology for reducing the cost of reusable launch vehicles (RLV), offering the combination of increased durability and competitive weights when compared to other systems. Accurate sizing of metallic TPS requires combined thermal and structural analysis. Initial sensitivity studies were conducted using transient one-dimensional finite element thermal analysis to determine the influence of various TPS and analysis parameters on TPS weight. The thermal analysis model was then used in combination with static deflection and failure mode analysis of the sandwich panel outer surface of the TPS to obtain minimum weight TPS configurations at three vehicle stations on the windward centerline of a representative RLV. The coupled nature of the analysis requires an iterative analysis process, which will be described herein. Findings from the sensitivity analysis are reported, along with TPS designs at the three RLV vehicle stations considered.
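A minimal stand-in for the transient one-dimensional thermal analysis, an implicit finite-difference sketch rather than the finite element model used in the study, with placeholder material values and boundary temperatures:

```python
# Minimal 1-D transient conduction sketch: backward-Euler time stepping with
# a tridiagonal solve, standing in for the 1-D FE thermal model described.
# Material properties and boundary temperatures are placeholders.
import numpy as np
from scipy.linalg import solve_banded

n, L = 51, 0.05                 # nodes, TPS thickness [m]
alpha = 1e-6                    # thermal diffusivity [m^2/s]
dx, dt = L / (n - 1), 1.0       # grid spacing [m], time step [s]
r = alpha * dt / dx ** 2

T = np.full(n, 300.0)           # initial temperature [K]
T_hot = 1200.0                  # applied outer-surface temperature [K]

# Banded (tridiagonal) matrix for (I - r * Laplacian), Dirichlet BCs
ab = np.zeros((3, n))
ab[0, 1:] = -r                  # superdiagonal
ab[1, :] = 1.0 + 2.0 * r        # main diagonal
ab[2, :-1] = -r                 # subdiagonal
ab[1, 0] = ab[1, -1] = 1.0      # boundary rows reduce to identity
ab[0, 1] = ab[2, -2] = 0.0

for step in range(600):         # march 600 s of heating
    rhs = T.copy()
    rhs[0], rhs[-1] = T_hot, 300.0   # heated outer surface, cool inner wall
    T = solve_banded((1, 1), ab, rhs)

print(f"mid-plane temperature after 600 s: {T[n // 2]:.1f} K")
```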
Emotional words facilitate lexical but not early visual processing.
Trauer, Sophie M; Kotz, Sonja A; Müller, Matthias M
2015-12-12
Emotional scenes and faces have been shown to capture and bind visual resources at early sensory processing stages, i.e. in early visual cortex. However, studies of emotional words have yielded mixed results. In the current study, ERPs were assessed simultaneously with steady-state visual evoked potentials (SSVEPs) to measure attention effects on early visual activity during emotional word processing. Neutral and negative words were flickered at 12.14 Hz whilst participants performed a Lexical Decision Task. Neither emotional word content nor word lexicality modulated the 12.14 Hz SSVEP amplitude. However, emotional words affected the ERP: negative compared to neutral words, as well as words compared to pseudowords, led to enhanced deflections in the P2 time range, indicative of lexico-semantic access. The N400 was reduced for negative compared to neutral words and enhanced for pseudowords compared to words, indicating facilitated semantic processing of emotional words. LPC amplitudes reflected word lexicality and thus the task-relevant response. In line with previous ERP and imaging evidence, the present results indicate that written emotional words are facilitated in processing only subsequent to visual analysis.
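In such frequency-tagging designs, the SSVEP measure is the spectral amplitude at the flicker frequency. A sketch on simulated data; the sampling rate and signal parameters are assumptions:

```python
# Sketch: estimating SSVEP amplitude at the 12.14 Hz flicker frequency from
# a single EEG channel via FFT. The signal here is simulated.
import numpy as np

FS = 500.0                      # sampling rate [Hz], assumed
F_TAG = 12.14                   # flicker (tagging) frequency [Hz]
t = np.arange(0, 4.0, 1 / FS)   # one 4-s epoch

rng = np.random.default_rng(2)
eeg = 1.5 * np.sin(2 * np.pi * F_TAG * t) + rng.normal(0, 2, t.size)

spectrum = np.fft.rfft(eeg * np.hanning(t.size))
freqs = np.fft.rfftfreq(t.size, 1 / FS)
amp = 2 * np.abs(spectrum) / t.size   # window gain not corrected (sketch)

bin_idx = np.argmin(np.abs(freqs - F_TAG))   # nearest FFT bin to 12.14 Hz
print(f"amplitude at {freqs[bin_idx]:.2f} Hz: {amp[bin_idx]:.2f} (a.u.)")
```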
Rešková, Z; Koreňová, J; Kuchta, T
2014-04-01
A total of 256 isolates of Staphylococcus aureus were isolated from 98 samples (34 swabs and 64 food samples) obtained from small or medium meat- and cheese-processing plants in Slovakia. The strains were genotypically characterized by multiple locus variable number of tandem repeats analysis (MLVA), involving multiplex polymerase chain reaction (PCR) with subsequent separation of the amplified DNA fragments by an automated flow-through gel electrophoresis. With the panel of isolates, MLVA produced 31 profile types, which was a sufficient discrimination to facilitate the description of spatial and temporal aspects of contamination. Further data on MLVA discrimination were obtained by typing a subpanel of strains by multiple locus sequence typing (MLST). MLVA coupled to automated electrophoresis proved to be an effective, comparatively fast and inexpensive method for tracing S. aureus contamination of food-processing factories. Subspecies genotyping of microbial contaminants in food-processing factories may facilitate identification of spatial and temporal aspects of the contamination. This may help to properly manage the process hygiene. With S. aureus, multiple locus variable number of tandem repeats analysis (MLVA) proved to be an effective method for the purpose, being sufficiently discriminative, yet comparatively fast and inexpensive. The application of automated flow-through gel electrophoresis to separation of DNA fragments produced by multiplex PCR helped to improve the accuracy and speed of the method. © 2013 The Society for Applied Microbiology.
Comparative Analysis of Haar and Daubechies Wavelet for Hyper Spectral Image Classification
NASA Astrophysics Data System (ADS)
Sharif, I.; Khare, S.
2014-11-01
With the number of channels in the hundreds instead of in the tens, hyperspectral imagery possesses much richer spectral information than multispectral imagery. The increased dimensionality of such hyperspectral data challenges current techniques for analyzing the data: conventional classification methods may not be useful without dimension-reduction pre-processing, so dimension reduction has become a significant part of hyperspectral image processing. This paper presents a comparative analysis of the efficacy of Haar and Daubechies wavelets for dimensionality reduction in achieving image classification. Spectral data reduction using wavelet decomposition is useful because it preserves the distinctions among spectral signatures. Daubechies wavelets optimally capture polynomial trends, while the Haar wavelet is discontinuous and resembles a step function. The performance of these wavelets is compared in terms of classification accuracy and time complexity. This paper shows that wavelet reduction yields more separable classes and better or comparable classification accuracy. In the context of the dimensionality-reduction algorithm, the classification performance of Daubechies wavelets is better than that of the Haar wavelet, while Daubechies takes more time compared to Haar. The experimental results demonstrate that the classification system consistently provides over 84% classification accuracy.
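The reduction step compared in this paper can be sketched with the PyWavelets package: each pixel's spectral signature is decomposed with the Haar or a Daubechies wavelet, and the approximation coefficients are kept as the reduced feature vector. The cube below is random placeholder data:

```python
# Sketch of wavelet-based spectral dimensionality reduction with PyWavelets:
# keep the approximation coefficients of each pixel's spectrum as features.
# The "hyperspectral cube" here is random placeholder data.
import numpy as np
import pywt

rng = np.random.default_rng(3)
n_pixels, n_bands = 1000, 224          # band count chosen for illustration
cube = rng.normal(size=(n_pixels, n_bands))

def reduce_spectra(spectra, wavelet="db4", level=3):
    """Return per-pixel approximation coefficients as reduced features."""
    return np.array([pywt.wavedec(s, wavelet, level=level)[0]
                     for s in spectra])

feat_haar = reduce_spectra(cube, wavelet="haar")
feat_db4 = reduce_spectra(cube, wavelet="db4")
print("haar:", feat_haar.shape, " db4:", feat_db4.shape)
# The reduced features would then feed a classifier (e.g. SVM or k-NN).
```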
Composing across Modes: A Comparative Analysis of Adolescents' Multimodal Composing Processes
ERIC Educational Resources Information Center
Smith, Blaine E.
2017-01-01
Although the shift from page to screen has dramatically redefined conceptions of writing, very little is known about how youth compose with multiple modes in digital environments. Integrating multimodality and multiliteracies theoretical frameworks, this comparative case study examined how urban twelfth-grade students collaboratively composed…
Al-Kuwaiti, Ahmed; Homa, Karen; Maruthamuthu, Thennarasu
2016-01-01
A performance improvement model was developed that focuses on the analysis and interpretation of performance indicator (PI) data using statistical process control and benchmarking. PIs are suitable for comparison with benchmarks only if the data fall within statistically accepted limits, that is, show only random variation. Specifically, if there is no significant special-cause variation over a period of time, then the data are ready to be benchmarked. The proposed Define, Measure, Control, Internal Threshold, and Benchmark model is adapted from the Define, Measure, Analyze, Improve, Control (DMAIC) model. The model consists of the following five steps: Step 1. Define the process; Step 2. Monitor and measure the variation over the period of time; Step 3. Check the variation of the process; if stable (no significant variation), go to Step 4; otherwise, control variation with the help of an action plan; Step 4. Develop an internal threshold and compare the process with it; Step 5.1. Compare the process with an internal benchmark; and Step 5.2. Compare the process with an external benchmark. The steps are illustrated through the use of health care-associated infection (HAI) data collected for 2013 and 2014 from the Infection Control Unit, King Fahd Hospital, University of Dammam, Saudi Arabia. Monitoring variation is an important strategy in understanding and learning about a process. In the example, HAI was monitored for variation in 2013, and the need for a more predictable process prompted the control of variation through an action plan. The action plan was successful, as noted by the shift in the 2014 data compared to the historical average; in addition, the variation was reduced. The model is subject to limitations: for example, it cannot be used without benchmarks, which need to be calculated the same way with similar patient populations, and it focuses only on the "Analyze" part of the DMAIC model.
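Step 3 of the model, confirming that a PI shows only common-cause variation before it is benchmarked, corresponds to a standard individuals control chart. A minimal sketch with invented monthly HAI rates:

```python
# Sketch of Step 3: an individuals (I) control chart to test whether a PI
# shows only common-cause variation before benchmarking.
# The monthly HAI rates below are invented.
import numpy as np

rates = np.array([3.1, 2.8, 3.5, 2.9, 3.3, 5.4, 3.0, 2.7, 3.2, 3.4, 2.9, 3.1])

center = rates.mean()
mr_bar = np.abs(np.diff(rates)).mean()   # average moving range
sigma = mr_bar / 1.128                   # d2 constant for subgroups of 2
ucl, lcl = center + 3 * sigma, center - 3 * sigma

special = (rates > ucl) | (rates < lcl)
if special.any():
    print("special-cause variation at months:", np.where(special)[0] + 1)
    print("-> control variation first (action plan), then re-check")
else:
    print(f"stable process (all points in [{lcl:.2f}, {ucl:.2f}]); "
          "ready for internal threshold and benchmarking")
```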
ERIC Educational Resources Information Center
Morimoto, Chie; Hida, Eisuke; Shima, Keisuke; Okamura, Hitoshi
2018-01-01
To identify a specific sensorimotor impairment feature of autism spectrum disorder (ASD), we focused on temporal processing with millisecond accuracy. A synchronized finger-tapping task was used to characterize temporal processing in individuals with ASD as compared to typically developing (TD) individuals. We found that individuals with ASD…
Finite Volume Element (FVE) discretization and multilevel solution of the axisymmetric heat equation
NASA Astrophysics Data System (ADS)
Litaker, Eric T.
1994-12-01
The axisymmetric heat equation, resulting from a point source of heat applied to a metal block, is solved numerically; both iterative and multilevel solutions are computed in order to compare the two processes. The continuum problem is discretized in two stages: finite differences are used to discretize the time derivatives, resulting in a fully implicit backward time-stepping scheme, and the Finite Volume Element (FVE) method is used to discretize the spatial derivatives. The application of the FVE method to a problem in cylindrical coordinates is new and results in stencils which are analyzed extensively. Several iteration schemes are considered, including both Jacobi and Gauss-Seidel; a thorough analysis of these schemes is done, using both the spectral radii of the iteration matrices and local mode analysis. Using this discretization, a Gauss-Seidel relaxation scheme is used to solve the heat equation iteratively. A multilevel solution process is then constructed, including the development of intergrid transfer and coarse grid operators. Local mode analysis is performed on the components of the amplification matrix, resulting in the two-level convergence factors for various combinations of the operators. A multilevel solution process is implemented by using multigrid V-cycles; the iterative and multilevel results are compared and discussed in detail.
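The relaxation at the heart of both solution processes can be shown compactly. The sketch below applies Gauss-Seidel sweeps to a 1-D model problem with the standard three-point stencil, a simplification of the axisymmetric FVE stencils analyzed in the thesis:

```python
# Gauss-Seidel relaxation on a 1-D model problem A*u = f with the standard
# [-1, 2, -1] stencil: the smoother used inside a multigrid V-cycle.
# Grid size and right-hand side are illustrative.
import numpy as np

n = 65
h = 1.0 / (n - 1)
f = np.full(n, 1.0) * h ** 2        # source term, pre-scaled by h^2
u = np.zeros(n)                     # zero Dirichlet boundaries at u[0], u[-1]

def gauss_seidel_sweep(u, f):
    """One in-place lexicographic Gauss-Seidel sweep."""
    for i in range(1, len(u) - 1):
        u[i] = 0.5 * (u[i - 1] + u[i + 1] + f[i])

for sweep in range(200):
    gauss_seidel_sweep(u, f)

residual = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:])
print(f"residual norm after 200 sweeps: {np.linalg.norm(residual):.2e}")
# A multigrid V-cycle would restrict this residual to a coarser grid, relax
# there, and interpolate the correction back, accelerating convergence of
# exactly the smooth error components that Gauss-Seidel damps slowly.
```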
Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng
2017-12-01
Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.
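The dilution-series evaluation described here amounts to checking proportionality (a zero-intercept fit of counts against dilution fraction) and precision (replicate variability per level). A sketch with simulated counts:

```python
# Sketch of a dilution-series evaluation: proportionality is assessed by a
# zero-intercept fit of observed counts vs. target dilution fraction, and
# precision by the per-level coefficient of variation (CV).
# All counts are simulated placeholders.
import numpy as np

rng = np.random.default_rng(4)
fractions = np.repeat([1.0, 0.8, 0.6, 0.4, 0.2], 4)   # replicate samples
true_conc = 1e6                                       # cells/mL, invented
counts = rng.poisson(true_conc * fractions) * rng.normal(1, 0.03, fractions.size)

# Proportional model: counts = slope * fraction (no intercept)
slope = (fractions @ counts) / (fractions @ fractions)
resid = counts - slope * fractions
r2 = 1 - resid.var() / counts.var()
print(f"slope = {slope:.3e} cells/mL, R^2 (through origin) = {r2:.4f}")

for frac in np.unique(fractions):
    level = counts[fractions == frac]
    cv = 100 * level.std(ddof=1) / level.mean()
    print(f"fraction {frac:.1f}: CV = {cv:.1f}%")
```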
2013-01-01
Background Aconitum is an indispensable entity of traditional medicine therapy in Ayurveda and Traditional Chinese Medicine (TCM), in spite of its known fatal toxicity. The prolonged use of this drug, irrespective of its known lethal effects, is made possible by the practice of effective detoxification processes that have been used for decades. However, the processing methods of Ayurveda and TCM are different, and no comparative study has been carried out to evaluate their differences. The objective of the present study was to carry out comparative chemical profiling of the roots of Aconitum heterophyllum Wall, A. carmichaelii Debx., and A. kusnezoffii Reichb. after application of two detoxification methods used in Ayurveda and one method used in TCM. Results Analysis of the processed samples was carried out by ultra-high performance liquid chromatography combined with quadrupole time-of-flight mass spectrometry (UHPLC-QTOF/MS). The results demonstrate that all three processing methods used in Ayurveda and TCM effectively extract the diester diterpenoid alkaloids and convert them into monoester diterpenoid alkaloids. The efficiency of the processes in reducing toxic alkaloid contents can be stated as: processing with water > Shodhana with cow milk > Shodhana with cow urine. The analysis method was validated as per ICH-Q2R1 guidelines, and all parameters were found to comply with the recommendations stated in the guidelines. Conclusions To date, there have been no reports comparing the processing methods used in Ayurveda with those used in TCM for detoxification of aconite roots. Our study demonstrates that the methods used in both traditional systems of medicine efficiently detoxify aconite roots. Amongst the three selected procedures, the TCM method of decoction with water is the most efficient. Through experimental evidence, we demonstrate the conversion of toxic diester diterpenoid alkaloids to relatively safer monoester diterpenoid alkaloids. Thus, this study shows that comparative study of the traditional experience accumulated in different medical systems is useful for expanding their respective applications. PMID:24156713
Independent Orbiter Assessment (IOA): Assessment of the data processing system FMEA/CIL
NASA Technical Reports Server (NTRS)
Lowery, H. J.; Haufler, W. A.
1986-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Data Processing System (DPS) hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baseline with proposed Post 51-L updates included. A resolution of each discrepancy from the comparison is provided through additional analysis as required. The results of that comparison are documented for the Orbiter DPS hardware.
This paper proposes a robustness analysis based on Multiple Criteria Decision Aiding (MCDA). The ensuing model was used to assess the implementation of green chemistry principles in the synthesis of silver nanoparticles. Its recommendations were also compared to an earlier develo...
Opportunities for Applied Behavior Analysis in the Total Quality Movement.
ERIC Educational Resources Information Center
Redmon, William K.
1992-01-01
This paper identifies critical components of recent organizational quality improvement programs and specifies how applied behavior analysis can contribute to quality technology. Statistical Process Control and Total Quality Management approaches are compared, and behavior analysts are urged to build their research base and market behavior change…
"We Are Textbook 'Badnekais'!": A Bernsteinian Analysis of Textbook Culture in Science Classrooms
ERIC Educational Resources Information Center
Vijaysimha, Indira
2013-01-01
This article is an empirical study of science teaching practices using a Bernsteinian framework. It provides a comparative analysis through ethnographic examination of pedagogic recontextualisation in different school types--government, private unaided and international. Bernstein drew attention to the process of pedagogic recontextualisation and…
Reading comprehension skills of young adults with childhood diagnoses of dyslexia.
Ransby, Marilyn J; Swanson, H Lee
2003-01-01
This study explores the contribution of cognitive processes to comprehension skills in adults who suffered from childhood developmental dyslexia (CD). The performance of adults with CD (ages 17 to 23), chronological age-matched (CA) adults, and reading level-matched (RL) children was compared on measures of phonological processing, naming speed, working memory (WM), general knowledge, vocabulary, and comprehension. The results showed that adults with CD scored lower on measures of phonological processing, naming speed, WM, general knowledge, and vocabulary when compared to CA readers but were comparable to RL children on the majority of process measures. Phonological processing, naming speed, vocabulary, general knowledge, and listening comprehension contributed independent variance to reading comprehension accuracy, whereas WM, intelligence, phonological processing, and listening comprehension contributed independent variance to comprehension fluency. Adults with CD scored lower than CA adults and higher than RL children on measures of lexical processing, WM, and listening comprehension when word recognition and intelligence were partialed from the analysis. In summary, constraints in phonological processing and naming speed mediate only some of the influence of high-order processes on reading comprehension. Furthermore, adults with CD experience difficulties in WM, listening comprehension, and vocabulary independently of their word recognition problems and intellectual ability.
Rabellino, D; Tursich, M; Frewen, P A; Daniels, J K; Densmore, M; Théberge, J; Lanius, R A
2015-11-01
To investigate the functional connectivity of large-scale intrinsic connectivity networks (ICNs) in post-traumatic stress disorder (PTSD) during subliminal and supraliminal presentation of threat-related stimuli. Group independent component analysis was utilized to study functional connectivity within the ICNs most correlated with the Default-mode Network (DMN), Salience Network (SN), and Central Executive Network (CEN) in PTSD participants (n = 26) as compared to healthy controls (n = 20) during sub- and supraliminal processing of threat-related stimuli. In patients with PTSD compared with healthy participants, prefrontal and anterior cingulate cortices involved in top-down regulation showed increased integration during subliminal threat processing within the CEN and SN, and during supraliminal threat processing within the DMN. The right amygdala showed increased connectivity with the DMN during subliminal processing in PTSD as compared to controls. Brain regions associated with self-awareness and consciousness exhibited decreased connectivity during subliminal threat processing in PTSD as compared to controls: the claustrum within the SN and the precuneus within the DMN. Key nodes of the ICNs showed altered functional connectivity in PTSD as compared to controls, and differential results characterized sub- and supraliminal processing of threat-related stimuli. These findings enhance our understanding of the ICNs underlying PTSD at different levels of conscious threat perception. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Pramono, H.; Pujiastuti, D. Y.; Sahidu, A. M.
2018-04-01
The effects of acid and alkali processes on the biochemical and physicochemical characteristics of fish protein isolate from red snapper (Lutjanus sp) by-product were evaluated. Protein recovery by the alkali process (16.79%) was higher than by the acid process (13.75%). The reduction of lipid content and total volatile basic nitrogen (TVB-N) in both treatments indicated that both processes improved the fish protein isolate recovered from red snapper by-product. In addition, increases in water holding capacity and oil binding capacity were observed. However, a high peroxide value of the fish protein isolate was found in both treatments. These findings indicate that acid and alkali processes can be used to recover proteins from red snapper by-product, with the alkali process giving a protein isolate of better overall quality than the acid process.
Schneiderhan, Wilhelm; Grundt, Alexander; Wörner, Stefan; Findeisen, Peter; Neumaier, Michael
2013-11-01
Because sepsis has a high mortality rate, rapid microbiological diagnosis is required to enable efficient therapy. The effectiveness of MALDI-TOF mass spectrometry (MALDI-TOF MS) analysis in reducing turnaround times (TATs) for blood culture (BC) pathogen identification when available in a 24-h hospital setting has not been determined. On the basis of data from a total of 912 positive BCs collected over 140 consecutive days and workflow analyses of laboratory diagnostics, we evaluated different models to assess the TATs for batch-wise and for immediate-response (real-time) MALDI-TOF MS pathogen identification of positive BC results during the night shifts. The results were compared to TATs from routine BC processing and biochemical identification performed during regular working hours. Continuous BC incubation together with batch-wise MALDI-TOF MS analysis enabled significant reductions of up to 58.7 h in the mean TATs for reporting the bacterial species. The TAT of batch-wise MALDI-TOF MS analysis was inferior by a mean of 4.9 h compared to the model of immediate workflow under ideal conditions with no constraints on staff availability. Together with continuous cultivation of BCs, the 24-h availability of MALDI-TOF MS can reduce the TAT for microbial pathogen identification within a routine clinical laboratory setting. Batch-wise testing of positive BCs loses a few hours compared to real-time identification but is still far superior to classical BC processing. Larger prospective studies are required to evaluate the contribution of rapid around-the-clock pathogen identification to medical decision-making for septicemic patients.
Wildenberg, Manon E; Duijvestein, Marjolijn; Westera, Liset; van Viegen, Tanja; Buskens, Christianne J; van der Bilt, Jarmila D W; Stitt, Larry; Jairath, Vipul; Feagan, Brian G; Vande Casteele, Niels
2018-06-01
Flow cytometric (FC) analysis of intestinal tissue biopsies requires prompt cell isolation and processing to prevent cell death and generate valid data. We examined the effect of storage conditions prior to cell isolation and FC on viable cell yield and the proportions of immune cell phenotypes from intestinal biopsies. Biopsies (N = 224) from inflamed or non-inflamed ileal and/or colonic tissue from three patients with Crohn's disease were processed and analyzed immediately in duplicate, or stored under different conditions. Cells were isolated and stained for specific markers, followed by FC. Decreased mean live CD45+ cell counts were observed after storage of biopsies at -80 °C in dimethyl sulfoxide (DMSO)/citrate buffer compared with immediate processing (1794.3 vs. 19,672.7; p = 0.006). A non-significant decrease in CD45+ live cell count occurred after storage at -20 °C in DMSO/citrate buffer, and cell yield was adequate for subsequent analysis. CD3+ cell proportions were significantly lower after storage at 4 °C in complete medium for 48 h compared with immediate analysis. Mean CD14+ cell proportions were significantly higher after storage of biopsies at -80 °C in DMSO/citrate buffer compared with immediate analysis (2.61% vs. 1.31%, p = 0.007). CD4+, CD8+ and CD4+/CD8+ cell proportions were unaffected by storage condition. Storage of intestinal tissue biopsies at -20 °C in DMSO/citrate buffer for up to 48 h resulted in sufficient viable cell yield for FC analysis without affecting subsequent marker-positive cell proportions. These findings support the potential shipping and storage of intestinal biopsies for centralized FC analysis in multicenter clinical trials. Copyright © 2018 Elsevier B.V. All rights reserved.
Impact assessment of GPS radio occultation data on Antarctic analysis and forecast using WRF 3DVAR
NASA Astrophysics Data System (ADS)
Zhang, H.; Wee, T. K.; Liu, Z.; Lin, H. C.; Kuo, Y. H.
2016-12-01
This study assesses the impact of Global Positioning System (GPS) Radio Occultation (RO) refractivity data on analysis and forecast quality in the Antarctic region. The RO data are continuously assimilated into the Weather Research and Forecasting (WRF) Model using WRF 3DVAR, along with the other observations that were operationally available to the National Centers for Environmental Prediction (NCEP) during a month-long period, October 2010, including Advanced Microwave Sounding Unit (AMSU) radiance data. For the month-long data assimilation experiments, three RO datasets are used: 1) the actual operational dataset, produced by the near real-time RO processing at that time and provided to weather forecasting centers; 2) a post-processed dataset with posterior clock and orbit estimates and improved RO processing algorithms; and 3) another post-processed dataset, produced with variational RO processing. The data impact is evaluated by comparing the forecasts and analyses to independent driftsonde observations made available through the Concordiasi field campaign, in addition to other traditional means of verification. A denial of RO data (while keeping all other observations) resulted in a remarkable degradation of analysis and forecast quality, indicating the high value of RO data over the Antarctic area. The post-processed RO data showed a significantly larger positive impact compared to the near real-time data, due to extra RO data from the TerraSAR-X satellite (unavailable at the time of the near real-time processing) as well as improved data quality resulting from the post-processing. This strongly suggests that the future polar constellation of COSMIC-2 is vital. The variational RO processing further reduced the systematic and random errors in both analyses and forecasts, for instance leading to a smaller background departure of AMSU radiance. This indicates that the variational RO processing provides an improved reference for the bias correction of satellite radiance, making the bias correction more effective. This study finds that advanced RO data processing algorithms may further enhance the high quality of RO data at high southern latitudes.
NASA Astrophysics Data System (ADS)
Sethuramalingam, Prabhu; Vinayagam, Babu Kupusamy
2016-07-01
A carbon nanotube mixed grinding wheel was used in the grinding process to analyze the surface characteristics of AISI D2 tool steel. Until now, no work had been carried out using a carbon nanotube based grinding wheel. The carbon nanotube based grinding wheel has excellent thermal conductivity and good mechanical properties, which help improve the surface finish of the workpiece. In the present study, multi-response optimization of the grinding process with single-wall carbon nanotube (CNT) mixed cutting fluids, with surface roughness and metal removal rate as the responses, is undertaken using an orthogonal array with grey relational analysis. Experiments are performed under the grinding conditions designated by the L9 orthogonal array. Based on the results of the grey relational analysis, a set of optimum grinding parameters is obtained. Using the analysis of variance approach, the significant machining parameters are found. An empirical model for the prediction of the output parameters has been developed using regression analysis, and the results are compared empirically for conditions with and without the CNT grinding wheel in the grinding process.
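The grey relational step that folds the two responses into a single rankable grade is compact enough to sketch; the L9 response values below are invented placeholders:

```python
# Sketch of grey relational analysis for two responses: surface roughness
# (smaller-the-better) and metal removal rate (larger-the-better).
# The L9 experimental values are invented placeholders.
import numpy as np

ra = np.array([0.42, 0.38, 0.55, 0.47, 0.33, 0.50, 0.40, 0.36, 0.45])   # um
mrr = np.array([12.0, 15.5, 10.2, 13.8, 16.1, 11.5, 14.2, 15.0, 12.8])  # mm^3/s

def normalize(x, larger_is_better):
    """Min-max normalization toward an ideal value of 1."""
    if larger_is_better:
        return (x - x.min()) / (x.max() - x.min())
    return (x.max() - x) / (x.max() - x.min())

def grey_coeff(z, zeta=0.5):
    """Grey relational coefficient against the ideal sequence (all ones)."""
    delta = 1.0 - z
    return (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

coeffs = np.column_stack([grey_coeff(normalize(ra, False)),
                          grey_coeff(normalize(mrr, True))])
grade = coeffs.mean(axis=1)          # equal weights on both responses
print("grey relational grades:", np.round(grade, 3))
print("best run (1-indexed):", grade.argmax() + 1)
```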
EPE analysis of sub-N10 BEoL flow with and without fully self-aligned via using Coventor SEMulator3D
NASA Astrophysics Data System (ADS)
Franke, Joern-Holger; Gallagher, Matt; Murdoch, Gayle; Halder, Sandip; Juncker, Aurelie; Clark, William
2017-03-01
During the last few decades, the semiconductor industry has been able to scale device performance up while driving costs down. What started off as simple geometrical scaling, driven mostly by advances in lithography, has recently been accompanied by advances in processing techniques and in device architectures. The trend to combine efforts using process technology and lithography is expected to intensify as further scaling becomes ever more difficult. One promising component of future nodes is "scaling boosters", i.e. processing techniques that enable further scaling. An indispensable component in developing these ever more complex processing techniques is semiconductor process modeling software. Visualization of complex 3D structures in SEMulator3D, along with budget analysis on film thicknesses, CD, and etch budgets, allows process integrators to compare flows before any physical wafers are run. Hundreds of "virtual" wafers allow comparison of different processing approaches, along with EUV or DUV patterning options for defined layers and different overlay schemes. This "virtual fabrication" technology produces massively parallel process variation studies that would be highly time-consuming or expensive in experiment. Here, we focus on one particular scaling booster, the fully self-aligned via (FSAV). We compare metal-via-metal (me-via-me) chains with self-aligned and fully self-aligned vias using a calibrated model for imec's N7 BEoL flow. To model overall variability, 3D Monte Carlo modeling of as many variability sources as possible is critical. We use Coventor SEMulator3D to extract minimum me-me distances and contact areas and show that fully self-aligned vias allow better me-via distance control and tighter via-me contact area variability compared with the standard self-aligned via (SAV) approach.
Changes in metal properties after thermal and electric impulse processing
NASA Astrophysics Data System (ADS)
Shaburova, N. A.
2015-04-01
The results of experiments on processing metal melts with powerful electromagnetic impulses are given. The generator used in the experiments has the following characteristics: pulse height 10 kV, duration 1 ns, leading edge 0.1 ns, repetition rate 1 kHz, output power 100 kW. The duration of the processing is 10-15 min. Comparative analysis of the processed and unprocessed samples reveals changes of structure and increases in the density, hardness, plasticity, and resilience of the cast metal. Analysis of the results of different external physical processing methods on alloys shows full agreement with the results of ultrasonic processing of metals. The hypothesis of ultrasonic shock wave formation at the pulse front was adopted as the main mechanism of the electromagnetic impulse impact on alloys. The theoretical part of the research describes the transformation of electromagnetic impulses into acoustic ones.
Gene Ontology-Based Analysis of Zebrafish Omics Data Using the Web Tool Comparative Gene Ontology.
Ebrahimie, Esmaeil; Fruzangohar, Mario; Moussavi Nik, Seyyed Hani; Newman, Morgan
2017-10-01
Gene Ontology (GO) analysis is a powerful tool in systems biology, which uses a defined nomenclature to annotate genes/proteins within three categories: "Molecular Function," "Biological Process," and "Cellular Component." GO analysis can assist in revealing functional mechanisms underlying observed patterns in transcriptomic, genomic, and proteomic data. The already extensive and increasing use of zebrafish for modeling genetic and other diseases highlights the need to develop a GO analytical tool for this organism. The web tool Comparative GO was originally developed for GO analysis of bacterial data in 2013 (www.comparativego.com). We have now upgraded and elaborated this web tool for analysis of zebrafish genetic data using GOs and annotations from the Gene Ontology Consortium.
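The statistical core of most GO analyses of this kind is a per-term enrichment test. The sketch below shows a generic hypergeometric version with invented counts; it is an illustration, not the Comparative GO implementation:

```python
# Generic GO-term enrichment sketch: hypergeometric test of whether a GO
# term is over-represented in a gene list relative to the background.
# All counts are invented; this is not the Comparative GO implementation.
from scipy.stats import hypergeom

N = 25000   # background: annotated zebrafish genes (illustrative)
K = 180     # background genes annotated with the GO term
n = 400     # genes in the differentially expressed list
k = 12      # list genes carrying the GO term

# P(X >= k) under sampling without replacement
p_value = hypergeom.sf(k - 1, N, K, n)
print(f"enrichment p-value: {p_value:.3e}")
# In practice this is repeated per GO term and corrected for multiple
# testing (e.g. Benjamini-Hochberg across all tested terms).
```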
A Comparative Analysis of the Budget Process in the Venezuelan and U.S. Navies.
1979-12-01
accounting problems in particular agencies. The DINCA developed a document, Sistema de Contabilidad de la Ejecucion Financiera del Presupuesto para...orientation of the "Plan Operativo Anual" (POA) - Operative Annual Plan. During the Navy's mid-range planning process the POA is produced. By means of
Children's Well-Being during Parents' Marital Disruption Process: A Pooled Time-Series Analysis.
ERIC Educational Resources Information Center
Sun, Yongmin; Li, Yuanzhang
2002-01-01
Examines the extent to which parents' marital disruption process affects children's academic performance and well-being both before and after parental divorce. Compared with peers in intact families, children of divorce fared less well. Discusses how family resources mediate detrimental effects over time. Similar results are noted for girls and…
Factors Affecting Christian Parents' School Choice Decision Processes: A Grounded Theory Study
ERIC Educational Resources Information Center
Prichard, Tami G.; Swezey, James A.
2016-01-01
This study identifies factors affecting the decision processes for school choice by Christian parents. Grounded theory design incorporated interview transcripts, field notes, and a reflective journal to analyze themes. Comparative analysis, including open, axial, and selective coding, was used to reduce the coded statements to five code families:…
Comparing Latent Dirichlet Allocation and Latent Semantic Analysis as Classifiers
ERIC Educational Resources Information Center
Anaya, Leticia H.
2011-01-01
In the Information Age, a proliferation of unstructured text electronic documents exists. Processing these documents by humans is a daunting task as humans have limited cognitive abilities for processing large volumes of documents that can often be extremely lengthy. To address this problem, text data computer algorithms are being developed.…
Meyer-Delpho, C; Schubert, H-J
2015-09-01
The added value of information and communications technologies (ICT) should be demonstrated precisely in those areas of care in which the importance of intersectoral and interdisciplinary cooperation is particularly high. In the context of the accompanying research on a care concept for palliative patients, the potential of a digital documentation process was comparatively analysed against the conventional paper-based workflow. Data were collected using a multi-methodological approach and processed for the project in 3 stages: (1) development and analysis of a palliative care process with a focus on all relevant documentation steps; (2) questionnaire design and the comparative mapping of specific process times; (3) sampling, selection, and analysis of patient records and the insights into process iterations derivable from them. With the use of ICT, the treatment time per patient is reduced by up to 53%, achieving a reduction in costs and workload of up to 901 min. An increase of up to 213% in the number of patient contacts allows higher continuity of care. Although the 16% increase in documentation adherence improves the usability of information documented across teams, it partially increases the workload of individual actors. By using a digital health record, around 31% more patients could be treated with the same staffing ratio. The multi-stage analysis of the palliative care process showed that ICT has a decisive influence on the process dimension of intersectoral cooperation. Given favourable organisational conditions, the pioneering work of palliative care also provides important guidance for the successful use of ICT in the context of innovative forms of care. © Georg Thieme Verlag KG Stuttgart · New York.
Synthesis and electrochemical properties of NiO nanospindles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Hai; University of Chinese Academy of Sciences, Beijing 100049; Lv, Baoliang, E-mail: lbl604@sxicc.ac.cn
2014-02-01
Graphical abstract: NiO nanospindles with a different electrochemical activity compared to previous reports were synthesized via an agglomeration–dissolution–recrystallization growth process without the addition of any surfactant. Highlights: • NiO nanospindles were synthesized without the addition of any surfactant. • The agglomeration–dissolution–recrystallization growth process was used to explain the formation process of the spindle-like NiO precursors. • The as-obtained spindle-like NiO showed a different electrochemical activity compared to previous reports. Abstract: NiO nanospindles were successfully synthesized via a hydrothermal and post-treatment method. The as-synthesized nanospindles were several hundred nanometers in width and about one micrometer in length. X-ray diffraction (XRD) analysis revealed that the spindle-like structure was cubic NiO phase crystalline. Scanning electron microscopy (SEM) and high-resolution transmission electron microscopy (HRTEM) analysis indicated that these NiO nanospindles were of single-crystal nature. On the basis of time-dependent experiments, a possible agglomeration–dissolution–recrystallization growth process was proposed to explain the formation process of the spindle-like precursors. Cyclic voltammetry (CV) measurement showed that the as-prepared spindle-like NiO exhibited a pseudo-capacitance behavior.
Fu, Guang; Zhang, David Z; He, Allen N; Mao, Zhongfa; Zhang, Kaifei
2018-05-10
A deep understanding of the laser-material interaction mechanism, characterized by laser absorption, is very important in simulating the laser metal powder bed fusion (PBF) process. This is because the laser absorption of material affects the temperature distribution, which influences the thermal stress development and the final quality of parts. In this paper, a three-dimensional finite element analysis model of heat transfer taking into account the effect of material state and phase changes on laser absorption is presented to gain insight into the absorption mechanism, and the evolution of instantaneous absorptance in the laser metal PBF process. The results showed that the instantaneous absorptance was significantly affected by the time of laser radiation, as well as process parameters, such as hatch space, scanning velocity, and laser power, which were consistent with the experiment-based findings. The applicability of this model to temperature simulation was demonstrated by a comparative study, wherein the peak temperature in fusion process was simulated in two scenarios, with and without considering the effect of material state and phase changes on laser absorption, and the simulated results in the two scenarios were then compared with experimental data respectively.
NASA Astrophysics Data System (ADS)
Peer, Regina; Peer, Siegfried; Sander, Heike; Marsolek, Ingo; Koller, Wolfgang; Pappert, Dirk; Hierholzer, Johannes
2002-05-01
If new technology is introduced into medical practice, it must prove that it makes a difference. However, traditional approaches to outcome analysis have failed to show a direct benefit of PACS on patient care, and economic benefits are still in debate. A participatory process analysis was performed to compare workflow in a film-based hospital and in a PACS environment. This included direct observation of work processes, interviews with involved staff, structural analysis, and discussion of observations with staff members. After definition of common structures, strong and weak workflow steps were evaluated. Within a common workflow structure in both hospitals, benefits of PACS were revealed in workflow steps related to image reporting, with simultaneous image access for ICU physicians and radiologists, archiving of images, and image and report distribution. However, PACS alone is not able to cover the complete process of "radiography for intensive care" from the ordering of an image to the provision of the final product (image + report). Interference of electronic workflow with analogue process steps, such as paper-based ordering, reduces the potential benefits of PACS. In this regard, workflow modeling proved to be very helpful for the evaluation of complex work processes linking radiology and the ICU.
Marisch, Karoline; Bayer, Karl; Scharl, Theresa; Mairhofer, Juergen; Krempl, Peter M.; Hummel, Karin; Razzazi-Fazeli, Ebrahim; Striedner, Gerald
2013-01-01
Escherichia coli K–12 and B strains are among the most frequently used bacterial hosts for production of recombinant proteins on an industrial scale. To improve existing processes and to accelerate bioprocess development, we performed a detailed host analysis. We investigated the different behaviors of the E. coli production strains BL21, RV308, and HMS174 in response to high-glucose concentrations. Tightly controlled cultivations were conducted under defined environmental conditions for the in-depth analysis of physiological behavior. In addition to acquisition of standard process parameters, we also used DNA microarray analysis and differential gel electrophoresis (Ettan™ DIGE). Batch cultivations showed different yields of the distinct strains for cell dry mass and growth rate, which were highest for BL21. In addition, production of acetate, triggered by excess glucose supply, was much higher for the K–12 strains compared to the B strain. Analysis of transcriptome data showed significant alteration in 347 of 3882 genes common among all three hosts. These differentially expressed genes included, for example, those involved in transport, iron acquisition, and motility. The investigation of proteome patterns additionally revealed a high number of differentially expressed proteins among the investigated hosts. The subsequently selected 38 spots included proteins involved in transport and motility. The results of this comprehensive analysis delivered a full genomic picture of the three investigated strains. Differentially expressed groups for targeted host modification, such as glucose transport or iron acquisition, were identified, enabling potential optimization of strains to improve yield and process quality. Dissimilar growth profiles of the strains confirm different genotypes. Furthermore, distinct transcriptome patterns support differential regulation at the genome level. The identified proteins showed high agreement with the transcriptome data and suggest similar regulation within a host at both levels for the identified groups. Such host attributes need to be considered in future process design and operation. PMID:23950949
Age Effects in L2 Grammar Processing as Revealed by ERPs and How (Not) to Study Them
Meulman, Nienke; Wieling, Martijn; Sprenger, Simone A.; Schmid, Monika S.
2015-01-01
In this study we investigate the effect of age of acquisition (AoA) on grammatical processing in second language learners as measured by event-related brain potentials (ERPs). We compare a traditional analysis involving the calculation of averages across a certain time window of the ERP waveform, analyzed with categorical groups (early vs. late), with a generalized additive modeling analysis, which allows us to take into account the full range of variability in both AoA and time. Sixty-six Slavic advanced learners of German listened to German sentences with correct and incorrect use of non-finite verbs and grammatical gender agreement. We show that the ERP signal depends on the AoA of the learner, as well as on the regularity of the structure under investigation. For gender agreement, a gradual change in processing strategies can be shown that varies by AoA, with younger learners showing a P600 and older learners showing a posterior negativity. For verb agreement, all learners show a P600 effect, irrespective of AoA. Based on their behavioral responses in an offline grammaticality judgment task, we argue that the late learners resort to computationally less efficient processing strategies when confronted with (lexically determined) syntactic constructions different from the L1. In addition, this study highlights the insights the explicit focus on the time course of the ERP signal in our analysis framework can offer compared to the traditional analysis. PMID:26683335
A Comparative Analysis of the Processes of Social Mobility in the USSR and in Today's Russia
ERIC Educational Resources Information Center
Shkaratan, O. I.; Iastrebov, G. A.
2012-01-01
When it comes to analyzing problems of mobility, most studies of the post-Soviet era have cited random and unconnected data with respect to the Soviet era, on the principle of comparing "the old" and "the new." The authors have deemed it possible (although based on material that is not fully comparable) to examine the late…
On selecting a prior for the precision parameter of Dirichlet process mixture models
Dorazio, R.M.
2009-01-01
In hierarchical mixture models the Dirichlet process is used to specify latent patterns of heterogeneity, particularly when the distribution of latent parameters is thought to be clustered (multimodal). The parameters of a Dirichlet process include a precision parameter α and a base probability measure G0. In problems where α is unknown and must be estimated, inferences about the level of clustering can be sensitive to the choice of prior assumed for α. In this paper an approach is developed for computing a prior for the precision parameter α that can be used in the presence or absence of prior information about the level of clustering. This approach is illustrated in an analysis of counts of stream fishes. The results of this fully Bayesian analysis are compared with an empirical Bayes analysis of the same data and with a Bayesian analysis based on an alternative commonly used prior.
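As a hedged illustration of the sensitivity discussed above (not code from the paper), the prior on the precision parameter α directly controls the expected number of clusters among n observations, E[K] = sum over i = 0..n-1 of α/(α+i). The short Python sketch below compares this implied prior belief under two hypothetical gamma priors on α; the hyperparameters and sample size are illustrative assumptions.

    import numpy as np

    def expected_clusters(alpha, n):
        """E[K] for a Dirichlet process: sum_{i=0}^{n-1} alpha / (alpha + i)."""
        i = np.arange(n)
        return np.sum(alpha / (alpha + i))

    rng = np.random.default_rng(0)
    n = 100  # sample size (illustrative)
    for shape, rate in [(1.0, 1.0), (2.0, 0.1)]:  # hypothetical gamma priors on alpha
        alphas = rng.gamma(shape, 1.0 / rate, size=5000)
        prior_k = np.array([expected_clusters(a, n) for a in alphas])
        print(f"gamma({shape}, {rate}) prior -> mean prior E[K] = {prior_k.mean():.1f}")

Very different priors on α can thus imply very different prior beliefs about the number of clusters, which is exactly why the paper's prior construction matters.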
NASA Technical Reports Server (NTRS)
Meng, J. C. S.; Thomson, J. A. L.
1975-01-01
A data analysis program constructed to assess LDV system performance, to validate the simulation model, and to test various vortex location algorithms is presented. Real or simulated Doppler spectra versus range and elevation is used and the spatial distributions of various spectral moments or other spectral characteristics are calculated and displayed. Each of the real or simulated scans can be processed by one of three different procedures: simple frequency or wavenumber filtering, matched filtering, and deconvolution filtering. The final output is displayed as contour plots in an x-y coordinate system, as well as in the form of vortex tracks deduced from the maxima of the processed data. A detailed analysis of run number 1023 and run number 2023 is presented to demonstrate the data analysis procedure. Vortex tracks and system range resolutions are compared with theoretical predictions.
Supporting Handoff in Asynchronous Collaborative Sensemaking Using Knowledge-Transfer Graphs.
Zhao, Jian; Glueck, Michael; Isenberg, Petra; Chevalier, Fanny; Khan, Azam
2018-01-01
During asynchronous collaborative analysis, handoff of partial findings is challenging because externalizations produced by analysts may not adequately communicate their investigative process. To address this challenge, we developed techniques to automatically capture and help encode tacit aspects of the investigative process based on an analyst's interactions, and streamline explicit authoring of handoff annotations. We designed our techniques to mediate awareness of analysis coverage, support explicit communication of progress and uncertainty with annotation, and implicit communication through playback of investigation histories. To evaluate our techniques, we developed an interactive visual analysis system, KTGraph, that supports an asynchronous investigative document analysis task. We conducted a two-phase user study to characterize a set of handoff strategies and to compare investigative performance with and without our techniques. The results suggest that our techniques promote the use of more effective handoff strategies, help increase an awareness of prior investigative process and insights, as well as improve final investigative outcomes.
Conceptual model of iCAL4LA: Proposing the components using comparative analysis
NASA Astrophysics Data System (ADS)
Ahmad, Siti Zulaiha; Mutalib, Ariffin Abdul
2016-08-01
This paper discusses an ongoing study that initiates the process of determining the common components for a conceptual model of interactive computer-assisted learning that is specifically designed for low-achieving children. This group of children needs specific learning support that can be used as an alternative learning material in their learning environment. In order to develop the conceptual model, this study extracts the common components from 15 strongly justified computer-assisted learning studies. A comparative analysis was conducted to determine the most appropriate components, using a set of specific indication classifications to prioritize applicability. The results of the extraction process reveal 17 common components for consideration. Later, based on scientific justifications, 16 of them were selected as the proposed components for the model.
Budget analysis of Escherichia coli at a southern Lake Michigan Beach
Thupaki, P.; Phanikumar, M.S.; Beletsky, D.; Schwab, D.J.; Nevers, M.B.; Whitman, R.L.
2010-01-01
Escherichia coli (EC) concentrations at two beaches impacted by river plume dynamics in southern Lake Michigan were analyzed using three-dimensional hydrodynamic and transport models. The relative importance of various physical and biological processes influencing the fate and transport of EC was examined via budget analysis and a first-order sensitivity analysis of model parameters. The along-shore advective flux of EC (CFU/m²·s) was found to be higher compared to its cross-shore counterpart; however, the sum of diffusive and advective components was of a comparable magnitude in both directions, showing the importance of cross-shore exchange in EC transport. Examination of individual terms in the EC mass balance equation showed that vertical turbulent mixing in the water column dominated the overall EC transport for the summer conditions simulated. Dilution due to advection and diffusion accounted for a large portion of the total EC budget in the nearshore, and the net EC loss rate within the water column (CFU/m³·s) was an order of magnitude smaller compared to the horizontal and vertical transport rates. This result has important implications for modeling EC at recreational beaches; however, the assessment of the magnitude of EC loss rate is complicated due to the strong coupling between vertical exchange and depth-dependent EC loss processes such as sunlight inactivation and settling. Sensitivity analysis indicated that solar inactivation has the greatest impact on EC loss rates. Although these results are site-specific, they clearly bring out the relative importance of various processes involved.
NASA Technical Reports Server (NTRS)
Goldman, H.; Wolf, M.
1978-01-01
The significant economic data for the current production multiblade wafering and inner diameter slicing processes were tabulated and compared to data on the experimental and projected multiblade slurry, STC ID diamond coated blade, multiwire slurry and crystal systems fixed abrasive multiwire slicing methods. Cost calculations were performed for current production processes and for 1982 and 1986 projected wafering techniques.
Tomas, Merve; Beekwilder, Jules; Hall, Robert D; Sagdic, Osman; Boyacioglu, Dilek; Capanoglu, Esra
2017-04-01
The effects of industrial and home processing and of in vitro gastrointestinal digestion on the individual phenolic content and antioxidant capacity of tomato processed into tomato sauce were investigated. Industrial processing of tomato fruit into sauce had an overall positive effect on the total antioxidant capacity (∼1.2-fold higher) compared to tomato fruit, whereas home processing of tomato fruit into sauce led to a decrease in these values. Untargeted LC-QTOF-MS analysis revealed 31 compounds in tomato that changed upon processing, of which 18 could be putatively identified. Naringenin chalcone is only detectable in the fruit, while naringenin is strongly increased in the sauces. Rutin content increased by 36% in the industrially processed sauce, whereas it decreased by 26% in the home-processed sauce when compared to fruit. According to the results of an in vitro gastrointestinal digestion model, industrial processing may lead to enhanced bioaccessibility of antioxidants. Copyright © 2016 Elsevier Ltd. All rights reserved.
Modeling the Structural Dynamic of Industrial Networks
NASA Astrophysics Data System (ADS)
Wilkinson, Ian F.; Wiley, James B.; Lin, Aizhong
Market systems consist of locally interacting agents who continuously pursue advantageous opportunities. Since the time of Adam Smith, a fundamental task of economics has been to understand how market systems develop and to explain their operation. During the intervening years, theory largely has stressed comparative statics analysis. Based on the assumptions of rational, utility- or profit-maximizing agents and negative (diminishing returns) feedback processes, traditional economic analysis seeks to describe the (generally) unique state of an economy corresponding to an initial set of assumptions. The analysis is static in the sense that it does not describe the process by which an economy might get from one state to another.
Dunford, Elizabeth; Webster, Jacqui; Metzler, Adriana Blanco; Czernichow, Sebastien; Ni Mhurchu, Cliona; Wolmarans, Petro; Snowdon, Wendy; L'Abbe, Mary; Li, Nicole; Maulik, Pallab K; Barquera, Simon; Schoj, Verónica; Allemandi, Lorena; Samman, Norma; de Menezes, Elizabete Wenzel; Hassell, Trevor; Ortiz, Johana; Salazar de Ariza, Julieta; Rahman, A Rashid; de Núñez, Leticia; Garcia, Maria Reyes; van Rossum, Caroline; Westenbrink, Susanne; Thiam, Lim Meng; MacGregor, Graham; Neal, Bruce
2012-12-01
Chronic diseases are the leading cause of premature death and disability in the world with overnutrition a primary cause of diet-related ill health. Excess energy intake, saturated fat, sugar, and salt derived from processed foods are a major cause of disease burden. Our objective is to compare the nutritional composition of processed foods between countries, between food companies, and over time. Surveys of processed foods will be done in each participating country using a standardized methodology. Information on the nutrient composition for each product will be sought either through direct chemical analysis, from the product label, or from the manufacturer. Foods will be categorized into 14 groups and 45 categories for the primary analyses which will compare mean levels of nutrients at baseline and over time. Initial commitments to collaboration have been obtained from 21 countries. This collaborative approach to the collation and sharing of data will enable objective and transparent tracking of processed food composition around the world. The information collected will support government and food industry efforts to improve the nutrient composition of processed foods around the world.
ERIC Educational Resources Information Center
Meleta, Fufa E.; Zhang, Weizhong
2017-01-01
The main objective of this study is to compare the process of the senior secondary school mathematics curricula development in Ethiopia and Australia. The study was investigated qualitatively with document analysis and semi-structured interview research methods. The documents were collected from Federal Democratic Republic of Ethiopia Ministry of…
Perceived Stress in Chronic Illness: A Comparative Analysis of Four Diseases.
ERIC Educational Resources Information Center
Revenson, Tracey A.; Felton, Barbara J.
Most studies of stress and coping processes among patients with serious illnesses have focused on acute illness states. Far less research has involved systematic examination of the types and frequency of illness-related stresses experienced by individuals living with chronic illness. To compare the nature and degree of illness-related stress posed…
Comparative transcriptomic analysis of silkworm Bmovo-1 and wild type silkworm ovary
Xue, Renyu; Hu, Xiaolong; Zhu, Liyuan; Cao, Guangli; Huang, Moli; Xue, Gaoxu; Song, Zuowei; Lu, Jiayu; Chen, Xueying; Gong, Chengliang
2015-01-01
The detailed molecular mechanism of Bmovo-1 regulation of ovary size is unclear. To uncover the mechanism of Bmovo-1 regulation of ovarian development and oogenesis using RNA-Seq, we compared the transcriptomes of wild type (WT) and Bmovo-1-overexpressing silkworm (silkworm+Bmovo-1) ovaries. Using a paired-end Illumina Solexa sequencing strategy, 5,296,942 total reads were obtained from silkworm+Bmovo-1 ovaries and 6,306,078 from WT ovaries. The average read length was about 100 bp. Clean read ratios were 98.79% for silkworm+Bmovo-1 and 98.87% for WT silkworm ovaries. Comparative transcriptome analysis showed 123 upregulated and 111 downregulated genes in silkworm+Bmovo-1 ovaries. These differentially expressed genes were enriched in the extracellular and extracellular spaces and involved in metabolism, genetic information processing, environmental information processing, cellular processes and organismal systems. Bmovo-1 overexpression in silkworm ovaries might promote anabolism for ovarian development and oogenesis and oocyte proliferation and transport of nutrients to ovaries by altering nutrient partitioning, which would support ovary development. Excessive consumption of nutrients for ovary development alters nutrient partitioning and deters silk protein synthesis. PMID:26643037
2014-01-01
Background: Support vector regression (SVR) and Gaussian process regression (GPR) were used for the analysis of electroanalytical experimental data to estimate diffusion coefficients. Results: For simulated cyclic voltammograms based on the EC, Eqr, and EqrC mechanisms, these regression algorithms in combination with nonlinear kernel/covariance functions yielded diffusion coefficients with higher accuracy as compared to the standard approach of calculating diffusion coefficients relying on the Nicholson-Shain equation. The level of accuracy achieved by SVR and GPR is virtually independent of the rate constants governing the respective reaction steps. Further, the reduction of high-dimensional voltammetric signals by manual selection of typical voltammetric peak features decreased the performance of both regression algorithms compared to a reduction by downsampling or principal component analysis. After training on simulated data sets, diffusion coefficients were estimated by the regression algorithms for experimental data comprising voltammetric signals for three organometallic complexes. Conclusions: Estimated diffusion coefficients closely matched the values determined by the parameter fitting method, but reduced the required computational time considerably for one of the reaction mechanisms. The automated processing of voltammograms according to the regression algorithms yields better results than the conventional analysis of peak-related data. PMID:24987463
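A minimal sketch of the regression setup under stated assumptions (scikit-learn, RBF kernels, and a synthetic stand-in for the simulated voltammograms; the paper's simulation code and hyperparameters are not given in the abstract):

    import numpy as np
    from sklearn.svm import SVR
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    rng = np.random.default_rng(1)
    # Stand-in for simulated voltammograms: each row is a downsampled current trace;
    # the target is the log10 diffusion coefficient used to generate it (synthetic here).
    X = rng.normal(size=(200, 50))
    w = rng.normal(size=50)
    y = -5.0 + 0.1 * (X @ w) + rng.normal(scale=0.05, size=200)  # log10(D), illustrative

    svr = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y)
    gpr = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(X, y)

    x_new = rng.normal(size=(1, 50))  # an "experimental" voltammogram, downsampled
    print(svr.predict(x_new), gpr.predict(x_new))

The downsampling of the full signal, rather than hand-picked peak features, mirrors the reduction strategy the abstract reports as more effective.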
Villanova, Federica; Di Meglio, Paola; Inokuma, Margaret; Aghaeepour, Nima; Perucha, Esperanza; Mollon, Jennifer; Nomura, Laurel; Hernandez-Fuentes, Maria; Cope, Andrew; Prevost, A. Toby; Heck, Susanne; Maino, Vernon; Lord, Graham; Brinkman, Ryan R.; Nestle, Frank O.
2013-01-01
Discovery of novel immune biomarkers for monitoring of disease prognosis and response to therapy in immune-mediated inflammatory diseases is an important unmet clinical need. Here, we establish a novel framework for immunological biomarker discovery, comparing a conventional (liquid) flow cytometry platform (CFP) and a unique lyoplate-based flow cytometry platform (LFP) in combination with advanced computational data analysis. We demonstrate that LFP had higher sensitivity compared to CFP, with increased detection of cytokines (IFN-γ and IL-10) and activation markers (Foxp3 and CD25). Fluorescent intensity of cells stained with lyophilized antibodies was increased compared to cells stained with liquid antibodies. LFP, using a plate loader, allowed medium-throughput processing of samples with comparable intra- and inter-assay variability between platforms. Automated computational analysis identified novel immunophenotypes that were not detected with manual analysis. Our results establish a new flow cytometry platform for standardized and rapid immunological biomarker discovery with wide application to immune-mediated diseases. PMID:23843942
Munro, Sarah A; Lund, Steven P; Pine, P Scott; Binder, Hans; Clevert, Djork-Arné; Conesa, Ana; Dopazo, Joaquin; Fasold, Mario; Hochreiter, Sepp; Hong, Huixiao; Jafari, Nadereh; Kreil, David P; Łabaj, Paweł P; Li, Sheng; Liao, Yang; Lin, Simon M; Meehan, Joseph; Mason, Christopher E; Santoyo-Lopez, Javier; Setterquist, Robert A; Shi, Leming; Shi, Wei; Smyth, Gordon K; Stralis-Pavese, Nancy; Su, Zhenqiang; Tong, Weida; Wang, Charles; Wang, Jian; Xu, Joshua; Ye, Zhan; Yang, Yong; Yu, Ying; Salit, Marc
2014-09-25
There is a critical need for standard approaches to assess, report and compare the technical performance of genome-scale differential gene expression experiments. Here we assess technical performance with a proposed standard 'dashboard' of metrics derived from analysis of external spike-in RNA control ratio mixtures. These control ratio mixtures with defined abundance ratios enable assessment of diagnostic performance of differentially expressed transcript lists, limit of detection of ratio (LODR) estimates and expression ratio variability and measurement bias. The performance metrics suite is applicable to analysis of a typical experiment, and here we also apply these metrics to evaluate technical performance among laboratories. An interlaboratory study using identical samples shared among 12 laboratories with three different measurement processes demonstrates generally consistent diagnostic power across 11 laboratories. Ratio measurement variability and bias are also comparable among laboratories for the same measurement process. We observe different biases for measurement processes using different mRNA-enrichment protocols.
NASA Technical Reports Server (NTRS)
Stoner, Mary Cecilia; Hehir, Austin R.; Ivanco, Marie L.; Domack, Marcia S.
2016-01-01
This cost-benefit analysis assesses the benefits of the Advanced Near Net Shape Technology (ANNST) manufacturing process for fabricating integrally stiffened cylinders. These preliminary, rough order-of-magnitude results report a 46 to 58 percent reduction in production costs and a 7-percent reduction in weight over the conventional metallic manufacturing technique used in this study for comparison. Production cost savings of 35 to 58 percent were reported over the composite manufacturing technique used in this study for comparison; however, the ANNST concept was heavier. In this study, the predicted return on investment of equipment required for the ANNST method was ten cryogenic tank barrels when compared with conventional metallic manufacturing. The ANNST method was compared with the conventional multi-piece metallic construction and composite processes for fabricating integrally stiffened cylinders. A case study compared these three alternatives for manufacturing a cylinder of specified geometry, with particular focus placed on production costs and process complexity, with cost analyses performed by the analogy and parametric methods. Furthermore, a scalability study was conducted for three tank diameters to assess the highest potential payoff of the ANNST process for manufacture of large-diameter cryogenic tanks. The analytical hierarchy process (AHP) was subsequently used with a group of selected subject matter experts to assess the value of the various benefits achieved by the ANNST method for potential stakeholders. The AHP study results revealed that decreased final cylinder mass and quality assurance were the most valued benefits of cylinder manufacturing methods, therefore emphasizing the relevance of the benefits achieved with the ANNST process for future projects.
Ellenberger, Daniel J; Miller, Dave A; Kucera, Sandra U; Williams, Robert O
2018-03-14
Vemurafenib is a poorly soluble, low permeability drug that has a demonstrated need for a solubility-enhanced formulation. However, conventional approaches for amorphous solid dispersion production are challenging due to the physiochemical properties of the compound. A suitable and novel method for creating an amorphous solid dispersion, known as solvent-controlled coprecipitation, was developed to make a material known as microprecipitated bulk powder (MBP). However, this approach has limitations in its processing and formulation space. In this study, it was hypothesized that vemurafenib can be processed by KinetiSol into the same amorphous formulation as MBP. The KinetiSol process utilizes high shear to rapidly process amorphous solid dispersions containing vemurafenib. Analysis of the material demonstrated that KinetiSol produced amorphous, single-phase material with acceptable chemical purity and stability. Values obtained were congruent to analysis conducted on the comparator material. However, the materials differed in particle morphology as the KinetiSol material was dense, smooth, and uniform while the MBP comparator was porous in structure and exhibited high surface area. The particles produced by KinetiSol had improved in-vitro dissolution and pharmacokinetic performance for vemurafenib compared to MBP due to slower drug nucleation and recrystallization which resulted in superior supersaturation maintenance during drug release. In the in-vivo rat pharmacokinetic study, both amorphous solid dispersions produced by KinetiSol exhibited mean AUC values at least two-fold that of MBP when dosed as a suspension. It was concluded that the KinetiSol process produced superior dosage forms containing vemurafenib with the potential for substantial reduction in patient pill burden.
ERIC Educational Resources Information Center
Mun, Eun Young; von Eye, Alexander; Bates, Marsha E.; Vaschillo, Evgeny G.
2008-01-01
Model-based cluster analysis is a new clustering procedure to investigate population heterogeneity utilizing finite mixture multivariate normal densities. It is an inferentially based, statistically principled procedure that allows comparison of nonnested models using the Bayesian information criterion to compare multiple models and identify the…
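A minimal sketch of model-based clustering with BIC model comparison, using scikit-learn's GaussianMixture on synthetic data (the authors' software and settings are not specified in the abstract; the component counts and data below are illustrative):

    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(2)
    # Two synthetic subpopulations standing in for heterogeneous participants.
    data = np.vstack([rng.normal(0, 1, size=(100, 3)),
                      rng.normal(3, 1, size=(80, 3))])

    fits = {k: GaussianMixture(n_components=k, n_init=5, random_state=0).fit(data)
            for k in range(1, 6)}
    bics = {k: m.bic(data) for k, m in fits.items()}
    best_k = min(bics, key=bics.get)  # lower BIC indicates the preferred model
    print(bics, "-> selected", best_k, "components")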
ERIC Educational Resources Information Center
Abramson, Neil Remington; Senyshyn, Yaroslav
2009-01-01
Archetypal psychology suggests the possibility of a punishment archetype representing the unconscious preferences of human beings as a species about what constitutes appropriate ways for leaders (students, teachers and educational leaders) to correct followers who do harm to others. Mythological analysis compared God's process of punishment, in…
Reading in Two Languages: A Comparative Miscue Analysis
ERIC Educational Resources Information Center
Mikulec, Erin
2015-01-01
The purpose of the present study is to investigate what miscue analysis, a method described as a "window" into the reading process, can reveal about first and second language reading. Two participants, native English speakers proficient in Spanish, read and retold two folktales: one in English and one in Spanish. The researcher performed…
ERIC Educational Resources Information Center
Lyons, Elizabeth A.; Rue, Hanna C.; Luiselli, James K.; DiGennaro, Florence D.
2007-01-01
Rumination is a serious problem demonstrated by some people with developmental disabilities, but previous research has not included a functional analysis and has rarely compared intervention methods during the assessment process. We conducted functional analyses with 2 children who displayed postmeal rumination and subsequently evaluated a…
Life Cycle Impact Analysis (LCIA) has proven to be a valuable tool for systematically comparing processes and products, and has been proposed for use in Chemical Alternatives Analysis (CAA). The exposure assessment portion of the human health impact scores of LCIA has historicall...
Neutral model analysis of landscape patterns from mathematical morphology
Kurt H. Riitters; Peter Vogt; Pierre Soille; Jacek Kozak; Christine Estreguil
2007-01-01
Mathematical morphology encompasses methods for characterizing land-cover patterns in ecological research and biodiversity assessments. This paper reports a neutral model analysis of patterns in the absence of a structuring ecological process, to help set standards for comparing and interpreting patterns identified by mathematical morphology on real land-cover maps. We...
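A hedged sketch of the neutral-model idea: generate a random binary map with no structuring ecological process and apply a simple morphological operator to classify core versus edge pixels. It uses scipy.ndimage with an illustrative occupancy and is not the authors' analysis code:

    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(3)
    p = 0.4                              # occupancy of the neutral map (illustrative)
    land = rng.random((256, 256)) < p    # random binary land-cover map, no process

    core = ndimage.binary_erosion(land, structure=np.ones((3, 3)))  # interior pixels
    edge = land & ~core                                             # boundary pixels
    print("core fraction:", core.sum() / land.sum(),
          "edge fraction:", edge.sum() / land.sum())

Pattern-class fractions measured on such neutral maps give the baseline against which patterns on real land-cover maps can be compared.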
GBOOST: a GPU-based tool for detecting gene-gene interactions in genome-wide case control studies.
Yung, Ling Sing; Yang, Can; Wan, Xiang; Yu, Weichuan
2011-05-01
Collecting millions of genetic variations is feasible with the advanced genotyping technology. With a huge amount of genetic variations data in hand, developing efficient algorithms to carry out the gene-gene interaction analysis in a timely manner has become one of the key problems in genome-wide association studies (GWAS). Boolean operation-based screening and testing (BOOST), a recent work in GWAS, completes gene-gene interaction analysis in 2.5 days on a desktop computer. Compared with central processing units (CPUs), graphic processing units (GPUs) are highly parallel hardware and provide massive computing resources. We are, therefore, motivated to use GPUs to further speed up the analysis of gene-gene interactions. We implement the BOOST method based on a GPU framework and name it GBOOST. GBOOST achieves a 40-fold speedup compared with BOOST. It completes the analysis of Wellcome Trust Case Control Consortium Type 2 Diabetes (WTCCC T2D) genome data within 1.34 h on a desktop computer equipped with Nvidia GeForce GTX 285 display card. GBOOST code is available at http://bioinformatics.ust.hk/BOOST.html#GBOOST.
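A hedged sketch of the Boolean-encoding idea underlying BOOST/GBOOST: genotype indicator vectors are packed into machine words so that contingency-table counts for a SNP pair reduce to bitwise AND plus popcount, the kernel that the GPU parallelizes across all pairs. The Python below is illustrative, not the released implementation:

    import numpy as np

    rng = np.random.default_rng(4)
    n = 64_000
    snp1 = rng.integers(0, 3, n)  # genotypes coded 0/1/2
    snp2 = rng.integers(0, 3, n)

    # One Boolean mask per genotype value, packed into bytes; counts for the
    # 3x3 contingency table then become bitwise AND plus popcount.
    masks1 = [np.packbits(snp1 == g) for g in range(3)]
    masks2 = [np.packbits(snp2 == g) for g in range(3)]

    def popcount(words):
        # Count set bits by unpacking to individual bits (simple, not the fastest).
        return int(np.unpackbits(words).sum())

    table = np.array([[popcount(a & b) for b in masks2] for a in masks1])
    print(table, table.sum() == n)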
Barimani, Shirin; Kleinebudde, Peter
2017-10-01
A multivariate analysis method, Science-Based Calibration (SBC), was used for the first time for endpoint determination of a tablet coating process using Raman data. Two types of tablet cores, placebo and caffeine cores, received a coating suspension comprising a polyvinyl alcohol-polyethylene glycol graft-copolymer and titanium dioxide to a maximum coating thickness of 80 µm. Raman spectroscopy was used as an in-line PAT tool. The spectra were acquired every minute and correlated to the amount of applied aqueous coating suspension. SBC was compared to another well-known multivariate analysis method, Partial Least Squares regression (PLS), and a simpler approach, Univariate Data Analysis (UVDA). All developed calibration models had coefficient of determination values (R²) higher than 0.99. The coating endpoints could be predicted with root mean square errors of prediction (RMSEP) less than 3.1% of the applied coating suspensions. Compared to PLS and UVDA, SBC proved to be an alternative multivariate calibration method with high predictive power. Copyright © 2017 Elsevier B.V. All rights reserved.
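SBC is not available in common open-source libraries, so the hedged sketch below shows only the PLS comparator on synthetic stand-in spectra (the band shape, noise level, and component count are assumptions, not the study's data):

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.metrics import mean_squared_error
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(5)
    # Synthetic stand-in spectra: a Gaussian Raman band whose height scales with
    # the amount of coating suspension applied.
    coating = rng.uniform(0, 100, size=300)                 # % of suspension applied
    band = np.exp(-0.5 * ((np.arange(500) - 250) / 10.0) ** 2)
    spectra = coating[:, None] * band[None, :] + rng.normal(scale=0.5, size=(300, 500))

    X_tr, X_te, y_tr, y_te = train_test_split(spectra, coating, random_state=0)
    pls = PLSRegression(n_components=3).fit(X_tr, y_tr)
    rmsep = np.sqrt(mean_squared_error(y_te, pls.predict(X_te).ravel()))
    print(f"RMSEP = {rmsep:.2f} % of applied suspension")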
Balz, Johanna; Roa Romero, Yadira; Keil, Julian; Krebber, Martin; Niedeggen, Michael; Gallinat, Jürgen; Senkowski, Daniel
2016-01-01
Recent behavioral and neuroimaging studies have suggested multisensory processing deficits in patients with schizophrenia (SCZ). Thus far, the neural mechanisms underlying these deficits are not well understood. Previous studies with unisensory stimulation have shown altered neural oscillations in SCZ. As such, altered oscillations could contribute to aberrant multisensory processing in this patient group. To test this assumption, we conducted an electroencephalography (EEG) study in 15 SCZ and 15 control participants in whom we examined neural oscillations and event-related potentials (ERPs) in the sound-induced flash illusion (SIFI). In the SIFI multiple auditory stimuli that are presented alongside a single visual stimulus can induce the illusory percept of multiple visual stimuli. In SCZ and control participants we compared ERPs and neural oscillations between trials that induced an illusion and trials that did not induce an illusion. On the behavioral level, SCZ (55.7%) and control participants (55.4%) did not significantly differ in illusion rates. The analysis of ERPs revealed diminished amplitudes and altered multisensory processing in SCZ compared to controls around 135 ms after stimulus onset. Moreover, the analysis of neural oscillations revealed altered 25–35 Hz power after 100 to 150 ms over occipital scalp for SCZ compared to controls. Our findings extend previous observations of aberrant neural oscillations in unisensory perception paradigms. They suggest that altered ERPs and altered occipital beta/gamma band power reflect aberrant multisensory processing in SCZ. PMID:27999553
A comparison of microwave versus direct solar heating for lunar brick production
NASA Technical Reports Server (NTRS)
Yankee, S. J.; Strenski, D. G.; Pletka, B. J.; Patil, D. S.; Mutsuddy, B. C.
1990-01-01
Two processing techniques considered suitable for producing bricks from lunar regolith are examined: direct solar heating and microwave heating. An analysis was performed to compare the two processes in terms of the amount of power and time required to fabricate bricks of various sizes. Microwave heating was shown to be significantly faster than solar heating for rapid production of realistic-size bricks. However, the relative simplicity of the solar collector(s) used for the solar furnace compared to the equipment necessary for microwave generation may present an economic tradeoff.
Enhanced round robin CPU scheduling with burst time based time quantum
NASA Astrophysics Data System (ADS)
Indusree, J. R.; Prabadevi, B.
2017-11-01
Process scheduling is a very important function of an operating system. The best-known process-scheduling algorithms are the First Come First Serve (FCFS) algorithm, the Round Robin (RR) algorithm, the Priority scheduling algorithm and the Shortest Job First (SJF) algorithm. Compared to its peers, the RR algorithm has the advantage that it gives a fair share of the CPU to the processes that are already in the ready queue. The effectiveness of the RR algorithm depends greatly on the chosen time quantum value. Through this research paper, we propose an enhanced algorithm called the Enhanced Round Robin with Burst-time based Time Quantum (ERRBTQ) process scheduling algorithm, which calculates the time quantum from the burst times of the processes already in the ready queue. The experimental results and analysis of the ERRBTQ algorithm clearly indicate improved performance when compared with conventional RR and its variants.
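The abstract does not state the exact quantum formula, so the following minimal sketch assumes the quantum is recomputed each round as the mean remaining burst time of the ready queue; it illustrates the idea, not the authors' exact algorithm:

    from collections import deque

    def errbtq(burst_times):
        """Round robin with a dynamic quantum set to the mean remaining burst
        time of the ready queue (assumed rule; the paper's may differ)."""
        ready = deque(enumerate(burst_times))
        remaining = list(burst_times)
        t, completion = 0, {}
        while ready:
            q = max(1, round(sum(remaining[p] for p, _ in ready) / len(ready)))
            for _ in range(len(ready)):       # one slice per process this round
                pid, _ = ready.popleft()
                run = min(q, remaining[pid])
                t += run
                remaining[pid] -= run
                if remaining[pid] == 0:
                    completion[pid] = t
                else:
                    ready.append((pid, remaining[pid]))
        return completion

    print(errbtq([24, 3, 3]))  # completion time per process id

Recomputing the quantum from the queue's burst times is what distinguishes this family of variants from fixed-quantum RR.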
Ptok, M; Meisen, R
2008-01-01
The rapid auditory processing deficit theory holds that impaired reading/writing skills are not caused exclusively by a cognitive deficit specific to the representation and processing of speech sounds but arise due to sensory, mainly auditory, deficits. To further explore this theory we compared different measures of auditory low-level skills to writing skills in school children. Design: prospective study. Participants: school children attending third and fourth grade. Measures: just noticeable differences for intensity and frequency (JNDI, JNDF), gap detection (GD), binaural and monaural temporal order judgement (TOJb and TOJm); grades in writing, language and mathematics. Analysis: correlation analysis. No relevant correlation was found between any auditory low-level processing variable and writing skills. These data do not support the rapid auditory processing deficit theory.
The costs of emotional attention: affective processing inhibits subsequent lexico-semantic analysis.
Ihssen, Niklas; Heim, Sabine; Keil, Andreas
2007-12-01
The human brain has evolved to process motivationally relevant information in an optimized manner. The perceptual benefit for emotionally arousing material, termed motivated attention, is indexed by electrocortical amplification at various levels of stimulus analysis. An outstanding issue, particularly on a neuronal level, refers to whether and how perceptual enhancement for arousing signals translates into modified processing of information presented in temporal or spatial proximity to the affective cue. The present studies aimed to examine facilitation and interference effects of task-irrelevant emotional pictures on subsequent word identification. In the context of forced-choice lexical decision tasks, pictures varying in hedonic valence and emotional arousal preceded word/pseudoword targets. Across measures and experiments, high-arousing compared to low-arousing pictures were associated with impaired processing of word targets. Arousing pleasant and unpleasant pictures prolonged word reaction times irrespective of stimulus-onset asynchrony (80 msec, 200 msec, 440 msec) and salient semantic category differences (e.g., erotica vs. mutilation pictures). On a neuronal level, interference was reflected in reduced N1 responses (204-264 msec) to both target types. Paralleling behavioral effects, suppression of the late positivity (404-704 msec) was more pronounced for word compared to pseudoword targets. Regional source modeling indicated that early reduction effects originated from inhibited cortical activity in posterior areas of the left inferior temporal cortex associated with orthographic processing. Modeling of later reduction effects argues for interference in distributed semantic networks comprising left anterior temporal and parietal sources. Thus, affective processing interferes with subsequent lexico-semantic analysis along the ventral stream.
A comparative analysis of electronic and molecular quantum dot cellular automata
DOE Office of Scientific and Technical Information (OSTI.GOV)
Umamahesvari, H., E-mail: umamaheswarihema@gmail.com; Ajitha, D., E-mail: ajithavijay1@gmail.com
This paper presents a comparative analysis of electronic quantum-dot cellular automata (EQCA) and magnetic quantum-dot cellular automata (MQCA). QCA is a computing paradigm that encodes and processes information by the position of individual electrons. To enable highly dense, ultra-low-power devices, various research efforts have been actively carried out to find an alternative way to continue and follow Moore's law, the so-called "beyond CMOS" technology. There have been several proposals for physically implementing QCA; EQCA and MQCA are the two most important QCAs reported so far. This paper provides a comparative study of these two QCAs.
The Future of the Space Age or how to Evaluate Innovative Ideas
NASA Astrophysics Data System (ADS)
Vollerthun, A.; Fricke, E.
2002-05-01
Based on an initiative of the German Aerospace Industry Association to foster more transparent and structured funding of German commercially oriented space projects, a three-phase approach is suggested in this paper to stepwise improve and evaluate proposed concepts for space-related innovations. The objective of this concept was to develop a transparent, structured, and reproducible process to select the right innovative project, in terms of political, economic, and technical objectives, for funding by e.g. a governmental agency. A stepwise process and related methods that cover technical as well as economic aspects (and related sensitivities) are proposed. Based on the special needs and requirements of the space industry, the proposals are compared to a set of predefined top-level objectives/requirements. Using an initial trades analysis with the criteria company, technology, product, and market, an initial business case is analyzed. In the third process step, the alternative innovative concepts are subjected to a very detailed analysis. The full economic and technical scale of the projects is evaluated, and metrics such as the 'Return on Investment' or 'Break Even Point' are determined to compare the various innovations. Risks related to time, cost, and quality are considered when performing sensitivity analysis by varying the most important factors of the project. Before discussing critical aspects of the proposed process, space-related examples will be presented to show how the process could be applied and how different concepts should be evaluated.
A comparative concept analysis of centring vs. opening meditation processes in health care.
Birx, Ellen
2013-08-01
To report an analysis and comparison of the concepts centring and opening meditation processes in health care. Centring and opening meditation processes are included in nursing theories and frequently recommended in health care for stress management. These meditation processes are integrated into emerging psychotherapy approaches and there is a rapidly expanding body of neuroscience research distinguishing brain activity associated with different types of meditation. Currently, there is a lack of theoretical and conceptual clarity needed to guide meditation research in health care. A search of healthcare literature between 2006-2011 was conducted using Alt HealthWatch, CINAHL, PsychNET and PubMed databases using the keywords 'centring' and 'opening' alone and in combination with the term 'meditation.' For the concept centring, 10 articles and 11 books and for the concept opening 13 articles and 10 books were included as data sources. Rodgers' evolutionary method of concept analysis was used. Centring and opening are similar in that they both involve awareness in the present moment; both use a gentle, effortless approach; and both have a calming effect. Key differences include centring's focus on the individual's inner experience compared with the non-dual, spacious awareness of opening. Centring and opening are overlapping, yet distinct meditation processes. The term meditation cannot be used in a generic way in health care. The differences between centring and opening have important implications for the further development of unitary-transformative nursing theories. © 2012 Blackwell Publishing Ltd.
Wilk, N; Wierzbicka, N; Skrzekowska-Baran, I; Moćko, P; Tomassy, J; Kloc, K
2017-04-01
The aim of this study was to identify the relationship and impact between Real World Evidence (RWE) and experimental evidence (EE) in Polish decision-making processes for the drugs from selected Anatomical Therapeutic Chemical (ATC) groups. Descriptive study. A detailed analysis was performed for 58 processes from five ATC code groups in which RWE for effectiveness, or effectiveness and safety were cited in Agency for Health Technology Assessment and Tariff System's (AOTMiT) documents published between January 2012 and September 2015: Verification Analysis of AOTMiT, Statement of the Transparency Council of AOTMiT, and Recommendation of the President of AOTMiT. In 62% of the cases, RWE supported the EE and confirmed its main conclusions. The majority of studies in the EE group showed to be RCTs (97%), and the RWE group included mainly cohort studies (89%). There were more studies without a control group within RWE compared with the EE group (10% vs 1%). Our results showed that EE are more often assessed using Jadad, NICE or NOS scale by AOTMiT compared with RWE (93% vs 48%). When the best evidence within a given decision-making process is analysed, half of RWE and two-thirds of EE are considered high quality evidence. RWE plays an important role in the decision-making processes on public funding of drugs in Poland, contributing to nearly half (45%) of all the evidence considered. There exist such processes in which the proportion of RWE is dominant, with one process showing RWE as the only evidence presented. Copyright © 2016 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
Isoni, V; Kumbang, D; Sharratt, P N; Khoo, H H
2018-05-15
Aligned with Singapore's commitment to sustainable development and investment in renewable resources, cleaner energy and technology (Sustainable Singapore Blueprint), we report a techno-economic analysis of the biorefinery process in Southeast Asia. The considerations in this study provide an overview of the current and future challenges in biomass-to-chemical processes with life-cycle thinking, linking the land used for agriculture and biomass to levulinic acid production. 7-8 kg of lignocellulosic feedstock (glucan content 30-35 wt%) from the agricultural residues empty fruit bunches (EFB) or rice straw (RS) can be processed to yield 1 kg of levulinic acid. Comparisons of both traditional and "green" alternative solvents and separation techniques for the chemical process were modelled and their relative energy profiles evaluated. Using 2-methyltetrahydrofuran (2-MeTHF) as the process solvent showed approximately 20-fold lower energy demand compared to methyl isobutyl ketone (MIBK) and approximately 180-fold lower energy demand compared to direct distillation from the aqueous stream. Greenhouse gas emissions of the major operations throughout the supply chain (energy and solvent use, transport, field emissions) were estimated and compared against the impact of deforestation to make space for agricultural purposes. A biorefinery process for the production of 20 ktonne/year of levulinic acid from two different types of lignocellulosic feedstock was hypothesized for different scenarios. In one scenario the chemical plant producing levulinic acid was located in Singapore, whereas in the other scenarios it was placed in a neighboring country, closer to the biomass source. Results from this study show the importance of feedstock choices, as well as the associated plant locations, in the quest for sustainability objectives. Copyright © 2018 Elsevier Ltd. All rights reserved.
Network meta-analysis: an introduction for clinicians.
Rouse, Benjamin; Chaimani, Anna; Li, Tianjing
2017-02-01
Network meta-analysis is a technique for comparing multiple treatments simultaneously in a single analysis by combining direct and indirect evidence within a network of randomized controlled trials. Network meta-analysis may assist assessing the comparative effectiveness of different treatments regularly used in clinical practice and, therefore, has become attractive among clinicians. However, if proper caution is not taken in conducting and interpreting network meta-analysis, inferences might be biased. The aim of this paper is to illustrate the process of network meta-analysis with the aid of a working example on first-line medical treatment for primary open-angle glaucoma. We discuss the key assumption of network meta-analysis, as well as the unique considerations for developing appropriate research questions, conducting the literature search, abstracting data, performing qualitative and quantitative synthesis, presenting results, drawing conclusions, and reporting the findings in a network meta-analysis.
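A worked example of the indirect-evidence step that network meta-analysis builds on: the Bucher adjusted indirect comparison of treatments A and B through a common comparator C. The effect estimates below are illustrative numbers, not data from the glaucoma example:

    import math

    # Direct log-odds-ratio estimates against a common comparator C (illustrative).
    d_AC, se_AC = -0.30, 0.12   # treatment A vs C
    d_BC, se_BC = -0.10, 0.15   # treatment B vs C

    # Bucher adjusted indirect comparison of A vs B through C:
    d_AB = d_AC - d_BC
    se_AB = math.sqrt(se_AC**2 + se_BC**2)
    ci = (d_AB - 1.96 * se_AB, d_AB + 1.96 * se_AB)
    print(f"A vs B (indirect): {d_AB:.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")

The widened standard error shows why indirect evidence is weaker than direct evidence, and why the transitivity assumption discussed in the paper is critical.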
Joda, Tim; Brägger, Urs
2015-01-01
To compare time-efficiency in the production of implant crowns using a digital workflow versus the conventional pathway. This prospective clinical study used a crossover design that included 20 study participants receiving single-tooth replacements in posterior sites. Each patient received a customized titanium abutment plus a computer-aided design/computer-assisted manufacture (CAD/CAM) zirconia suprastructure (for those in the test group, using the digital workflow) and a standardized titanium abutment plus a porcelain-fused-to-metal crown (for those in the control group, using the conventional pathway). The start of the implant prosthetic treatment was established as the baseline. Time-efficiency analysis was defined as the primary outcome, and was measured for every single clinical and laboratory work step in minutes. Statistical analysis was performed with the Wilcoxon rank sum test. All crowns could be provided within two clinical appointments, independent of the manufacturing process. The mean total production time, as the sum of clinical plus laboratory work steps, was significantly different. The mean ± standard deviation (SD) time was 185.4 ± 17.9 minutes for the digital workflow process and 223.0 ± 26.2 minutes for the conventional pathway (P = .0001). Therefore, digital processing for overall treatment was 16% faster. Detailed analysis for the clinical treatment revealed a significantly reduced mean ± SD chair time of 27.3 ± 3.4 minutes for the test group compared with 33.2 ± 4.9 minutes for the control group (P = .0001). Similar results were found for the mean laboratory work time, with a significant decrease of 158.1 ± 17.2 minutes for the test group vs 189.8 ± 25.3 minutes for the control group (P = .0001). Only a few studies have investigated efficiency parameters of digital workflows compared with conventional pathways in implant dental medicine. This investigation shows that the digital workflow seems to be more time-efficient than the established conventional production pathway for fixed implant-supported crowns. Both clinical chair time and laboratory manufacturing steps could be effectively shortened with the digital process of intraoral scanning plus CAD/CAM technology.
NASA Astrophysics Data System (ADS)
Juszczyk, Michał; Leśniak, Agnieszka; Zima, Krzysztof
2013-06-01
Conceptual cost estimation is important for construction projects. Either underestimation or overestimation of the cost of raising a building may lead to the failure of a project. In the paper, the authors present an application of a multicriteria comparative analysis (MCA) in order to select factors influencing the cost of raising a residential building. The aim of the analysis is to indicate key factors useful in conceptual cost estimation in the early design stage. Key factors are investigated on the basis of elementary information about the function, form and structure of the building, and the primary assumptions of the technological and organizational solutions applied in the construction process. The mentioned factors are considered as variables of a model whose aim is to make fast conceptual cost estimation possible with satisfying accuracy. The whole analysis included three steps: preliminary research, choice of a set of potential variables, and reduction of this set to select the final set of variables. Multicriteria comparative analysis is applied in the problem solution. The performed analysis allowed the selection of a group of factors, defined well enough at the conceptual stage of the design process, to be used as describing variables of the model.
NASA Astrophysics Data System (ADS)
Yang, Wenxiu; Liu, Yanbo; Zhang, Ligai; Cao, Hong; Wang, Yang; Yao, Jinbo
2016-06-01
Needleless electrospinning technology is considered a better avenue to produce nanofibrous materials at large scale, and electric field intensity and its distribution play an important role in controlling nanofiber diameter and the quality of the nanofibrous web during electrospinning. In the current study, a novel needleless electrospinning method was proposed based on Von Koch curves of fractal configuration; simulation and analysis of electric field intensity and distribution in the new electrospinning process were performed with the finite element analysis software Comsol Multiphysics 4.4, based on linear and nonlinear Von Koch fractal curves (hereafter called fractal models). The results of the simulation and analysis indicated that the second-level fractal structure is the optimal linear electrospinning spinneret in terms of field intensity and uniformity. Further simulation and analysis showed that the circular type of fractal spinneret has better field intensity and distribution than the spiral type in nonlinear fractal electrospinning technology. An electrospinning apparatus with the optimal Von Koch fractal spinneret was set up to verify the theoretical analysis results from the Comsol simulation, achieving more uniform electric field distribution and lower energy cost compared to current needle and needleless electrospinning technologies.
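A hedged sketch of the geometric building block only (not the Comsol field simulation): recursively generating the vertices of a Von Koch curve at a chosen fractal level, e.g. the second-level structure found optimal above:

    import numpy as np

    def koch(p1, p2, level):
        """Return the vertices of a Von Koch curve between p1 and p2."""
        if level == 0:
            return [p1, p2]
        p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
        d = (p2 - p1) / 3.0
        a, b = p1 + d, p1 + 2 * d
        rot = np.array([[0.5, -np.sqrt(3) / 2],
                        [np.sqrt(3) / 2, 0.5]])  # rotate by +60 degrees
        peak = a + rot @ d                       # apex of the equilateral bump
        pts = []
        for q1, q2 in [(p1, a), (a, peak), (peak, b), (b, p2)]:
            pts.extend(koch(q1, q2, level - 1)[:-1])
        pts.append(p2)
        return pts

    vertices = koch((0.0, 0.0), (1.0, 0.0), level=2)  # second-level curve
    print(len(vertices), "vertices")                  # 4^2 segments -> 17 vertices

The vertex list can then serve as the spinneret profile geometry imported into a field solver.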
Hoekman, Berend; Breitling, Rainer; Suits, Frank; Bischoff, Rainer; Horvatovich, Peter
2012-01-01
Data processing forms an integral part of biomarker discovery and contributes significantly to the ultimate result. To compare and evaluate various publicly available open source label-free data processing workflows, we developed msCompare, a modular framework that allows the arbitrary combination of different feature detection/quantification and alignment/matching algorithms in conjunction with a novel scoring method to evaluate their overall performance. We used msCompare to assess the performance of workflows built from modules of publicly available data processing packages such as SuperHirn, OpenMS, and MZmine and our in-house developed modules on peptide-spiked urine and trypsin-digested cerebrospinal fluid (CSF) samples. We found that the quality of results varied greatly among workflows, and interestingly, heterogeneous combinations of algorithms often performed better than the homogenous workflows. Our scoring method showed that the union of feature matrices of different workflows outperformed the original homogenous workflows in some cases. msCompare is open source software (https://trac.nbic.nl/mscompare), and we provide a web-based data processing service for our framework by integration into the Galaxy server of the Netherlands Bioinformatics Center (http://galaxy.nbic.nl/galaxy) to allow scientists to determine which combination of modules provides the most accurate processing for their particular LC-MS data sets. PMID:22318370
NASA Astrophysics Data System (ADS)
Psyk, Verena; Scheffler, Christian; Linnemann, Maik; Landgrebe, Dirk
2017-10-01
Compared to conventional joining techniques, electromagnetic pulse welding offers important advantages, especially for dissimilar-material connections such as copper-aluminum welds. However, due to missing guidelines and tools for process design, the process has not yet been widely implemented in industrial production. In order to contribute to overcoming this obstacle, a combined numerical and experimental process analysis for electromagnetic pulse welding of Cu-DHP and EN AW-1050 was carried out, and the results were consolidated in a quantitative, collision-parameter-based process window.
Numerical Simulation of Cast Distortion in Gas Turbine Engine Components
NASA Astrophysics Data System (ADS)
Inozemtsev, A. A.; Dubrovskaya, A. S.; Dongauser, K. A.; Trufanov, N. A.
2015-06-01
In this paper the process of manufacturing multiple airfoil vanes through investment casting is considered. A mathematical model of the full contact problem is built to determine the stress-strain state in a cast during the process of solidification. Studies are carried out in a viscoelastoplastic formulation. Numerical simulation of the explored process is implemented with the ProCAST software package. The results of the simulation are compared with the real production process. By means of computer analysis, the technical process parameters are optimized in order to eliminate the defect of cast wall thickness variation.
Reduced order model based on principal component analysis for process simulation and optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lang, Y.; Malacina, A.; Biegler, L.
2009-01-01
It is well-known that distributed parameter computational fluid dynamics (CFD) models provide more accurate results than conventional, lumped-parameter unit operation models used in process simulation. Consequently, the use of CFD models in process/equipment co-simulation offers the potential to optimize overall plant performance with respect to complex thermal and fluid flow phenomena. Because solving CFD models is time-consuming compared to the overall process simulation, we consider the development of fast reduced order models (ROMs) based on CFD results to closely approximate the high-fidelity equipment models in the co-simulation. By considering process equipment items with complicated geometries and detailed thermodynamic property models, this study proposes a strategy to develop ROMs based on principal component analysis (PCA). Taking advantage of commercial process simulation and CFD software (for example, Aspen Plus and FLUENT), we are able to develop systematic CFD-based ROMs for equipment models in an efficient manner. In particular, we show that the validity of the ROM is more robust within a well-sampled input domain and the CPU time is significantly reduced. Typically, it takes at most several CPU seconds to evaluate the ROM compared to several CPU hours or more to solve the CFD model. Two case studies, involving two power plant equipment examples, are described and demonstrate the benefits of using our proposed ROM methodology for process simulation and optimization.
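A minimal sketch of the PCA-based ROM idea under stated assumptions (a synthetic snapshot matrix in place of CFD runs, linear regression from inputs to principal-component scores, and an illustrative truncation level):

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(6)
    # Rows: CFD runs sampled over the input domain; columns: field values at mesh nodes.
    inputs = rng.uniform(size=(40, 3))  # e.g. flow rate, temperature, composition
    snapshots = (np.sin(inputs @ rng.normal(size=(3, 2000)))
                 + 0.01 * rng.normal(size=(40, 2000)))

    pca = PCA(n_components=5).fit(snapshots)      # truncation level is illustrative
    scores = pca.transform(snapshots)
    rom = LinearRegression().fit(inputs, scores)  # cheap map: inputs -> PC scores

    x_new = rng.uniform(size=(1, 3))
    field_pred = pca.inverse_transform(rom.predict(x_new))  # ROM field, milliseconds
    print(field_pred.shape)

Evaluating the regression plus the inverse PCA transform replaces a full CFD solve inside the co-simulation loop, which is the source of the reported speedup.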
NASA Astrophysics Data System (ADS)
Jolivet, S.; Mezghani, S.; El Mansori, M.
2016-09-01
The replication of topography has generally been restricted to optimizing material processing technologies in terms of statistical, single-scale features such as roughness. By contrast, manufactured surface topography is highly complex, irregular, and multiscale. In this work, we have demonstrated the use of multiscale analysis on replicates of surface finish to assess the precise control of the finished replica. Five commercial resins used for surface replication were compared. The topography of five standard surfaces representative of common finishing processes was acquired both directly and by a replication technique. The topographies were then characterized using the ISO 25178 standard and a multiscale decomposition based on a continuous wavelet transform, to compare the roughness transfer quality at different scales. Additionally, the atomic force microscope force modulation mode was used to compare the resins' stiffness properties. The results showed that less stiff resins are able to replicate the surface finish over a larger wavelength band. The method was then tested for non-destructive quality control of automotive gear tooth surfaces.
Nonstationary Dynamics Data Analysis with Wavelet-SVD Filtering
NASA Technical Reports Server (NTRS)
Brenner, Marty; Groutage, Dale; Bessette, Denis (Technical Monitor)
2001-01-01
Nonstationary time-frequency analysis is used for identification and classification of aeroelastic and aeroservoelastic dynamics. Time-frequency multiscale wavelet processing generates discrete energy density distributions. The distributions are processed using the singular value decomposition (SVD). Discrete density functions derived from the SVD generate moments that detect the principal features in the data. The SVD standard basis vectors are applied and then compared with a transformed-SVD, or TSVD, which reduces the number of features into more compact energy density concentrations. Finally, from the feature extraction, wavelet-based modal parameter estimation is applied.
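A minimal sketch of the wavelet-SVD idea follows, assuming a toy signal and the Morlet wavelet; it is illustrative, not the NASA processing chain. A continuous wavelet transform gives a discrete energy density over scale and time, the SVD factors it, and a discrete density over the singular values yields simple moment features.

```python
import numpy as np
import pywt

fs = 500.0
t = np.arange(0, 4, 1 / fs)
# toy "aeroelastic" record: a decaying 20 Hz mode plus a brief 45 Hz burst
x = np.exp(-0.5 * t) * np.sin(2 * np.pi * 20 * t)
x[1000:1400] += 0.5 * np.sin(2 * np.pi * 45 * t[1000:1400])

scales = np.arange(2, 64)
coef, freqs = pywt.cwt(x, scales, "morl", sampling_period=1 / fs)
energy = np.abs(coef) ** 2                    # discrete energy density

U, s, Vt = np.linalg.svd(energy, full_matrices=False)
density = s**2 / np.sum(s**2)                 # discrete density from the SVD
moments = [np.sum(density * np.arange(1, s.size + 1) ** k) for k in (1, 2)]

print("dominant frequency [Hz]:", freqs[np.argmax(np.abs(U[:, 0]))])
print("singular-value moments:", np.round(moments, 3))
```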
NASA Astrophysics Data System (ADS)
Sawicki, J.; Siedlaczek, P.; Staszczyk, A.
2018-03-01
A numerical three-dimensional model for computing the residual stresses generated in the cross section of 42CrMo4 steel after nitriding is presented. The diffusion process is analyzed by the finite-element method. The internal stresses are computed using the obtained profile of the nitrogen concentration distribution. The special features of the intricate geometry of the treated articles, including edges and angles, are considered. A comparative analysis of the simulation results and of the experimental measurement of residual stresses by the Waisman-Philips method is performed.
Deverka, Patricia A; Lavallee, Danielle C; Desai, Priyanka J; Armstrong, Joanne; Gorman, Mark; Hole-Curry, Leah; O'Leary, James; Ruffner, B W; Watkins, John; Veenstra, David L; Baker, Laurence H; Unger, Joseph M; Ramsey, Scott D
2012-07-01
The Center for Comparative Effectiveness Research in Cancer Genomics completed a 2-year stakeholder-guided process for the prioritization of genomic tests for comparative effectiveness research studies. We sought to evaluate the effectiveness of engagement procedures in achieving project goals and to identify opportunities for future improvements. The evaluation included an online questionnaire, one-on-one telephone interviews and facilitated discussion. Responses to the online questionnaire were tabulated for descriptive purposes, while transcripts from key informant interviews were analyzed using a directed content analysis approach. A total of 11 out of 13 stakeholders completed both the online questionnaire and interview process, while nine participated in the facilitated discussion. Eighty-nine percent of questionnaire items received overall ratings of agree or strongly agree; 11% of responses were rated as neutral with the exception of a single rating of disagreement with an item regarding the clarity of how stakeholder input was incorporated into project decisions. Recommendations for future improvement included developing standard recruitment practices, role descriptions and processes for improved communication with clinical and comparative effectiveness research investigators. Evaluation of the stakeholder engagement process provided constructive feedback for future improvements and should be routinely conducted to ensure maximal effectiveness of stakeholder involvement.
Pre-Service Science and Primary School Teachers' Identification of Scientific Process Skills
ERIC Educational Resources Information Center
Birinci Konur, Kader; Yildirim, Nagihan
2016-01-01
The purpose of this study was to conduct a comparative analysis of pre-service primary school and science teachers' identification of scientific process skills. The study employed the survey method, and the sample included 95 pre-service science teachers and 95 pre-service primary school teachers from the Faculty of Education at Recep Tayyip…
ERIC Educational Resources Information Center
DiClemente, Carlo C.; And Others
1991-01-01
Tested transtheoretical model of change positing stages through which smokers move as they successfully change their smoking habit. Subjects in precontemplation (n=166), contemplation (n=794), and preparation (n=506) stages were compared on smoking history, 10 processes of change, pretest self-efficacy, decisional balance, and one- and six-month…
ERIC Educational Resources Information Center
Bloom, Robert; And Others
A study of the processes for establishing the principles and policies of measurement and disclosure in preparing financial reports examines differences in these processes in the United States, Canada, and England. Information was drawn from international accounting literature on standard setting. The differences and similarities in the…
A community in the wildland-urban interface
María Cecilia Ciampoli Halaman
2013-01-01
Communities located in the wildland-urban interface undergo a process of transformation until they can guard against fires occurring in the area. This study analyzed this process for the Estación neighborhood in the city of Esquel, Chubut Province, Argentina. The analysis was performed by comparing the level of danger diagnosed for each neighborhood home in 2004 with...
Scott, Felipe; Aroca, Germán; Caballero, José Antonio; Conejeros, Raúl
2017-07-01
The aim of this study is to analyze the techno-economic performance of process configurations for ethanol production involving solid-liquid separators and reactors in the saccharification and fermentation stage, a family of process configurations for which few alternatives have been proposed. Since including these process alternatives creates a large number of possible process configurations, a framework for process synthesis and optimization is proposed. This approach is supported by kinetic models fed with experimental data and a plant-wide techno-economic model. Among 150 process configurations, 40 show an improved minimum ethanol selling price (MESP) compared to a well-documented base case (BC); almost all include solid separators, and some show energy retrieved in products 32% higher compared to the BC. Moreover, 16 of them also show a lower capital investment per unit of ethanol produced per year. Several of the process configurations found in this work have not been reported in the literature. Copyright © 2017 Elsevier Ltd. All rights reserved.
Cope, Shannon; Zhang, Jie; Saletan, Stephen; Smiechowski, Brielan; Jansen, Jeroen P; Schmid, Peter
2014-06-05
The aim of this study is to outline a general process for assessing the feasibility of performing a valid network meta-analysis (NMA) of randomized controlled trials (RCTs) to synthesize direct and indirect evidence for alternative treatments for a specific disease population. Several steps to assess the feasibility of an NMA are proposed based on existing recommendations. Next, a case study is used to illustrate this NMA feasibility assessment process in order to compare everolimus in combination with hormonal therapy to alternative chemotherapies in terms of progression-free survival for women with advanced breast cancer. A general process for assessing the feasibility of an NMA is outlined that incorporates explicit steps to visualize the heterogeneity in terms of treatment and outcome characteristics (Part A) as well as the study and patient characteristics (Part B). Additionally, steps are performed to illustrate differences within and across different types of direct comparisons in terms of baseline risk (Part C) and observed treatment effects (Part D) since there is a risk that the treatment effect modifiers identified may not explain the observed heterogeneity or inconsistency in the results due to unexpected, unreported or unmeasured differences. Depending on the data available, alternative approaches are suggested: list assumptions, perform a meta-regression analysis, subgroup analysis, sensitivity analyses, or summarize why an NMA is not feasible. The process outlined to assess the feasibility of an NMA provides a stepwise framework that will help to ensure that the underlying assumptions are systematically explored and that the risks (and benefits) of pooling and indirectly comparing treatment effects from RCTs for a particular research question are transparent.
All-inkjet-printed thin-film transistors: manufacturing process reliability by root cause analysis.
Sowade, Enrico; Ramon, Eloi; Mitra, Kalyan Yoti; Martínez-Domingo, Carme; Pedró, Marta; Pallarès, Jofre; Loffredo, Fausta; Villani, Fulvia; Gomes, Henrique L; Terés, Lluís; Baumann, Reinhard R
2016-09-21
We report on the detailed electrical investigation of all-inkjet-printed thin-film transistor (TFT) arrays, focusing on TFT failures and their origins. The TFT arrays were manufactured on flexible polymer substrates under ambient conditions, without the need for a cleanroom environment or inert atmosphere, and at a maximum temperature of 150 °C. Alternative manufacturing processes for electronic devices such as inkjet printing suffer from lower accuracy compared to traditional microelectronic manufacturing methods. Furthermore, printing methods usually do not allow the manufacturing of electronic devices with high yield (a high number of functional devices); in general, the manufacturing yield is much lower than that of the established conventional manufacturing methods based on lithography. Thus, the focus of this contribution is a comprehensive analysis of defective TFTs printed by inkjet technology. Based on root cause analysis, we present the defects by developing failure categories and discuss the reasons for the defects. This procedure identifies failure origins and allows the optimization of the manufacturing, finally resulting in a yield improvement.
Imaging of breast cancer with mid- and long-wave infrared camera.
Joro, R; Lääperi, A-L; Dastidar, P; Soimakallio, S; Kuukasjärvi, T; Toivonen, T; Saaristo, R; Järvenpää, R
2008-01-01
In this novel study, the breasts of 15 women with palpable breast cancer were preoperatively imaged with three technically different infrared (IR) cameras - microbolometer (MB), quantum well (QWIP), and photovoltaic (PV) - to compare their ability to differentiate breast cancer from normal tissue. The IR images were processed; data for frequency analysis were collected from dynamic IR images by pixel-based analysis, and selectively windowed regional analysis was carried out on each image, exploiting the fact that the angiogenesis and nitric oxide production of cancer tissue cause vasomotor and cardiogenic frequency differences compared to normal tissue. Our results show that the GaAs QWIP camera and the InSb PV camera demonstrate the frequency difference between normal and cancerous breast tissue, the PV camera more clearly. With selected image processing operations, more detailed frequency analyses could be applied to the suspicious area. The MB camera was not suitable for tissue differentiation, as the difference between noise and effective signal was unsatisfactory.
Murphy, Karagh; James, Logan S; Sakata, Jon T; Prather, Jonathan F
2017-08-01
Sensorimotor integration is the process through which the nervous system creates a link between motor commands and associated sensory feedback. This process allows for the acquisition and refinement of many behaviors, including learned communication behaviors such as speech and birdsong. Consequently, it is important to understand fundamental mechanisms of sensorimotor integration, and comparative analyses of this process can provide vital insight. Songbirds offer a powerful comparative model system to study how the nervous system links motor and sensory information for learning and control. This is because the acquisition, maintenance, and control of birdsong critically depend on sensory feedback. Furthermore, there is an incredible diversity of song organizations across songbird species, ranging from songs with simple, stereotyped sequences to songs with complex sequencing of vocal gestures, as well as a wide diversity of song repertoire sizes. Despite this diversity, the neural circuitry for song learning, control, and maintenance remains highly similar across species. Here, we highlight the utility of songbirds for the analysis of sensorimotor integration and the insights about mechanisms of sensorimotor integration gained by comparing different songbird species. Key conclusions from this comparative analysis are that variation in song sequence complexity seems to covary with the strength of feedback signals in sensorimotor circuits and that sensorimotor circuits contain distinct representations of elements in the vocal repertoire, possibly enabling evolutionary variation in repertoire sizes. We conclude our review by highlighting important areas of research that could benefit from increased comparative focus, with particular emphasis on the integration of new technologies. Copyright © 2017 the American Physiological Society.
36 CFR 220.7 - Environmental assessment and decision notice.
Code of Federal Regulations, 2013 CFR
2013-07-01
... modifications and incremental design features developed through the analysis process to develop the alternatives... proposed action and any alternatives together in a comparative description or describe the impacts of each...
[Scientific connotation of processing Bombyx Batryticatus under high temperature].
Ma, Li; Wang, Xuan; Ma, Lin; Wang, Man-yuan; Qiu, Feng
2015-12-01
The aim of this study was to elucidate the scientific rationale for processing Bombyx Batryticatus with wheat bran at high temperature. The soluble protein contents extracted from Bombyx Batryticatus and its processed products, as well as the limit-relevant aflatoxin (AFT) contents of both, were compared. Protein concentration was measured with the Bradford method, and differences in proteins between Bombyx Batryticatus and its processed products were compared by SDS-PAGE analysis. Aflatoxins B1, B2, G1, and G2 were determined by reversed-phase HPLC. The results showed that the soluble protein contents of Bombyx Batryticatus and its processed products were (47.065 +/- 0.249) and (29.756 +/- 1.961) mg x g(-1), respectively. Protein gel electrophoresis showed no significant differences in protein varieties between the crude and processed material; 6 bands were detected: 31.90, 26.80, 18.71, 15.00, 10.18, and 8.929 kDa. Below 10 kDa, the bands of the processed product were deeper in color than those of the crude one, demonstrating that macromolecular proteins were degraded into smaller molecules. The contents of AFG1, AFB1, AFG2, and AFB2 were 0.382, 0.207, 0.223, and 0.073 microg x kg(-1), respectively, none exceeding 5 microg x kg(-1), while no aflatoxins were detected in the processed product. Through processing with wheat bran at high temperature, the soluble protein content of Bombyx Batryticatus decreased, achieving the processing purpose of alleviating the drug property. Meanwhile, the aflatoxin contents were reduced or removed by the processing procedure or absorbed by the processing auxiliary material, adding to the safety of the traditional Chinese medicine. In conclusion, bran-frying of Bombyx Batryticatus is a scientific and reasonable traditional processing method.
Boguta, Patrycja; Pieczywek, Piotr M.; Sokołowska, Zofia
2016-01-01
The main aim of this study was the application of excitation-emission fluorescence matrices (EEMs) combined with two decomposition methods, parallel factor analysis (PARAFAC) and nonnegative matrix factorization (NMF), to study the interaction mechanisms between humic acids (HAs) and Zn(II) over a wide concentration range (0–50 mg·dm⁻³). The influence of HA properties on Zn(II) complexation was also investigated. Stability constants, quenching degree, and complexation capacity were estimated for binding sites found in raw EEM, EEM-PARAFAC, and EEM-NMF data using mathematical models. Combining EEM fluorescence analysis with one of the proposed decomposition methods enabled separation of overlapping binding sites and yielded more accurate calculations of the binding parameters. PARAFAC and NMF processing revealed binding sites invisible in some raw EEM datasets, as well as totally new maxima attributed to structures of the lowest humification. Decomposed data showed an increase in Zn complexation with increasing humification, aromaticity, and molecular weight of the HAs. EEM-PARAFAC analysis also revealed that the most stable compounds were formed by structures containing the highest amounts of nitrogen. The content of oxygen functional groups did not influence the binding parameters, mainly due to the stronger competition of the metal cation with protons. EEM spectra coupled with NMF and especially PARAFAC processing gave more adequate assessments of the interactions than raw EEM data and should be especially recommended for modeling of complexation processes where the fluorescence intensity (FI) changes are weak or where the processes are interfered with by the presence of other fluorophores. PMID:27782078
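To make the decomposition step concrete, the sketch below unfolds a stack of synthetic EEMs into a samples-by-pixels matrix and separates two overlapping fluorophores with NMF. True PARAFAC requires trilinear tensor tools; plain NMF on unfolded data is used here only to illustrate the idea, and all peak positions, quenching constants, and the two-component choice are invented.

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(1)
n_samples, n_ex, n_em = 20, 30, 40

def gauss(n, mu, sig):
    i = np.arange(n)
    return np.exp(-0.5 * ((i - mu) / sig) ** 2)

# two synthetic fluorophores (excitation x emission outer products)
comps = np.array([np.outer(gauss(n_ex, 8, 3), gauss(n_em, 12, 5)),
                  np.outer(gauss(n_ex, 20, 4), gauss(n_em, 28, 6))])
zn = np.linspace(0.0, 50.0, n_samples)            # quencher level, mg dm^-3
conc = np.column_stack([1 / (1 + 0.05 * zn),      # weakly quenched site
                        1 / (1 + 0.15 * zn)])     # strongly quenched site
eems = (np.einsum("sk,kij->sij", conc, comps)
        + 0.01 * rng.random((n_samples, n_ex, n_em)))

model = NMF(n_components=2, init="nndsvda", max_iter=500)
scores = model.fit_transform(eems.reshape(n_samples, -1))  # per-sample loads

# quenching view per separated component: F0/F rises with Zn if quenched
print(np.round(scores[0] / scores[-1], 2))
```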
Bonnabry, P; Cingria, L; Sadeghipour, F; Ing, H; Fonzo-Christe, C; Pfister, R
2005-01-01
Background: Until recently, the preparation of paediatric parenteral nutrition formulations in our institution included re-transcription and manual compounding of the mixture. Although no significant clinical problems have occurred, re-engineering of this high risk activity was undertaken to improve its safety. Several changes have been implemented including new prescription software, direct recording on a server, automatic printing of the labels, and creation of a file used to pilot a BAXA MM 12 automatic compounder. The objectives of this study were to compare the risks associated with the old and new processes, to quantify the improved safety with the new process, and to identify the major residual risks. Methods: A failure modes, effects, and criticality analysis (FMECA) was performed by a multidisciplinary team. A cause-effect diagram was built, the failure modes were defined, and the criticality index (CI) was determined for each of them on the basis of the likelihood of occurrence, the severity of the potential effect, and the detection probability. The CIs for each failure mode were compared for the old and new processes and the risk reduction was quantified. Results: The sum of the CIs of all 18 identified failure modes was 3415 for the old process and 1397 for the new (reduction of 59%). The new process reduced the CIs of the different failure modes by a mean factor of 7. The CI was smaller with the new process for 15 failure modes, unchanged for two, and slightly increased for one. The greatest reduction (by a factor of 36) concerned re-transcription errors, followed by readability problems (by a factor of 30) and chemical cross contamination (by a factor of 10). The most critical steps in the new process were labelling mistakes (CI 315, maximum 810), failure to detect a dosage or product mistake (CI 288), failure to detect a typing error during the prescription (CI 175), and microbial contamination (CI 126). Conclusions: Modification of the process resulted in a significant risk reduction as shown by risk analysis. Residual failure opportunities were also quantified, allowing additional actions to be taken to reduce the risk of labelling mistakes. This study illustrates the usefulness of prospective risk analysis methods in healthcare processes. More systematic use of risk analysis is needed to guide continuous safety improvement of high risk activities. PMID:15805453
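The FMECA arithmetic is easy to reproduce. The toy sketch below computes a criticality index CI = occurrence x severity x detection for a few failure modes and compares an "old" and a "new" process; the mode names echo the study, but every score is an invented placeholder.

```python
# CI = occurrence x severity x detection for each failure mode,
# old process vs new process; all scores are invented placeholders.
failure_modes = {
    # name:                        old (occ, sev, det)  new (occ, sev, det)
    "re-transcription error":      ((9, 8, 10),         (1, 8, 2)),
    "labelling mistake":           ((5, 9, 7),          (5, 9, 7)),
    "chemical cross-contamination": ((5, 8, 5),         (2, 5, 2)),
}

def ci(scores):
    occurrence, severity, detection = scores
    return occurrence * severity * detection

old_total = sum(ci(old) for old, _ in failure_modes.values())
new_total = sum(ci(new) for _, new in failure_modes.values())
for name, (old, new) in failure_modes.items():
    print(f"{name:30s} CI {ci(old):4d} -> {ci(new):4d}")
print(f"total CI {old_total} -> {new_total} "
      f"({100 * (old_total - new_total) / old_total:.0f}% reduction)")
```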
Probabilistic fault tree analysis of a radiation treatment system.
Ekaette, Edidiong; Lee, Robert C; Cooke, David L; Iftody, Sandra; Craighead, Peter
2007-12-01
Inappropriate administration of radiation for cancer treatment can result in severe consequences such as premature death or appreciably impaired quality of life. There has been little study of vulnerable treatment process components and their contribution to the risk of radiation treatment (RT). In this article, we describe the application of probabilistic fault tree methods to assess the probability of radiation misadministration to patients at a large cancer treatment center. We conducted a systematic analysis of the RT process that identified four process domains: Assessment, Preparation, Treatment, and Follow-up. For the Preparation domain, we analyzed possible incident scenarios via fault trees. For each task, we also identified existing quality control measures. To populate the fault trees we used subjective probabilities from experts and compared results with incident report data. Both the fault tree and the incident report analysis revealed simulation tasks to be most prone to incidents, and the treatment prescription task to be least prone to incidents. The probability of a Preparation domain incident was estimated to be in the range of 0.1-0.7% based on incident reports, which is comparable to the mean value of 0.4% from the fault tree analysis using probabilities from the expert elicitation exercise. In conclusion, an analysis of part of the RT system using a fault tree populated with subjective probabilities from experts was useful in identifying vulnerable components of the system, and provided quantitative data for risk management.
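A minimal fault tree sketch follows, with AND/OR gates under an independence assumption; the tree structure and all probabilities are invented placeholders, chosen only so the output lands in the same order of magnitude as the reported 0.1-0.7% range.

```python
def gate_or(*ps):
    """Probability that at least one input event occurs (independence)."""
    q = 1.0
    for p in ps:
        q *= 1.0 - p
    return 1.0 - q

def gate_and(*ps):
    """Probability that all input events occur (independence)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

p_simulation = gate_or(2e-2, 1e-2)   # two ways a simulation task can slip
p_prescription = 2e-3                # least incident-prone task
p_qc_miss = 0.1                      # existing QC fails to catch a slip

p_preparation = gate_or(gate_and(p_simulation, p_qc_miss),
                        gate_and(p_prescription, p_qc_miss))
print(f"P(Preparation-domain incident) ~ {p_preparation:.2%}")
```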
Using Queue Time Predictions for Processor Allocation
1997-01-01
Muralidhar, Gautam S; Channappayya, Sumohana S; Slater, John H; Blinka, Ellen M; Bovik, Alan C; Frey, Wolfgang; Markey, Mia K
2008-11-06
Automated analysis of fluorescence microscopy images of endothelial cells labeled for actin is important for quantifying changes in the actin cytoskeleton. The current manual approach is laborious and inefficient. The goal of our work is to develop automated image analysis methods, thereby increasing cell analysis throughput. In this study, we present preliminary results on comparing different algorithms for cell segmentation and image denoising.
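As a flavor of such an algorithm comparison, the sketch below contrasts plain Otsu thresholding with total-variation denoising followed by Otsu on a synthetic fluorescence-like image, counting the segmented objects; this is an assumed setup, not the authors' methods.

```python
import numpy as np
from skimage import filters, measure, restoration

rng = np.random.default_rng(7)
img = np.zeros((128, 128))
img[30:60, 40:90] = 1.0                         # one bright "cell"
noisy = img + 0.4 * rng.standard_normal(img.shape)

candidates = {
    "otsu only": noisy,
    "tv denoise + otsu": restoration.denoise_tv_chambolle(noisy, weight=0.2),
}
for name, im in candidates.items():
    mask = im > filters.threshold_otsu(im)
    n_objects = int(measure.label(mask).max())  # noise inflates this count
    print(f"{name:18s} objects = {n_objects}")
```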
Global Gene Expression Analysis of Yeast Cells during Sake Brewing▿ †
Wu, Hong; Zheng, Xiaohong; Araki, Yoshio; Sahara, Hiroshi; Takagi, Hiroshi; Shimoi, Hitoshi
2006-01-01
During the brewing of Japanese sake, Saccharomyces cerevisiae cells produce a high concentration of ethanol compared with other ethanol fermentation methods. We analyzed the gene expression profiles of yeast cells during sake brewing using DNA microarray analysis. This analysis revealed some characteristics of yeast gene expression during sake brewing and provided a scaffold for a molecular level understanding of the sake brewing process. PMID:16997994
Thermal analysis of void cavity for heat pipe receiver under microgravity
NASA Astrophysics Data System (ADS)
Gui, Xiaohong; Song, Xiange; Nie, Baisheng
2017-04-01
Based on a theoretical analysis of the PCM (Phase Change Material) solidification process, an improved model in which the void cavity distribution tends toward the high-temperature region is established. Numerical results are compared with NASA (National Aeronautics and Space Administration) results. The analysis shows that the outer wall temperature, the melting ratio of the PCM, and the temperature gradient of the PCM canister differ greatly under different void cavity distributions. The form of the void distribution has a great effect on the phase change process. Based on simulation results with the improved void cavity distribution model, the phase change heat transfer process in the thermal storage container is analyzed. The main goal of the improved PCM canister design is to reduce the concentrated distribution of the void cavity by adding foam metal to the phase change material.
Intraoral Laser Welding (ILW): ultrastructural and mechanical analysis
NASA Astrophysics Data System (ADS)
Fornaini, Carlo; Passaretti, Francesca; Villa, Elena; Nammour, Samir
2010-05-01
The Nd:YAG laser, used since 1970 in dental laboratories to weld metals on dental prostheses, has some limits, such as large dimensions, high costs, and a fixed delivery system. Recently, the possibility was proposed of using the Nd:YAG laser device commonly utilised in the dental office to repair broken fixed, removable, and orthodontic prostheses and to weld metals directly in the mouth. The aim of this work is to evaluate, through SEM (Scanning Electron Microscopy), EDS (Energy Dispersive X-Ray Spectroscopy), and DMA (Dynamic Mechanical Analysis), the quality and mechanical strength of the welding process, comparing a device normally used in the dental laboratory with a device normally used in the dental office for oral surgery. Sixteen CoCrMo metal plates and twenty steel orthodontic wires were divided into four groups: welded without metal apposition by the laboratory laser, welded with metal apposition by the laboratory laser, welded without metal apposition by the office laser, and welded with metal apposition by the office laser. The welds were analysed by SEM, EDS, and DMA to compare the differences between the samples. SEM analysis showed that the plates welded by the office laser without apposition metal had a greater number of fissures than the other samples. EDS analysis showed a homogeneous composition of the metals in all the samples. The mechanical tests showed similar elastic behaviour of the samples, with minimal differences between the two devices. No wire broke, even under the maximum strength applied by the analyser. This study seems to demonstrate that the welding processes performed with the office Nd:YAG laser device and with the laboratory Nd:YAG laser device, analysed by SEM, EDS, and DMA, show minimal and not significant differences, although these data need to be confirmed with a greater number of samples.
Kudasova, E O; Vlasova, L F; Semenov, D E; Lushnikova, E L
2017-03-01
Morphological analysis of the subcutaneous fat was performed in rats after subcutaneous implantation of basic dental plastic materials with different hydrophobic and hydrophilic properties. It was shown that subcutaneous implantation of dental plastics with a mostly hydrophobic surface and low biocompatibility induced destructive and inflammatory processes of various intensities, sometimes with an allergic component; morphological signs of these processes persisted for 6 weeks. Modification of the basic plastics using glow-discharge plasma, enhancing their hydrophilicity and biocompatibility, significantly reduced the intensity of the destructive and inflammatory processes and ensured more rapid (within 2 weeks) repair of the destroyed tissues with the formation of a fibrous capsule around the implant.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Yunhua; Jones, Susanne B.; Biddy, Mary J.
2012-08-01
This study reports the comparison of biomass gasification based syngas-to-distillate (S2D) systems using techno-economic analysis (TEA). Three cases - a state of technology (SOT) case, a goal case, and a conventional case - were compared in terms of performance and cost. The SOT case and goal case represent technology being developed at Pacific Northwest National Laboratory for a process starting with syngas and using a single-step dual-catalyst reactor for distillate generation (S2D process). The conventional case mirrors the two-step S2D process previously utilized and reported by Mobil, using natural gas feedstock and consisting of separate syngas-to-methanol and methanol-to-gasoline (MTG) processes. Analysis of the three cases revealed that the goal case could indeed reduce fuel production cost relative to the conventional case, but that the SOT case was still more expensive than the conventional one. The SOT case suffers from low one-pass yield and high selectivity to light hydrocarbons, both of which drive up production cost. Sensitivity analysis indicated that light hydrocarbon yield, single-pass conversion efficiency, and reactor space velocity are the key factors driving the high cost of the SOT case.
ERIC Educational Resources Information Center
Haller, Max; Hadler, Markus
2006-01-01
In this paper, subjective well being, as measured by survey questions on happiness and life satisfaction, is investigated from a sociological-comparative point of view. The central thesis is that happiness and satisfaction must be understood as the outcome of an interaction process between individual characteristics and aspirations on the one…
NASA Astrophysics Data System (ADS)
Zhibo, Ren; Kai, Liu; Wei, Wu
This paper analyzed and compared the competitiveness of the steel industry across 30 provinces of China. First, we extracted data containing 16 economic indicators to reflect each province's steel industry business conditions, then used the correspondence analysis method to process the data. This yields each province's position within the domestic steel industry and its corresponding advantages. The conclusions have important reference value for each province in formulating its steel industry policy.
NASA Technical Reports Server (NTRS)
1989-01-01
An assessment is made of quantitative methods and measures for trending launch commit criteria (LCC) performance. A statistical performance trending analysis pilot study was processed and compared to STS-26 mission data. This study used four selected shuttle measurement types (solid rocket booster, external tank, space shuttle main engine, and range safety switch safe and arm device) from the five missions prior to mission 51-L. After obtaining raw data coordinates, each set of measurements was processed to obtain statistical confidence bounds and mean data profiles for each of the selected measurement types. STS-26 measurements were compared to the statistical database profiles to verify the statistical capability of assessing occurrences of data trend anomalies and abnormal time-varying operational conditions associated with data amplitude and phase shifts.
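The trending procedure can be sketched in a few lines: build mean and confidence-bound profiles from prior missions, then flag samples of a new mission that leave the band. The data, band width, and injected anomaly below are all synthetic assumptions.

```python
import numpy as np

rng = np.random.default_rng(11)
t = np.linspace(0.0, 10.0, 200)                 # time before launch (arb.)
baseline = 2.0 + np.sin(t)
prior = baseline + 0.1 * rng.standard_normal((5, t.size))  # 5 past missions

mean = prior.mean(axis=0)
band = 3.0 * prior.std(axis=0, ddof=1)          # ~3-sigma confidence bounds

new = baseline + 0.1 * rng.standard_normal(t.size)
new[120:140] += 0.8                             # injected amplitude anomaly
flags = np.abs(new - mean) > band
print("anomalous sample indices:", np.flatnonzero(flags))
```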
Techno-economic analysis of biocatalytic processes for production of alkene epoxides
DOE Office of Scientific and Technical Information (OSTI.GOV)
Borole, Abhijeet P
2007-01-01
A techno-economic analysis of two different bioprocesses was conducted: one for the conversion of propylene to propylene oxide (PO) and the other for the conversion of styrene to styrene epoxide (SO). The first process was a lipase-mediated chemo-enzymatic reaction, whereas the second was a one-step enzymatic process using chloroperoxidase. The PO produced through the chemo-enzymatic process is a racemic product, whereas the latter process (based on chloroperoxidase) produces an enantiopure product. The former process thus falls under the category of a high-volume commodity chemical (PO), whereas the latter is a low-volume, high-value product (SO). A simulation of the process was conducted using the bioprocess engineering software SuperPro Designer v6.0 (Intelligen, Inc., Scotch Plains, NJ) to determine the economic feasibility of the process. The purpose of the exercise was to compare biocatalytic processes with existing chemical processes for the production of alkene epoxides. The results show that further improvements in biocatalyst stability are needed to make these bioprocesses competitive with chemical processes.
Information theoretic analysis of linear shift-invariant edge-detection operators
NASA Astrophysics Data System (ADS)
Jiang, Bo; Rahman, Zia-ur
2012-06-01
Generally, the designs of digital image processing algorithms and image gathering devices remain separate. Consequently, the performance of digital image processing algorithms is evaluated without taking into account the influence of the image gathering process. However, experiments show that the image gathering process has a profound impact on the performance of digital image processing and the quality of the resulting images. Huck et al. proposed a definitive theoretical analysis of visual communication channels, in which the different parts, such as image gathering, processing, and display, are assessed in an integrated manner using Shannon's information theory. We perform an end-to-end information-theoretic system analysis to assess linear shift-invariant edge-detection algorithms. We evaluate the performance of the different algorithms as a function of the characteristics of the scene and of the parameters, such as sampling and additive noise, that define the image gathering system. The edge-detection algorithm is regarded as having high performance only if the information rate from the scene to the edge image approaches its maximum possible value. This goal can be achieved only by jointly optimizing all processes. Our information-theoretic assessment provides a new tool that allows us to compare different linear shift-invariant edge detectors in a common environment.
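A toy version of this assessment is sketched below: a noisy image-gathering step degrades a synthetic scene, two linear shift-invariant operators produce edge maps, and each detector is scored by the mutual information between its edge map and the true edges. The scene, noise level, and thresholding rule are assumptions, and histogram mutual information is only a crude stand-in for the full information-rate analysis.

```python
import numpy as np
from skimage import filters
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(9)
scene = np.zeros((128, 128))
scene[:, 64:] = 1.0                                   # a single step edge
true_edges = np.zeros_like(scene, dtype=int)
true_edges[:, 63:65] = 1

acquired = scene + 0.3 * rng.standard_normal(scene.shape)  # image gathering

for name, op in [("sobel", filters.sobel), ("laplace", filters.laplace)]:
    response = np.abs(op(acquired))
    edge_map = (response > response.mean() + 2 * response.std()).astype(int)
    mi = mutual_info_score(true_edges.ravel(), edge_map.ravel())
    print(f"{name:8s} MI = {mi:.4f} nats")
```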
Certification-Based Process Analysis
NASA Technical Reports Server (NTRS)
Knight, Russell L.
2013-01-01
Space mission architects are often challenged with knowing which investment in technology infusion will have the highest return. Certification-based analysis (CBA) gives architects and technologists a means to communicate the risks and advantages of infusing technologies at various points in a process. Various alternatives can be compared, and requirements based on supporting streamlining or automation can be derived and levied on candidate technologies. CBA is a technique for analyzing a process and identifying potential areas of improvement. The process and analysis products are used to communicate between technologists and architects. "Process" here means any of the standard representations of a production flow: individual steps leading to products, which feed into other steps, until the final product is produced at the end. This sort of process is common for space mission operations, where a set of goals is reduced eventually to a fully vetted command sequence to be sent to the spacecraft. Fully vetting a product is synonymous with certification. For some types of products, this is referred to as verification and validation, and for others it is referred to as checking. Fundamentally, certification is the step in the process where one ensures that a product works as intended and contains no flaws.
Horsch, Salome; Kopczynski, Dominik; Kuthe, Elias; Baumbach, Jörg Ingo; Rahmann, Sven
2017-01-01
Motivation: Disease classification from molecular measurements typically requires an analysis pipeline from raw noisy measurements to final classification results. Multi-capillary column ion mobility spectrometry (MCC-IMS) is a promising technology for the detection of volatile organic compounds in the air of exhaled breath. From raw measurements, the peak regions representing the compounds have to be identified, quantified, and clustered across different experiments. Currently, several steps of this analysis process require manual intervention of human experts. Our goal is to identify a fully automatic pipeline that yields competitive disease classification results compared to an established but subjective and tedious semi-manual process. Method: We combine a large number of modern methods for peak detection, peak clustering, and multivariate classification into analysis pipelines for raw MCC-IMS data. We evaluate all combinations on three different real datasets in an unbiased cross-validation setting. We determine which specific algorithmic combinations lead to high AUC values in disease classifications across the different medical application scenarios. Results: The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace-operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step, and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process to enable an unbiased high-throughput use of the technology. PMID:28910313
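The clustering and classification stages of such a pipeline can be sketched with standard tools, as below: simulated peak lists are clustered across measurements with DBSCAN to form a measurement-by-compound matrix, which a Random Forest classifies under cross-validation. Peak positions, jitter, and all parameters are invented, and the peak-detection stage is omitted.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
true_peaks = rng.uniform(0, 1, size=(10, 2))     # (retention time, mobility)

# simulate 40 measurements; in "cases", compound 0 is more intense
peaks, intensities, labels = [], [], []
for m in range(40):
    case = m % 2
    peaks.append(true_peaks + rng.normal(0, 0.005, true_peaks.shape))
    boost = 1.5 * (np.arange(10) == 0) if case else 0.0
    intensities.append(rng.uniform(1, 2, 10) + boost)
    labels.append(case)

# cluster detected peaks across measurements into consensus compounds
cluster = DBSCAN(eps=0.02, min_samples=5).fit_predict(np.vstack(peaks))

# build the measurement-by-compound intensity matrix
M = np.zeros((40, cluster.max() + 1))
for m in range(40):
    for k, lab in enumerate(cluster[m * 10:(m + 1) * 10]):
        if lab >= 0:
            M[m, lab] = intensities[m][k]

rf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(rf, M, labels, cv=5).mean())
```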
A Unifying Framework for Causal Analysis in Set-Theoretic Multimethod Research
ERIC Educational Resources Information Center
Rohlfing, Ingo; Schneider, Carsten Q.
2018-01-01
The combination of Qualitative Comparative Analysis (QCA) with process tracing, which we call set-theoretic multimethod research (MMR), is steadily becoming more popular in empirical research. Despite the fact that both methods have an elected affinity based on set theory, it is not obvious how a within-case method operating in a single case and a…
Discourse Analysis and Development of English Listening for Non-English Majors in China
ERIC Educational Resources Information Center
Ji, Yinxiu
2015-01-01
Traditional approach of listening teaching mainly focuses on the sentence level and regards the listening process in a passive and static way. To compensate for this deficiency, a new listening approach, that is, discourse-oriented approach has been introduced into the listening classroom. Although discourse analysis is a comparatively new field…
Ghosh, Sujoy; Vivar, Juan; Nelson, Christopher P; Willenborg, Christina; Segrè, Ayellet V; Mäkinen, Ville-Petteri; Nikpay, Majid; Erdmann, Jeannette; Blankenberg, Stefan; O'Donnell, Christopher; März, Winfried; Laaksonen, Reijo; Stewart, Alexandre FR; Epstein, Stephen E; Shah, Svati H; Granger, Christopher B; Hazen, Stanley L; Kathiresan, Sekar; Reilly, Muredach P; Yang, Xia; Quertermous, Thomas; Samani, Nilesh J; Schunkert, Heribert; Assimes, Themistocles L; McPherson, Ruth
2016-01-01
Objective: Genome-wide association (GWA) studies have identified multiple genetic variants affecting the risk of coronary artery disease (CAD). However, individually these explain only a small fraction of the heritability of CAD, and for most, the causal biological mechanisms remain unclear. We sought to obtain further insights into potential causal processes of CAD by integrating large-scale GWA data with expertly curated databases of core human pathways and functional networks. Approaches and Results: Employing pathways (gene sets) from Reactome, we carried out a two-stage gene set enrichment analysis strategy. From a meta-analyzed discovery cohort of 7 CAD GWA data sets (9,889 cases/11,089 controls), nominally significant gene sets were tested for replication in a meta-analysis of 9 additional studies (15,502 cases/55,730 controls) from the CARDIoGRAM Consortium. A total of 32 of 639 Reactome pathways tested showed convincing association with CAD (replication p<0.05). These pathways resided in 9 of 21 core biological processes represented in Reactome, and included pathways relevant to extracellular matrix integrity, innate immunity, axon guidance, and signaling by PDGF, NOTCH, and the TGF-β/SMAD receptor complex. Many of these pathways had strengths of association comparable to those observed in lipid transport pathways. Network analysis of unique genes within the replicated pathways further revealed several interconnected functional and topologically interacting modules representing novel associations (e.g., the semaphorin-regulated axonal guidance pathway) besides confirming known processes (lipid metabolism). The connectivity in the observed networks was statistically significant compared to random networks (p<0.001). Network centrality analysis ('degree' and 'betweenness') further identified genes (e.g., NCAM1, FYN, FURIN) likely to play critical roles in the maintenance and functioning of several of the replicated pathways. Conclusions: These findings provide novel insights into how genetic variation, interpreted in the context of biological processes and functional interactions among genes, may help define the genetic architecture of CAD. PMID:25977570
Lv, Jianjian; Liu, Ping; Wang, Yu; Gao, Baoquan; Chen, Ping; Li, Jian
2013-01-01
Background: The swimming crab, Portunus trituberculatus, which is naturally distributed in the coastal waters of Asia-Pacific countries, is an important farmed species in China. Salinity is one of the most important abiotic factors: it influences not only the distribution and abundance of crustaceans but is also an important factor in the artificial propagation of the crab. To better understand the interaction between salinity stress and osmoregulation, we performed a transcriptome analysis of the gills of Portunus trituberculatus challenged with salinity stress, using Illumina deep sequencing technology. Results: We obtained 27,696,835, 28,268,353 and 33,901,271 qualified Illumina read pairs from low salinity challenged (LC), non-challenged (NC), and high salinity challenged (HC) Portunus trituberculatus cDNA libraries, respectively. The overall de novo assembly of the cDNA sequence data generated 94,511 unigenes, with an average length of 644 bp. Comparative genomic analysis revealed 1,705 genes differentially expressed under salinity stress compared to the controls, including 615 and 1,516 unigenes in NC vs LC and NC vs HC, respectively. GO functional enrichment analysis showed that some differentially expressed genes were involved in crucial processes related to osmoregulation, such as ion transport processes, amino acid metabolism and synthesis processes, the proteolysis process, and the chitin metabolic process. Conclusion: This work represents the first report of the utilization of next-generation sequencing techniques for transcriptome analysis in Portunus trituberculatus and provides valuable information on the salinity adaptation mechanism. The results reveal a substantial number of genes modified by salinity stress and a few important salinity acclimation pathways, which will serve as an invaluable resource for revealing the molecular basis of osmoregulation in Portunus trituberculatus. In addition, the comprehensive set of transcript sequences reported in this study provides a rich source for the identification of novel genes in the crab. PMID:24312639
Lazar, Dolores R R; Bottino, Marco C; Ozcan, Mutlu; Valandro, Luiz Felipe; Amaral, Regina; Ussui, Valter; Bressiani, Ana H A
2008-12-01
(1) To synthesize 3 mol% yttria-stabilized zirconia (3Y-TZP) powders via a coprecipitation route, (2) to obtain zirconia ceramic specimens, analyze their surface characteristics and mechanical properties, and (3) to compare the processed material with three reinforced dental ceramics. A coprecipitation route was used to synthesize a 3 mol% yttria-stabilized zirconia ceramic processed by uniaxial compaction and pressureless sintering. Commercially available alumina or alumina/zirconia ceramics, namely Procera AllCeram (PA), In-Ceram Zirconia Block (CAZ) and In-Ceram Zirconia (IZ), were chosen for comparison. All specimens (6 mm x 5 mm x 5 mm) were polished and ultrasonically cleaned. Qualitative phase analysis was performed by XRD, and apparent densities were measured on the basis of the Archimedes principle. The ceramics were also characterized using SEM, TEM and EDS. Hardness measurements were made employing the Vickers hardness test. Fracture toughness (K(IC)) was calculated. Data were analyzed using one-way analysis of variance (ANOVA) and Tukey's test (alpha=0.05). ANOVA revealed that Vickers hardness (p<0.0001) and fracture toughness (p<0.0001) were affected by the composition of the ceramic materials. It was confirmed that the PA ceramic consisted of a rhombohedral alumina matrix, so-called alpha-alumina. Both the CAZ and IZ ceramics presented a mixture of tetragonal zirconia and alpha-alumina phases. SEM/EDS analysis confirmed the presence of aluminum in the PA ceramic. In the IZ and CAZ ceramics, grains containing aluminum, zirconium and cerium were identified, surrounded by a second phase containing aluminum, silicon and lanthanum. PA showed significantly higher mean Vickers hardness (H(V)) values (18.4+/-0.5 GPa) than the vitreous CAZ (10.3+/-0.2 GPa) and IZ (10.6+/-0.4 GPa) ceramics. The experimental Y-TZP showed significantly lower hardness than the other monophased ceramic (PA) (p<0.05), but significantly higher fracture toughness (6.0+/-0.2 MPa·m(1/2)) than the other tested ceramics (p<0.05). The coprecipitation method used to synthesize the zirconia powders and the adopted ceramic processing conditions led to ceramics with mechanical properties comparable to commercially available reinforced ceramic materials.
NASA Astrophysics Data System (ADS)
Lipovsky, B.; Funning, G. J.
2009-12-01
We compare several techniques for the analysis of geodetic time series with the ultimate aim of characterizing the physical processes represented therein. We compare three methods for the analysis of these data: Principal Component Analysis (PCA), Non-Linear PCA (NLPCA), and Rotated PCA (RPCA). We evaluate each method by its ability to isolate signals that may be any combination of low amplitude (near noise level), temporally transient, unaccompanied by seismic emissions, and small scale with respect to the spatial domain. PCA is a powerful tool for extracting structure from large datasets, traditionally realized through either the solution of an eigenvalue problem or iterative methods. PCA is a transformation of the coordinate system of our data such that the new "principal" data axes retain maximal variance and minimal reconstruction error (Pearson, 1901; Hotelling, 1933). RPCA is achieved by an orthogonal transformation of the principal axes determined in PCA. In the analysis of meteorological data sets, RPCA has been seen to overcome domain shape dependencies, correct for sampling errors, and determine principal axes that more closely represent physical processes (e.g., Richman, 1986). NLPCA generalizes PCA such that principal axes are replaced by principal curves (e.g., Hsieh, 2004). We achieve NLPCA through an auto-associative feed-forward neural network (Scholz, 2005). We show the geophysical relevance of these techniques by applying each to a synthetic data set. Results are compared by inverting principal axes to determine deformation source parameters. Temporal variability in source parameters, as estimated by each method, is also compared.
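For concreteness, the sketch below contrasts plain PCA with a varimax-rotated PCA on synthetic station time series containing one steady and one transient source; the varimax routine is a textbook implementation, and the network geometry and sources are assumptions (NLPCA, which needs a neural network, is omitted).

```python
import numpy as np

def varimax(loadings, gamma=1.0, n_iter=100, tol=1e-8):
    """Textbook varimax rotation of a loading matrix."""
    p, k = loadings.shape
    R = np.eye(k)
    var_old = 0.0
    for _ in range(n_iter):
        L = loadings @ R
        u, sv, vt = np.linalg.svd(
            loadings.T @ (L**3 - (gamma / p) * L @ np.diag((L**2).sum(axis=0))))
        R = u @ vt
        if sv.sum() < var_old * (1.0 + tol):
            break
        var_old = sv.sum()
    return loadings @ R

def simplicity(L):
    """Mean share of each axis' variance carried by its largest loading."""
    return np.mean(np.max(L**2, axis=0) / np.sum(L**2, axis=0))

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 500)
sources = np.array([np.sin(0.8 * t),                # steady seasonal-like signal
                    np.exp(-((t - 6.0) ** 2))])     # low-amplitude transient
mixing = rng.normal(size=(20, 2))                   # 20 hypothetical GPS stations
data = mixing @ sources + 0.05 * rng.normal(size=(20, t.size))
data -= data.mean(axis=1, keepdims=True)

U, s, Vt = np.linalg.svd(data, full_matrices=False)
pca_loadings = U[:, :2] * s[:2]                     # spatial loadings (PCA)
rpca_loadings = varimax(pca_loadings)               # simpler spatial patterns
print("simplicity PCA :", round(simplicity(pca_loadings), 3))
print("simplicity RPCA:", round(simplicity(rpca_loadings), 3))
```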
Duncan, Michael J.; Lawson, Chelsey; Walker, Leanne Jaye; Stodden, David; Eyre, Emma L. J.
2017-01-01
This study examined how supine-to-stand (STS) performance is related to process and product assessments of motor competence (MC) in children. Ninety-one children aged 5–9 years were assessed for process and product MC (10 m running speed and standing long jump) as well as process and product measures of STS. Tertiles were created for STS process and STS product scores to create three groups reflecting low, medium, and high STS competency. ANCOVA controlling for age indicated that, for process STS, process MC was significantly higher in children classified as having medium (p = 0.048) and high (p = 0.011) STS competence, and that 10 m run speed was slower for low STS compared to medium (p = 0.019) and high STS (p = 0.004). For product STS tertiles, process MC was significantly higher for children in the lowest (fastest) STS tertile compared to those in the medium and highest (slowest) tertiles (p = 0.01). PMID:29910427
StreakDet data processing and analysis pipeline for space debris optical observations
NASA Astrophysics Data System (ADS)
Virtanen, Jenni; Flohrer, Tim; Muinonen, Karri; Granvik, Mikael; Torppa, Johanna; Poikonen, Jonne; Lehti, Jussi; Santti, Tero; Komulainen, Tuomo; Naranen, Jyri
We describe a novel data processing and analysis pipeline for optical observations of space debris. The monitoring of space object populations requires reliable acquisition of observational data to support the development and validation of space debris environment models and the build-up and maintenance of a catalogue of orbital elements. In addition, data are needed for the assessment of conjunction events and for the support of contingency situations or launches. The currently available, mature image processing algorithms for detection and astrometric reduction of optical data cover objects that cross the sensor field-of-view comparatively slowly, and within a rather narrow, predefined range of angular velocities. By applying specific tracking techniques, the objects appear point-like or as short trails in the exposures. However, the general survey scenario is always a “track before detect” problem, resulting in streaks, i.e., object trails of arbitrary length, in the images. The scope of the ESA-funded StreakDet (Streak detection and astrometric reduction) project is to investigate solutions for detecting and reducing streaks from optical images, particularly in the low signal-to-noise ratio (SNR) domain, where algorithms are not readily available yet. For long streaks, the challenge is to extract precise position information and related registered epochs with sufficient precision. Although some considerations for low-SNR processing of streak-like features are available in the current image processing and computer vision literature, there is a need to discuss and compare these approaches for space debris analysis, in order to develop and evaluate prototype implementations. In the StreakDet project, we develop algorithms applicable to single images (as opposed to consecutive frames of the same field) obtained with any observing scenario, including space-based surveys and both low- and high-altitude populations. The proposed processing pipeline starts from the segmentation of the acquired image (i.e., the extraction of all sources), followed by the astrometric and photometric characterization of the candidate streaks, and ends with orbital validation of the detected streaks. A central concept of the pipeline is streak classification, which guides the actual characterization process by aiming to identify the interesting sources and filter out the uninteresting ones, as well as by allowing the tailoring of algorithms for specific streak classes (e.g., point-like vs. long, disintegrated streaks). To validate the single-image detections, the processing is finalized by orbital analysis, resulting in a preliminary orbital classification (Earth-bound vs. non-Earth-bound orbit) for the detected streaks.
Power-law statistics of neurophysiological processes analyzed using short signals
NASA Astrophysics Data System (ADS)
Pavlova, Olga N.; Runnova, Anastasiya E.; Pavlov, Alexey N.
2018-04-01
We discuss the problem of quantifying power-law statistics of complex processes from short signals. Based on the analysis of electroencephalograms (EEG), we compare three interrelated approaches that enable characterization of the power spectral density (PSD) and show that applying detrended fluctuation analysis (DFA) or the wavelet-transform modulus maxima (WTMM) method provides a useful way of indirectly characterizing PSD features from short data sets. We conclude that although DFA- and WTMM-based measures can be obtained from the estimated PSD, these tools outperform standard spectral analysis when the analyzed regime must be characterized from a very limited amount of data.
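DFA itself is compact enough to sketch directly; the version below uses first-order detrending and checks that white noise returns a scaling exponent near 0.5. Segment handling and the scale grid are conventional choices, not those of the study.

```python
import numpy as np

def dfa(x, scales):
    """First-order detrended fluctuation analysis of a 1-D signal."""
    y = np.cumsum(x - np.mean(x))                 # signal profile
    F = []
    for n in scales:
        n_seg = len(y) // n
        segs = y[:n_seg * n].reshape(n_seg, n)
        t = np.arange(n)
        # detrend each segment with a first-order polynomial fit
        coef = np.polynomial.polynomial.polyfit(t, segs.T, 1)
        trend = coef[0][:, None] + coef[1][:, None] * t
        F.append(np.sqrt(np.mean((segs - trend) ** 2)))
    return np.array(F)

rng = np.random.default_rng(0)
x = rng.standard_normal(2**14)                    # white noise: alpha ~ 0.5
scales = np.unique(np.logspace(2, 10, 20, base=2).astype(int))
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"estimated scaling exponent alpha = {alpha:.2f}")
```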
The value of decision tree analysis in planning anaesthetic care in obstetrics.
Bamber, J H; Evans, S A
2016-08-01
The use of decision tree analysis is discussed in the context of the anaesthetic and obstetric management of a young pregnant woman with joint hypermobility syndrome with a history of insensitivity to local anaesthesia and a previous difficult intubation due to a tongue tumour. The multidisciplinary clinical decision process resulted in the woman being delivered without complication by elective caesarean section under general anaesthesia after an awake fibreoptic intubation. The decision process used is reviewed and compared retrospectively to a decision tree analytical approach. The benefits and limitations of using decision tree analysis are reviewed and its application in obstetric anaesthesia is discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.
Kern, Simon; Meyer, Klas; Guhl, Svetlana; Gräßer, Patrick; Paul, Andrea; King, Rudibert; Maiwald, Michael
2018-05-01
Monitoring specific chemical properties is the key to chemical process control. Today, mainly optical online methods are applied, which require time- and cost-intensive calibration effort. NMR spectroscopy, whose advantage is that it is a direct comparison method requiring no calibration, has high potential for enabling closed-loop process control while exhibiting short set-up times. Compact NMR instruments make NMR spectroscopy accessible in industrial and rough environments for process monitoring and advanced process control strategies. We present a fully automated data analysis approach which is completely based on physically motivated spectral models as first-principles information (indirect hard modeling, IHM) and apply it to a given pharmaceutical lithiation reaction in the framework of the European Union's Horizon 2020 project CONSENS. Online low-field NMR (LF NMR) data were analyzed by IHM with low calibration effort, compared to a multivariate PLS-R (partial least squares regression) approach, and both were validated using online high-field NMR (HF NMR) spectroscopy. Graphical abstract: NMR sensor module for monitoring of the aromatic coupling of 1-fluoro-2-nitrobenzene (FNB) with aniline to 2-nitrodiphenylamine (NDPA) using lithium bis(trimethylsilyl)amide (Li-HMDS) in continuous operation. Online 43.5 MHz low-field NMR (LF) was compared to 500 MHz high-field NMR spectroscopy (HF) as the reference method.
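To illustrate the multivariate baseline the IHM approach is compared against, a minimal PLS-R calibration on synthetic "spectra" is sketched below; the data, noise level and component count are invented, and scikit-learn stands in for whatever software the authors used.

```python
# Illustrative PLS-R calibration on synthetic "spectra". Data, noise level and
# component count are invented; scikit-learn stands in for the actual software.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_channels = 40, 200
conc = rng.uniform(0.1, 1.0, n_samples)              # "true" concentrations
peak = np.exp(-0.5 * ((np.arange(n_channels) - 100) / 5.0) ** 2)
X = conc[:, None] * peak + 0.01 * rng.normal(size=(n_samples, n_channels))

pls = PLSRegression(n_components=2)
pls.fit(X[:30], conc[:30])                           # calibration set
print(pls.predict(X[30:]).ravel()[:5])               # held-out predictions
print(conc[30:35])                                   # compare with truth
```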
A two dimensional power spectral estimate for some nonstationary processes. M.S. Thesis
NASA Technical Reports Server (NTRS)
Smith, Gregory L.
1989-01-01
A two-dimensional estimate for the power spectral density of a nonstationary process is being developed. The estimate will be applied to helicopter noise data, which are clearly nonstationary. The acoustic pressure from the isolated main rotor and isolated tail rotor is known to be periodically correlated (PC), and the combined noise from the main and tail rotors is assumed to be correlation autoregressive (CAR). The results of this nonstationary analysis will be compared with the current method of assuming that the data are stationary and analyzing them as such. Another method of analysis is to introduce a random phase shift into the data, as shown by Papoulis, to produce a time history which can then be accurately modeled as stationary. This method will also be investigated for the helicopter data. A method used to determine the period of a PC process when the period is not known is discussed. The period of a PC process must be known in order to produce an accurate spectral representation of the process. The spectral estimate is developed, and the bias and variability of the estimate are discussed. Finally, the current method for analyzing nonstationary data is compared to that of using a two-dimensional spectral representation. In addition, the method of phase shifting the data is examined.
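The Papoulis-style randomization mentioned above can be demonstrated in a few lines: a periodically correlated process is shifted by a random delay drawn uniformly over one period, which flattens the time-dependent ensemble variance. All parameters below are illustrative.

```python
# Sketch of the Papoulis-style randomization: each realization of a
# periodically correlated (PC) process is shifted by a delay drawn uniformly
# over one period, flattening the time-dependent ensemble variance.
# Period, modulation depth and ensemble size are illustrative.
import numpy as np

rng = np.random.default_rng(0)
period, n = 64, 1024
t = np.arange(n)

def pc_realization(shift=0.0):
    envelope = 1.0 + 0.8 * np.cos(2 * np.pi * (t + shift) / period)
    return envelope * rng.normal(size=n)   # periodically modulated white noise

fixed = np.var([pc_realization() for _ in range(500)], axis=0)
shifted = np.var([pc_realization(rng.uniform(0, period)) for _ in range(500)], axis=0)
print(fixed.std(), shifted.std())          # the shifted ensemble is far flatter
```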
Jakusz, J.W.; Dieck, J.J.; Langrehr, H.A.; Ruhser, J.J.; Lubinski, S.J.
2016-01-11
Similar to an AA (accuracy assessment), validation involves generating random points based on the total area for each map class. However, instead of collecting field data, two or three individuals not involved with the photo-interpretative mapping separately review each of the points onscreen and record a best-fit vegetation type(s) for each site. Once the individual analyses are complete, results are joined together and a comparative analysis is performed. The objective of this initial analysis is to identify areas where the validation results were in agreement (matches) and areas where validation results were in disagreement (mismatches). The two or three individuals then perform an analysis, looking at each mismatched site, and agree upon a final validation class. (If two vegetation types at a specific site appear to be equally prevalent, the validation team is permitted to assign the site two best-fit vegetation types.) Following the validation team's comparative analysis of vegetation assignments, the data are entered into a database and compared to the mappers' vegetation assignments. Agreements and disagreements between the map and validation classes are identified, and a contingency table is produced. This document presents the AA processes/results for Pools 13 and La Grange, as well as the validation process/results for Pools 13 and 26 and Open River South.
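A minimal sketch of the final comparison step, crossing the mappers' classes with the validation team's best-fit classes in a contingency table, might look like this (hypothetical vegetation types; pandas assumed):

```python
# Crossing mappers' classes with the validation team's best-fit classes;
# vegetation types are hypothetical and pandas is assumed.
import pandas as pd

df = pd.DataFrame({
    "map_class":        ["wet meadow", "wet meadow", "floodplain forest", "marsh"],
    "validation_class": ["wet meadow", "marsh",      "floodplain forest", "marsh"],
})
table = pd.crosstab(df["map_class"], df["validation_class"])  # contingency table
agreement = (df["map_class"] == df["validation_class"]).mean()
print(table)
print(f"overall agreement: {agreement:.0%}")
```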
Don't look at me in anger! Enhanced processing of angry faces in anticipation of public speaking.
Wieser, Matthias J; Pauli, Paul; Reicherts, Philipp; Mühlberger, Andreas
2010-03-01
Anxiety is thought to enhance the processing of threatening information. Here, we investigated the cortical processing of angry faces during anticipated public speaking. To elicit anxiety, one group of participants was told that they would have to give a public speech. As a control condition, another group was told that they would have to write a short essay. During anticipation of these tasks, participants saw facial expressions (angry, happy, and neutral) while the electroencephalogram was recorded. Event-related potential analysis revealed larger N170 amplitudes for angry compared to happy and neutral faces in the anxiety group. The early posterior negativity, an index of motivated attention, was also enhanced for angry compared to happy and neutral faces in participants anticipating public speaking. These results indicate that fear of public speaking influences early perceptual processing of faces such that the processing of angry faces in particular is facilitated.
Comparative Analysis of Processes for Recovery of Rare Earths from Bauxite Residue
NASA Astrophysics Data System (ADS)
Borra, Chenna Rao; Blanpain, Bart; Pontikes, Yiannis; Binnemans, Koen; Van Gerven, Tom
2016-11-01
Environmental concerns and lack of space suggest that the management of bauxite residue needs to be readdressed. The utilization of the residue has thus become a topic high on the agenda for both academia and industry; yet, to date, it is only rarely used. Nonetheless, recovery of rare earth elements (REEs), with or without other metals, from bauxite residue, and utilization of the left-over residue in other applications such as building materials, may be a viable alternative to storage. Hence, different processes developed by the authors for recovery of REEs and other metals from bauxite residue were compared. In this study, preliminary energy and cost analyses were carried out to assess the feasibility of the processes. These analyses show that the combination of alkali roasting-smelting-quenching-leaching is a promising process for the treatment of bauxite residue and that it is justified to study this process at a pilot scale.
Kwon, Jinhyeong; Cho, Hyunmin; Eom, Hyeonjin; Lee, Habeom; Suh, Young Duk; Moon, Hyunjin; Shin, Jaeho; Hong, Sukjoon; Ko, Seung Hwan
2016-05-11
Copper nanomaterials suffer from severe oxidation despite their considerable cost-effectiveness. The effects of two different processes, conventional tube-furnace heating and selective laser sintering, on copper nanoparticle paste are compared in terms of chemistry, electrical properties and surface morphology. The thermal behavior of the furnace- and laser-processed copper thin films is compared by SEM, XRD, FT-IR, and XPS analyses. The selective laser sintering process ensures a low annealing temperature and fast processing speed, with remarkable oxidation suppression even in an air environment, while conventional tube-furnace heating suffers moderate oxidation even in an Ar environment. Moreover, the laser-sintered copper nanoparticle thin film shows better electrical properties and less oxidation than films from the conventional thermal heating process. Consequently, the proposed selective laser sintering process can be compatible with plastic substrates for copper-based flexible electronics applications.
Onto-phylogenetic aspect of myotomal myogenesis in Chordata.
Kiełbówna, Leokadia; Daczewska, Małgorzata
2004-01-01
This paper presents the onto- and phylogenetic aspects of myotomal myogenesis in Chordata. A comparative analysis of early stages of myotomal myogenesis in Chordata indicates that the myogenic process in this phylum underwent evolutionary changes. The first stage of the process is myogenesis leading to the development of mononucleate mature muscle cells; the most advanced stage is the formation of multinucleate muscle fibres.
ERIC Educational Resources Information Center
Štofková, Katarína; Strícek, Ivan; Štofková, Jana
2014-01-01
The paper aims to evaluate the possibility of applying new methods and tools for more effective educational processes, with an emphasis on increasing their quality, particularly in educational processes at secondary schools and universities. There are some contributions from practice for the effective implementation of time management, such…
ERIC Educational Resources Information Center
Jakku-Sihvonen, Ritva; Tissari, Varpu; Ots, Aivar; Uusiautti, Satu
2012-01-01
During the Bologna process, from 2003 to 2006, degree programmes, including teacher education curricula, were developed in line with the two-tier system--the European Credit Transfer and Accumulation System (ECTS) and modularization. The purpose of the present study is to contribute to the development of teacher education profiling measures by…
ERIC Educational Resources Information Center
Beach, Derek; Rohlfing, Ingo
2018-01-01
In recent years, there has been increasing interest in the combination of two methods on the basis of set theory. In our introduction and this special issue, we focus on two variants of cross-case set-theoretic methods--"qualitative comparative analysis" (QCA) and typological theory (TT)--and their combination with process tracing (PT).…
A New Feedback-Based Method for Parameter Adaptation in Image Processing Routines.
Khan, Arif Ul Maula; Mikut, Ralf; Reischl, Markus
2016-01-01
The parametrization of automatic image processing routines is time-consuming if many image processing parameters are involved. An expert can tune parameters sequentially to get the desired results. This may not be productive for applications with difficult image analysis tasks, e.g. when high noise and shading levels are present in an image or when images vary in their characteristics due to different acquisition conditions. Parameters are then required to be tuned simultaneously. We propose a framework to improve standard image segmentation methods by using feedback-based automatic parameter adaptation. Moreover, we compare algorithms by implementing them in a feedforward fashion and then adapting their parameters. This comparison is evaluated using a benchmark data set that contains challenging image distortions of increasing severity. This enables us to compare different standard image segmentation algorithms in feedback vs. feedforward implementations by evaluating their segmentation quality and robustness. We also propose an efficient way of performing automatic image analysis when only abstract ground truth is present. Such a framework evaluates the robustness of different image processing pipelines using a graded data set. This is useful for both end-users and experts.
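A toy version of the feedback idea, with a single segmentation threshold tuned until a quality criterion is met, is sketched below; the criterion (a target foreground fraction) and step size are assumptions for illustration, not the paper's actual quality measures.

```python
# Toy feedback loop: a segmentation threshold is nudged until the segmented
# foreground fraction matches a target. The criterion and step size are
# assumptions; the paper's quality measures are richer than this.
import numpy as np

def segment(image, threshold):
    return image > threshold

def adapt_threshold(image, target_fraction=0.2, lr=0.5, iters=50):
    threshold = image.mean()
    for _ in range(iters):
        frac = segment(image, threshold).mean()       # feedback measurement
        threshold += lr * (frac - target_fraction)    # raise if too much foreground
    return threshold

img = np.random.default_rng(0).random((128, 128))
th = adapt_threshold(img)
print(th, segment(img, th).mean())   # threshold ~0.8 -> ~20% foreground
```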
Engaging stakeholders for adaptive management using structured decision analysis
Irwin, Elise R.; Kennedy, Kathryn D. Mickett
2009-01-01
Adaptive management is different from other types of management in that it includes all stakeholders (versus only policy makers) in the process, uses resource optimization techniques to evaluate competing objectives, and recognizes and attempts to reduce uncertainty inherent in natural resource systems. Management actions are negotiated by stakeholders, monitored results are compared to predictions of how the system should respond, and management strategies are adjusted in a “monitor-compare-adjust” iterative routine. Many adaptive management projects fail because of the lack of stakeholder identification, engagement, and continued involvement. Primary reasons for this vary but are usually related to either stakeholders not having ownership (or representation) in decision processes or disenfranchisement of stakeholders after adaptive management begins. We present an example in which stakeholders participated fully in adaptive management of a southeastern regulated river. Structured decision analysis was used to define management objectives and stakeholder values and to determine initial flow prescriptions. The process was transparent, and the visual nature of the modeling software allowed stakeholders to see how their interests and values were represented in the decision process. The development of a stakeholder governance structure and communication mechanism has been critical to the success of the project.
The neural correlates of implicit self-relevant processing in low self-esteem: an ERP study.
Yang, Juan; Guan, Lili; Dedovic, Katarina; Qi, Mingming; Zhang, Qinglin
2012-08-30
Previous neuroimaging studies have shown that implicit and explicit processing of self-relevant (schematic) material elicit activity in many of the same brain regions. Electrophysiological studies on the neural processing of explicit self-relevant cues have generally supported the view that the P300 is an index of attention to self-relevant stimuli; however, there has been no study to date investigating the temporal course of implicit self-relevant processing. The current study investigates the time course of implicit self-processing by comparing the processing of self-relevant with non-self-relevant words while subjects made a judgment about the color of the words in an implicit attention task. Sixteen low self-esteem participants were examined using event-related potential (ERP) technology. We hypothesized that this implicit attention task would involve the P2 component rather than the P300 component. Indeed, the P2 component has been associated with perceptual analysis and attentional allocation and may be more likely to occur under unconscious conditions such as this task. Results showed that the latency of the P2 component, which indexes the time required for perceptual analysis, was more prolonged when processing self-relevant words compared to non-self-relevant words. Our results suggest that the judgment of the color of the word interfered with automatic processing of self-relevant information and resulted in less efficient processing of self-relevant words. Together with previous ERP studies examining the processing of explicit self-relevant cues, these findings suggest that the explicit and implicit processing of self-relevant information do not elicit the same ERP components.
Lee, Sunmin; Lee, Sarah; Singh, Digar; Oh, Ji Young; Jeon, Eun Jung; Ryu, Hyung SeoK; Lee, Dong Wan; Kim, Beom Seok; Lee, Choong Hwan
2017-04-15
Two different doenjang manufacturing processes, the industrial process (IP) and the modified industrial process (mIP) with specific microbial assortments, were subjected to metabolite profiling using liquid chromatography-mass spectrometry (LC-MS) and gas chromatography time-of-flight mass spectrometry (GC-TOF-MS). The multivariate analyses indicated that both primary and secondary metabolites exhibited distinct patterns according to the fermentation process (IP or mIP). Microbial community analysis of doenjang using denaturing gradient gel electrophoresis (DGGE) showed that both bacteria and fungi contributed proportionally to each step of the process, viz. soybean preparation, steaming, drying, meju fermentation, cooling, brining, and aging. Further, correlation analysis indicated that the Aspergillus population was linked to sugar metabolism and Bacillus spp. to that of fatty acids, whereas Tetragenococcus and Zygosaccharomyces were found to be associated with amino acids. These results suggest that the components and quality of doenjang are critically influenced by the microbial assortments in each process.
Comparative transcriptome analysis of soybean response to bean pyralid larvae.
Zeng, Weiying; Sun, Zudong; Cai, Zhaoyan; Chen, Huaizhu; Lai, Zhenguang; Yang, Shouzhen; Tang, Xiangmin
2017-11-13
Soybean is one of the most important oilseed crops worldwide; however, its production is often limited by many insect pests. Bean pyralid is one of the major soybean leaf-feeding insects in China. To explore the defense mechanisms of soybean resistance to bean pyralid, comparative transcriptome sequencing was performed on the Illumina HiSeq™ 2000 platform, comparing leaves of two soybean genotypes (Gantai-2-2 and Wan82-178) infested with bean pyralid larvae against uninfested leaves. In total, we identified 1744 differentially expressed genes (DEGs) in the leaves of Gantai-2-2 (1064) and Wan82-178 (680) fed on by bean pyralid for 48 h, compared to 0 h. Interestingly, 315 DEGs were shared by Gantai-2-2 and Wan82-178, while 749 and 365 DEGs were specifically identified in Gantai-2-2 and Wan82-178, respectively. When comparing Gantai-2-2 with Wan82-178, 605 DEGs were identified at 0 h of feeding, and 468 DEGs were identified at 48 h of feeding. Gene Ontology (GO) annotation analysis revealed that the DEGs were mainly involved in metabolic processes, single-organism processes, cellular processes, responses to stimulus, catalytic activities and binding. Pathway analysis showed that most of the DEGs were associated with plant-pathogen interaction, phenylpropanoid biosynthesis, phenylalanine metabolism, flavonoid biosynthesis, peroxisome, plant hormone signal transduction, terpenoid backbone biosynthesis, and so on. Finally, we used qRT-PCR to validate the expression patterns of several genes, and the results showed excellent agreement with the deep sequencing. According to the comparative transcriptome analysis results and related literature reports, we conclude that the response to bean pyralid feeding might be related to the disturbed functions and metabolic pathways of some key DEGs, such as those involved in the ROS removal system, plant hormone metabolism, intracellular signal transduction pathways, secondary metabolism, transcription factors, and biotic and abiotic stresses. We speculate that these genes may play an important role in synthesizing substances to resist insect attack in soybean. Our results provide a valuable resource of soybean defense genes that will benefit other studies in this field.
Analysis of Wood Structure Connections Using Cylindrical Steel and Carbon Fiber Dowel Pins
NASA Astrophysics Data System (ADS)
Vodiannikov, Mikhail A.; Kashevarova, Galina G., Dr.
2017-06-01
In this paper, the results of a statistical analysis of corrosion processes and moisture saturation of glued laminated timber structures and their joints in corrosive environments are presented. The paper includes calculation results for dowel connections of wood structures using steel and carbon fiber reinforced plastic cylindrical dowel pins, obtained in accordance with applicable regulatory documents by means of finite element analysis in ANSYS software, as well as experimental findings. Dependence diagrams are shown, and a comparative analysis of the results obtained is conducted.
SERS as a tool for in vitro toxicology.
Fisher, Kate M; McLeish, Jennifer A; Jamieson, Lauren E; Jiang, Jing; Hopgood, James R; McLaughlin, Stephen; Donaldson, Ken; Campbell, Colin J
2016-06-23
Measuring markers of stress such as pH and redox potential is important when studying toxicology in in vitro models because they are markers of oxidative stress, apoptosis and viability. While surface-enhanced Raman spectroscopy (SERS) is ideally suited to the measurement of redox potential and pH in live cells, the time-intensive nature and perceived difficulty of signal analysis and interpretation can be a barrier to its broad uptake by the biological community. In this paper we detail the development of signal processing and analysis algorithms that allow SERS spectra to be automatically processed so that the output of the processing is a pH or redox potential value. By automating signal processing we were able to carry out a comparative evaluation of the toxicology of silver and zinc oxide nanoparticles and correlate our findings with qPCR analysis. The combination of these two analytical techniques sheds light on the differences in toxicology between these two materials from the perspective of oxidative stress.
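A hedged sketch of the kind of automation described, baseline-correcting a spectrum, taking the intensity ratio of two indicator bands and mapping it to a pH value through a calibration curve, is given below; the band positions and the linear calibration are invented, not the paper's.

```python
# Hypothetical automation sketch: baseline-correct two pH-sensitive bands,
# take their intensity ratio and convert it to pH with a linear calibration.
# Band positions (1400 and 1700 cm-1) and the calibration are invented.
import numpy as np

def peak_area(spectrum, wavenumbers, center, width=10):
    sel = np.abs(wavenumbers - center) < width
    baseline = np.linspace(spectrum[sel][0], spectrum[sel][-1], sel.sum())
    return np.trapz(spectrum[sel] - baseline)          # local baseline removal

def spectrum_to_ph(spectrum, wavenumbers, calib=(2.0, 3.0)):
    ratio = peak_area(spectrum, wavenumbers, 1400) / peak_area(spectrum, wavenumbers, 1700)
    slope, intercept = calib                            # assumed: pH = a*ratio + b
    return slope * ratio + intercept

wn = np.linspace(1000, 2000, 1001)
spec = (0.8 * np.exp(-0.5 * ((wn - 1400) / 8) ** 2)
        + 0.4 * np.exp(-0.5 * ((wn - 1700) / 8) ** 2) + 0.05)
print(spectrum_to_ph(spec, wn))          # ~7 for this synthetic spectrum
```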
Digital signal processing for velocity measurements in dynamical material's behaviour studies.
Devlaminck, Julien; Luc, Jérôme; Chanal, Pierre-Yves
2014-03-01
In this work, we describe different configurations of optical fiber interferometers (Michelson and Mach-Zehnder types) used to measure velocities in studies of the dynamic behaviour of materials. We detail the processing algorithms developed and optimized to improve the performance of these interferometers, especially in terms of time and frequency resolution. Three methods for analyzing interferometric signals were studied. For Michelson interferometers, time-frequency analysis of the signals by Short-Time Fourier Transform (STFT) is compared to time-frequency analysis by Continuous Wavelet Transform (CWT). The results show that the CWT is more suitable than the STFT for signals with a low signal-to-noise ratio and for regions of low velocity and high acceleration. For Mach-Zehnder interferometers, the measurement is carried out by analyzing the phase shift between three interferometric signals (triature processing). These three methods of digital signal processing were evaluated, their measurement uncertainties estimated, and their restrictions or operational limitations specified from experimental results obtained on a pulsed power machine.
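As an illustration of the STFT route to a velocity history, the sketch below extracts the dominant beat frequency in each time slice and converts it to velocity via the Doppler fringe relation v = lambda*f/2; the chirped test signal, sampling rate and laser wavelength are invented.

```python
# STFT route to a velocity history: dominant beat frequency per time slice,
# converted via the Doppler fringe relation v = lambda * f / 2. The chirped
# test signal, sampling rate and wavelength are invented.
import numpy as np
from scipy.signal import stft

fs = 1e8                              # 100 MS/s digitizer (assumed)
lam = 1550e-9                         # laser wavelength in metres (assumed)
t = np.arange(0, 1e-4, 1 / fs)
# Instantaneous beat frequency 2 MHz + 2e10*t (an accelerating target)
x = np.cos(2 * np.pi * (2e6 * t + 1e10 * t ** 2))

f, tt, Z = stft(x, fs=fs, nperseg=1024)
dominant = f[np.abs(Z).argmax(axis=0)]    # peak frequency in each slice
velocity = lam * dominant / 2             # m/s
print(velocity[:5])
```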
BactoGeNIE: A large-scale comparative genome visualization for big displays
Aurisano, Jillian; Reda, Khairi; Johnson, Andrew; ...
2015-08-13
The volume of complete bacterial genome sequence data available to comparative genomics researchers is rapidly increasing. However, visualizations in comparative genomics--which aim to enable analysis tasks across collections of genomes--suffer from visual scalability issues. While large, multi-tiled and high-resolution displays have the potential to address scalability issues, new approaches are needed to take advantage of such environments, in order to enable the effective visual analysis of large genomics datasets. In this paper, we present the Bacterial Gene Neighborhood Investigation Environment, or BactoGeNIE, a novel and visually scalable design for comparative gene neighborhood analysis on large display environments. We evaluate BactoGeNIE through a case study on close to 700 draft Escherichia coli genomes, and present lessons learned from our design process. In conclusion, BactoGeNIE accommodates comparative tasks over substantially larger collections of neighborhoods than existing tools and explicitly addresses visual scalability. Given current trends in data generation, scalable designs of this type may inform visualization design for large-scale comparative research problems in genomics.
Loizzo, Monica Rosa; Pugliese, Alessandro; Bonesi, Marco; De Luca, Damiano; O'Brien, Nora; Menichini, Francesco; Tundis, Rosa
2013-03-01
The present study evaluates the influence of drying and cooking processes on the health properties of two bell Capsicum annuum L. cultivars, Roggiano and Senise, compared with fresh peppers. The content of phytochemicals decreased in the order fresh > dried > dried-fried. HPLC analysis was applied to quantify five flavonoids from the peppers. Apigenin was identified as the main constituent; its content was affected by the drying and dried-frying processes. The antioxidant activity was evaluated by DPPH, ABTS, the β-carotene bleaching test and an Fe-chelating activity assay. Comparable radical scavenging activity was observed for both cultivars. Interestingly, the frying process did not influence this property. Roggiano peppers exhibited the highest antioxidant activity in the β-carotene bleaching test, with IC(50) values of 38.1 and 24.9 μg/mL for the total extract and n-hexane fraction, respectively. GC-MS analysis of the lipophilic fraction revealed the presence of fatty acids and vitamin E as major components. In the inhibition of carbohydrate-hydrolyzing enzymes, fresh Senise peppers exerted the strongest activity against α-amylase, with an IC(50) value of 55.3 μg/mL. Our results indicate that the C. annuum cultivars Roggiano and Senise have potential health benefits that are not diminished by the processes used before consumption.
Wang, Fang; Ouyang, Guang; Zhou, Changsong; Wang, Suiping
2015-01-01
A number of studies have explored the time course of Chinese semantic and syntactic processing. However, whether syntactic processing occurs earlier than semantic processing during Chinese sentence reading is still under debate. To further explore this issue, an event-related potential (ERP) experiment was conducted on 21 native Chinese speakers who read individually presented Chinese simple sentences (NP1+VP+NP2) word-by-word for comprehension and made semantic plausibility judgments. The transitivity of the verbs was manipulated to form three types of stimuli: congruent sentences (CON), sentences with a semantically violated NP2 following a transitive verb (semantic violation, SEM), and sentences with a semantically violated NP2 following an intransitive verb (combined semantic and syntactic violation, SEM+SYN). The ERPs evoked by the target NP2 were analyzed using the Residue Iteration Decomposition (RIDE) method, which reconstructs the ERP waveform blurred by trial-to-trial variability, as well as by the conventional ERP method based on stimulus-locked averaging. The conventional ERP analysis showed that, compared with the critical words in CON, those in SEM and SEM+SYN elicited an N400-P600 biphasic pattern. The N400 effects in both violation conditions were of similar size and distribution, but the P600 in SEM+SYN was larger than that in SEM. Compared with the conventional ERP analysis, the RIDE analysis revealed a larger N400 effect and an earlier P600 effect (in the time window of 500-800 ms instead of 570-810 ms). Overall, the combination of conventional ERP analysis and the RIDE method for compensating for trial-to-trial variability confirmed the non-significant difference between SEM and SEM+SYN in the earlier N400 time window. Converging with previous findings on other Chinese structures, the current study provides further evidence that syntactic processing in Chinese does not occur earlier than semantic processing.
Thirteenth NASTRAN (R) Users' Colloquium
NASA Technical Reports Server (NTRS)
1985-01-01
The application of finite element methods in engineering is discussed and the use of NASTRAN is compared with other approaches. Specific applications, pre- and post-processing or auxiliary programs, and additional methods of analysis with NASTRAN are covered.
Keil, Julian; Balz, Johanna; Gallinat, Jürgen; Senkowski, Daniel
2016-01-01
Our brain generates predictions about forthcoming stimuli and compares predicted with incoming input. Failures in predicting events might contribute to hallucinations and delusions in schizophrenia (SZ). When a stimulus violates prediction, neural activity that reflects prediction error (PE) processing is found. While PE processing deficits have been reported in unisensory paradigms, it is unknown whether SZ patients (SZP) show altered crossmodal PE processing. We measured high-density electroencephalography and applied source estimation approaches to investigate crossmodal PE processing generated by audiovisual speech. In SZP and healthy control participants (HC), we used an established paradigm in which high- and low-predictive visual syllables were paired with congruent or incongruent auditory syllables. We examined crossmodal PE processing in SZP and HC by comparing differences in event-related potentials and neural oscillations between incongruent and congruent high- and low-predictive audiovisual syllables. In both groups event-related potentials between 206 and 250 ms were larger in high- compared with low-predictive syllables, suggesting intact audiovisual incongruence detection in the auditory cortex of SZP. The analysis of oscillatory responses revealed theta-band (4–7 Hz) power enhancement in high- compared with low-predictive syllables between 230 and 370 ms in the frontal cortex of HC but not SZP. Thus aberrant frontal theta-band oscillations reflect crossmodal PE processing deficits in SZ. The present study suggests a top-down multisensory processing deficit and highlights the role of dysfunctional frontal oscillations for the SZ psychopathology. PMID:27358314
Bayesian Computation for Log-Gaussian Cox Processes: A Comparative Analysis of Methods
Teng, Ming; Nathoo, Farouk S.; Johnson, Timothy D.
2017-01-01
The log-Gaussian Cox process is a commonly used model for the analysis of spatial point pattern data. Fitting this model is difficult because of its doubly stochastic property, i.e., it is a hierarchical combination of a Poisson process at the first level and a Gaussian process at the second level. Various methods have been proposed to estimate such a process, including traditional likelihood-based approaches as well as Bayesian methods. We focus here on Bayesian methods and several approaches that have been considered for model fitting within this framework, including Hamiltonian Monte Carlo, the integrated nested Laplace approximation (INLA), and variational Bayes. We consider these approaches and make comparisons with respect to statistical and computational efficiency. These comparisons are made through several simulation studies as well as through two applications, the first examining ecological data and the second involving neuroimaging data. PMID:29200537
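The doubly stochastic construction is easy to state in code: a Gaussian process realization supplies the log-intensity, and counts are Poisson given that surface. The grid size and covariance parameters below are illustrative.

```python
# The doubly stochastic construction on a 1-D grid: a Gaussian process supplies
# the log-intensity, and counts are Poisson given that surface. Grid size and
# covariance parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
m = 100
x = np.linspace(0, 1, m)
# Squared-exponential covariance for the latent Gaussian process
K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.1) ** 2) + 1e-8 * np.eye(m)
z = rng.multivariate_normal(np.zeros(m), K)   # latent log-intensity field
lam = np.exp(z + 2.0)                         # intensity (mean offset assumed)
counts = rng.poisson(lam / m)                 # Poisson counts per grid cell
print(counts.sum(), "points simulated")
```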
Gurieff, Nicholas; Lant, Paul
2007-12-01
A life cycle assessment and financial analysis of mixed culture PHA (PHA(MC)) and biogas production were undertaken based on treating an industrial wastewater. Internal rate of return (IRR) and non-renewable CO(2)eq emissions were used to quantify financial viability and environmental impact. PHA(MC) was preferable to biogas production for treating the specified industrial effluent. PHA(MC) was also financially attractive in comparison to pure culture PHA production. Both PHA production processes had similar environmental impacts that were significantly lower than those of HDPE production. A large potential for optimisation exists for the PHA(MC) process, as the financial and environmental costs were primarily due to energy use for downstream processing. Under the conditions used in this work, PHA(MC) was shown to be a viable biopolymer production process and an effective industrial wastewater treatment technology. This is the first study of its kind and provides valuable insight into the PHA(MC) process.
Prediction of Cutting Force in Turning Process-an Experimental Approach
NASA Astrophysics Data System (ADS)
Thangarasu, S. K.; Shankar, S.; Thomas, A. Tony; Sridhar, G.
2018-02-01
This paper deals with the prediction of cutting forces in a turning process. The turning process with an advanced cutting tool has several advantages over grinding, such as short cycle time, process flexibility, comparable surface roughness, high material removal rate and fewer environmental problems, without the use of cutting fluid. In this work, a full-bridge dynamometer was used to measure the cutting forces on a mild steel workpiece with a cemented carbide insert tool for different combinations of cutting speed, feed rate and depth of cut. The experiments were planned based on a Taguchi design, and the measured cutting forces were compared with the predicted forces in order to validate the feasibility of the proposed design. The percentage contribution of each process parameter was analyzed using Analysis of Variance (ANOVA). The experimental results from both the lathe tool dynamometer and the designed full-bridge dynamometer were analyzed using the Taguchi design of experiments and Analysis of Variance.
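The ANOVA step can be sketched as follows on a made-up Taguchi L9 result table, with the percentage contribution of each factor taken from its share of the total sum of squares; the force values are invented, not the paper's measurements.

```python
# ANOVA on a made-up Taguchi L9 table: percentage contribution of speed, feed
# and depth of cut to the cutting force. Force values are invented.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.DataFrame({
    "speed": [1, 1, 1, 2, 2, 2, 3, 3, 3],
    "feed":  [1, 2, 3, 1, 2, 3, 1, 2, 3],
    "doc":   [1, 2, 3, 2, 3, 1, 3, 1, 2],       # L9 orthogonal array
    "force": [210, 265, 330, 245, 315, 290, 280, 300, 360],  # invented (N)
})
model = ols("force ~ C(speed) + C(feed) + C(doc)", data=df).fit()
table = sm.stats.anova_lm(model, typ=2)
table["contribution_%"] = 100 * table["sum_sq"] / table["sum_sq"].sum()
print(table)
```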
Autologous Fat Grafting to the Breast Using REVOLVE System to Reduce Clinical Costs.
Brzezienski, Mark A; Jarrell, John A
2016-09-01
With the increasing popularity of fat grafting over the past decade, the techniques for harvest, processing and preparation, and transfer of the fat cells have evolved to improve efficiency and consistency. The REVOLVE System is a fat processing device used in autologous fat grafting which eliminates much of the specialized equipment as well as the labor-intensive and time-consuming effort of the original Coleman technique of fat processing. This retrospective study evaluates the economics of fat grafting, comparing traditional Coleman processing to the REVOLVE System. From June 2013 through December 2013, 88 fat grafting cases by a single surgeon were reviewed. Timed procedures using either the REVOLVE System or the Coleman technique were extracted from the group. Data including fat grafting procedure time, harvested volume, harvest and recipient sites, and concurrent procedures were gathered. Cost and utilization assessments were performed comparing the economics of the two groups using standard values of operating room costs provided by the study hospital. Thirty-seven patients with timed procedures were identified, of whom 13 were Coleman technique patients and 24 were REVOLVE System patients. The average rate of fat transfer was 1.77 mL/minute for the Coleman technique and 4.69 mL/minute for the REVOLVE System, a statistically significant difference (P < 0.0001) between the 2 groups. Cost analysis comparing the REVOLVE System and Coleman techniques demonstrates a dramatic divergence in the price per mL of transferred fat at 75 mL when using the previously calculated rates for each group. This single surgeon's experience with the REVOLVE System for fat processing establishes economic support for its use in specific high-volume fat grafting cases. Cost analysis comparing the REVOLVE System and Coleman techniques suggests that in cases of planned fat transfer of 75 mL or more, using the REVOLVE System for fat processing is more economically beneficial. This study may serve as a guide for plastic surgeons in deciding which cases might be appropriate for the use of the REVOLVE System and is the first report comparing the economics of fat grafting with the traditional Coleman technique and the REVOLVE System.
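A back-of-envelope reconstruction of this break-even logic: procedure time is volume divided by transfer rate, theatre time is costed per minute, and the device adds a fixed cost per case. The per-minute OR cost and device price below are placeholders chosen so the crossover lands near the reported 75 mL; the study's actual cost inputs are not given in the abstract.

```python
# Back-of-envelope break-even: time = volume / rate, theatre time costed per
# minute, device adds a fixed cost per case. The per-minute OR cost and device
# price are placeholders chosen so the crossover lands near 75 mL.
def case_cost(volume_ml, rate_ml_min, or_cost_per_min, device_cost=0.0):
    return volume_ml / rate_ml_min * or_cost_per_min + device_cost

for v in (25, 50, 75, 100, 150):
    coleman = case_cost(v, 1.77, or_cost_per_min=40.0)
    revolve = case_cost(v, 4.69, or_cost_per_min=40.0, device_cost=1050.0)
    print(f"{v:>3} mL  Coleman ${coleman:7.0f}  REVOLVE ${revolve:7.0f}")
```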
NASA Astrophysics Data System (ADS)
Konkol, Jakub; Bałachowski, Lech
2017-03-01
In this paper, the whole process of pile construction and performance during loading is modelled via large-deformation finite element methods, namely the Coupled Eulerian-Lagrangian (CEL) and Updated Lagrangian (UL) formulations. The numerical study consists of the installation process, the consolidation phase and the subsequent pile static load test (SLT). The Poznań site is chosen as the reference location for the numerical analysis; there, a series of pile SLTs has been performed in highly overconsolidated clay (OCR ≈ 12). The results of the numerical analysis are compared with the corresponding field tests and with a so-called "wish-in-place" numerical model of the pile, in which no installation effects are taken into account. The advantages of using large-deformation numerical analysis are presented and its application to pile design is shown.
Steam gasification of tyre waste, poplar, and refuse-derived fuel: a comparative analysis.
Galvagno, S; Casciaro, G; Casu, S; Martino, M; Mingazzini, C; Russo, A; Portofino, S
2009-02-01
In the field of waste management, thermal disposal is a treatment option able to recover resources from "end of life" products. Pyrolysis and gasification are emerging thermal treatments that work under less drastic conditions than classic direct combustion, providing reduced gaseous emissions of heavy metals. Moreover, they allow better recovery efficiency, since the process by-products can be used as fuels (gas, oils) for both conventional (classic engines and heaters) and high-efficiency apparatus (gas turbines and fuel cells), or alternatively as chemical sources or as raw materials for other processes. This paper presents a comparative study of a steam gasification process applied to three different waste types (refuse-derived fuel, poplar wood and scrap tyres), with the aim of comparing the corresponding yields and product compositions and exploring the most valuable uses of the by-products.
Acoustic emission analysis for the detection of appropriate cutting operations in honing processes
NASA Astrophysics Data System (ADS)
Buj-Corral, Irene; Álvarez-Flórez, Jesús; Domínguez-Fernández, Alejandro
2018-01-01
In the present paper, acoustic emission was studied in honing experiments with different abrasive densities: 15, 30, 45 and 60. In addition, 2D and 3D roughness, material removal rate and tool wear were determined. In order to treat the sound signal emitted during the machining process, two methods of analysis were compared: the Fast Fourier Transform (FFT) and the Hilbert-Huang Transform (HHT). When density 15 is used, the number of cutting grains is insufficient to provide correct cutting, while clogging appears with densities 45 and 60. These results were confirmed by treatment of the sound signal. In addition, a new parameter S was defined as the ratio between the energy at low and high frequencies contained within the emitted sound. The selected density of 30 corresponds to S values between 0.1 and 1. Correct cutting in honing processes depends on the density of the abrasive employed, and the density value to be used can be selected by means of measurement and analysis of acoustic emissions during the honing operation. Thus, honing processes can be monitored without needing to stop the process.
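The band-energy ratio idea behind a parameter like S can be sketched in a few lines: Fourier-transform the emitted sound and divide the energy below a split frequency by the energy above it. The split frequency and test signal below are invented.

```python
# Band-energy ratio sketch: energy below a split frequency divided by energy
# above it, from the FFT of the emitted sound. Split frequency and test
# signal are invented.
import numpy as np

def band_energy_ratio(signal, fs, f_split=5000.0):
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    return spec[freqs < f_split].sum() / spec[freqs >= f_split].sum()

fs = 44_100
t = np.arange(0, 1.0, 1 / fs)
sound = np.sin(2 * np.pi * 2000 * t) + 0.5 * np.sin(2 * np.pi * 9000 * t)
print(band_energy_ratio(sound, fs))   # ~4: low-band energy dominates
```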
Thermodynamic performance of multi-stage gradational lead screw vacuum pump
NASA Astrophysics Data System (ADS)
Zhao, Fan; Zhang, Shiwei; Sun, Kun; Zhang, Zhijun
2018-02-01
As a kind of dry mechanical vacuum pump, the twin-screw vacuum pump has outstanding pumping performance and is widely used in the semiconductor industry. Compared with the constant lead screw (CLS) vacuum pump, the gradational lead screw (GLS) vacuum pump has been more widely applied in recent years. Nevertheless, few comparative studies on the thermodynamic performance of GLS vacuum pumps can be found in the literature. Our study focuses on one type of GLS vacuum pump, the multi-stage gradational lead screw (MGLS) vacuum pump; it gives a detailed description of its construction and illustrates it with drawings. Based on the structural analysis, the thermodynamic procedure is divided into four distinct processes: the sucking process, the transferring (compressing) process, the backlashing process and the exhausting process. The internal mechanism of each process is qualitatively illustrated, and mathematical expressions for seven thermodynamic parameters are given for the ideal situation. The performance curves of the MGLS vacuum pump are plotted with MATLAB and compared with those of the CLS vacuum pump for the same case. The results explain why the MGLS vacuum pump has more favorable pumping performance than the CLS vacuum pump in saving energy, reducing noise and dissipating heat.
NASA Technical Reports Server (NTRS)
Arevalo, Ricardo, Jr.; Coyle, Barry; Paulios, Demetrios; Stysley, Paul; Feng, Steve; Getty, Stephanie; Binkerhoff, William
2015-01-01
Compared to wet chemistry and pyrolysis techniques, in situ laser-based methods of chemical analysis provide an ideal way to characterize precious planetary materials without requiring extensive sample processing. In particular, laser desorption and ablation techniques allow for rapid, reproducible and robust data acquisition over a wide mass range, plus: quantitative, spatially resolved measurements of elemental and molecular (organic and inorganic) abundances; low analytical blanks and limits of detection (ng g-1 level); and the destruction of minimal quantities of sample (μg level) compared to traditional solution and/or pyrolysis analyses (mg).
Categorical data processing for real estate objects valuation using statistical analysis
NASA Astrophysics Data System (ADS)
Parygin, D. S.; Malikov, V. P.; Golubev, A. V.; Sadovnikova, N. P.; Petrova, T. M.; Finogeev, A. G.
2018-05-01
Theoretical and practical approaches to the use of statistical methods for studying various properties of infrastructure objects are analyzed in the paper. Methods of forecasting the value of objects are considered. A method for coding categorical variables describing properties of real estate objects is proposed. The analysis of the results of modeling the price of real estate objects using regression analysis and an algorithm based on a comparative approach is carried out.
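A minimal sketch of the proposed direction, encoding categorical property attributes and feeding them with numeric features into a price model, might look like this; the features, values and use of one-hot coding are illustrative assumptions, not the authors' exact coding method.

```python
# One-hot coding of categorical property attributes plus a numeric feature,
# feeding a linear price model. Features and values are invented; this is one
# plausible reading of the coding step, not the authors' exact method.
import pandas as pd
from sklearn.linear_model import LinearRegression

df = pd.DataFrame({
    "district":  ["center", "suburb", "center", "outskirts", "suburb"],
    "condition": ["good", "fair", "excellent", "fair", "good"],
    "area_m2":   [54, 68, 47, 75, 60],
    "price":     [120_000, 95_000, 130_000, 80_000, 101_000],
})
X = pd.get_dummies(df[["district", "condition"]]).join(df["area_m2"])
model = LinearRegression().fit(X, df["price"])
print(dict(zip(X.columns, model.coef_.round(0))))
```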
NASA Astrophysics Data System (ADS)
Zhao, Qian; Sun, Yeqing; Wang, Wei
2016-07-01
Highly ionizing radiation (HZE) in space is considered a main factor causing biological effects on plant seeds. To investigate the different effects of low-dose and high-dose ion radiation on genome-wide gene expression, we carried out ground-based carbon-particle HZE experiments with different cumulative doses (0 Gy, 0.2 Gy, 2 Gy) on rice seeds and then performed comparative transcriptome analysis of the rice seedlings. We identified a total of 2551 and 1464 differentially expressed genes (DEGs) in the low-dose and high-dose radiation groups, respectively. Gene Ontology analyses indicated that low-dose and high-dose ion radiation both led to changes in multiple physiological and biochemical activities in rice. In the biological process category, only one process, the oxidation-reduction process, was enriched after high-dose ion radiation, while more processes, such as response to biotic stimulus, heme binding, tetrapyrrole binding, catalytic activity and oxidoreductase activity, were significantly enriched after low-dose ion radiation. These results indicate that the rice plants focused only on the oxidation-reduction process in response to high-dose ion radiation, whereas the response to low-dose ion radiation involved the coordination of multiple biological processes. To elucidate the transcriptional regulation of radiation stress-responsive genes, we identified several DEGs encoding transcription factors (TFs). The AP2/EREBP, bHLH, C2H2, MYB and WRKY TF families were altered significantly in response to ion radiation. MapMan analysis suggested that the biological effects of radiation stress on rice seedlings might share similar mechanisms with biotic stress. Our findings highlight important alterations in the expression of radiation-response genes, metabolic pathways, and TF-encoding genes in rice seedlings exposed to low-dose and high-dose ion radiation.
NASA Astrophysics Data System (ADS)
Pourattar, Parisa
The cementation process for making Egyptian faience, reported by Hans Wulff from a workshop in Qom, Iran, has not been easy to replicate, and various views have been set forth to explain the transport of materials from the glazing powder to the surfaces of the crushed quartz beads. Replications of the process fired to 950 °C and under-fired to 850 °C were characterized by electron probe microanalysis (EPMA), petrographic thin-section analysis, and scanning electron microscopy with energy-dispersive x-ray analysis (SEM-EDS). Chemical variations were modeled using thermal data, phase diagrams, and copper vaporization experiments. These replications were compared to 52 examples from various collections, including 20th-century ethnographic collections of beads, glazing powder and plant ash, 12th-century CE beads and glazing powder from Fustat (Old Cairo), Egypt, an earlier New Kingdom example from Abydos, Egypt, and an ash example from the Smithsonian Institution National Museum of Natural History.
Prospects for discovery by epigenome comparison
Milosavljevic, Aleksandar
2010-01-01
Epigenomic analysis efforts have so far focused on the multiple layers of epigenomic information within individual cell types. With the rapidly increasing diversity of epigenomically mapped cell types, unprecedented opportunities for comparative analysis of epigenomes are opening up. One such opportunity is to map the bifurcating tree of cellular differentiation. Another is to understand the epigenomically mediated effects of mutations, environmental influences, and disease processes. Comparative analysis of epigenomes therefore has the potential to provide wide-ranging fresh insights into basic biology and human disease. The realization of this potential will critically depend on the availability of a cyberinfrastructure that can scale with the volume of data and the diversity of applications, and on meeting a number of other computational challenges. PMID:20944597
Phase 2 of the Array Automated Assembly Task for the Low Cost Solar Array Project
NASA Technical Reports Server (NTRS)
Campbell, R. B.; Rai-Choundhury, P.; Seman, E. J.; Rohatgi, A.; Davis, J. R.; Ostroski, J. W.; Stapleton, R. E.
1979-01-01
Two process specifications supplied by contractors were tested. The aluminum silk screening process resulted in cells comparable to those from sputtered Al. The electroless plating of contacts specification could be used only with extensive modification. Several experiments suggest that there is some degradation of the front junction during the Al back surface field (BSF) fabrication. A revised process sequence was defined which incorporates Al BSF formation. A cost analysis of this process yielded a selling price of $0.75/watt peak in 1980.
NASA Technical Reports Server (NTRS)
Wolf, M.
1981-01-01
The effect of solar cell metallization pattern design on solar cell performance and the costs and performance effects of different metallization processes are discussed. Definitive design rules for the front metallization pattern for large area solar cells are presented. Chemical and physical deposition processes for metallization are described and compared. An economic evaluation of the 6 principal metallization options is presented. Instructions for preparing Format A cost data for solar cell manufacturing processes from UPPC forms for input into the SAMIC computer program are presented.
System for monitoring an industrial or biological process
Gross, Kenneth C.; Wegerich, Stephan W.; Vilim, Rick B.; White, Andrew M.
1998-01-01
A method and apparatus for monitoring and responding to conditions of an industrial process. Industrial process signals, such as repetitive manufacturing, testing and operational machine signals, are generated by a system. Sensor signals characteristic of the process are generated over a time length and compared to reference signals over the same time length. The industrial signals are adjusted over the time length relative to the reference signals, the phase shift of the industrial signals relative to the reference signals is optimized, and the resulting signals are output for analysis by methods such as the sequential probability ratio test (SPRT).
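The SPRT named above is a standard sequential test; a generic Wald-style sketch for deciding between a "nominal" and a "degraded" residual mean is given below (means, variance and error rates are assumed, and this is not the patented implementation).

```python
# Generic Wald SPRT sketch: decide between a "nominal" and a "degraded" mean
# of a Gaussian sensor residual as samples arrive. Means, variance and error
# rates are assumed; this is not the patented implementation.
import numpy as np

def sprt(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
    upper = np.log((1 - beta) / alpha)     # accept H1 ("degraded")
    lower = np.log(beta / (1 - alpha))     # accept H0 ("nominal")
    llr = 0.0
    for i, x in enumerate(samples, 1):
        llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma ** 2  # Gaussian LLR
        if llr >= upper:
            return "degraded", i
        if llr <= lower:
            return "nominal", i
    return "undecided", len(samples)

rng = np.random.default_rng(0)
print(sprt(rng.normal(0.0, 1.0, 200)))   # expected: nominal
print(sprt(rng.normal(1.0, 1.0, 200)))   # expected: degraded
```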
Baumgart, André; Denz, Christof; Bender, Hans-Joachim; Schleppers, Alexander
2009-01-01
The complexity of the operating room (OR) requires that both structural (eg, department layout) and behavioral (eg, staff interactions) patterns of work be considered when developing quality improvement strategies. In our study, we investigated how these contextual factors influence outpatient OR processes and the quality of care delivered. The study setting was a German university-affiliated hospital performing approximately 6000 outpatient surgeries annually. During the 3-year-study period, the hospital significantly changed its outpatient OR facility layout from a decentralized (ie, ORs in adjacent areas of the building) to a centralized (ie, ORs in immediate vicinity of each other) design. To study the impact of the facility change on OR processes, we used a mixed methods approach, including process analysis, process modeling, and social network analysis of staff interactions. The change in facility layout was seen to influence OR processes in ways that could substantially affect patient outcomes. For example, we found a potential for more errors during handovers in the new centralized design due to greater interdependency between tasks and staff. Utilization of the mixed methods approach in our analysis, as compared with that of a single assessment method, enabled a deeper understanding of the OR work context and its influence on outpatient OR processes.
USDA-ARS's Scientific Manuscript database
Cold-induced sweetening in potato tubers is a costly problem for the food industry. To systematically identify the proteins associated with this process, we employed a comparative proteomics approach using isobaric, stable isotope coded labels to compare the proteomes of potato tubers after 0 and 5 mont...
Exploring Access and Equity in Higher Education: Policy and Performance in a Comparative Perspective
ERIC Educational Resources Information Center
Clancy, Patrick; Goastellec, Gaele
2007-01-01
A comparative analysis of how access and equity are defined and how policies have evolved reveals a number of commonalities and differences between countries. The overall trend is a movement from the priority given to "inherited merit" in the admission process through a commitment to formal equality, towards the application of some modes of…
ERIC Educational Resources Information Center
Oh, Hunseok; Choi, Yeseul; Choi, Myungweon
2013-01-01
The purpose of this study was to assess, evaluate, and compare the competitive advantages of the human resource development systems of advanced countries. The Global Human Resource Development Index was utilized for this study, since it has been validated through an expert panel's content review and analytic hierarchy process. Using a sample of 34…
Ren, Jian-xun; Liu, Jian-xun; Lin, Cheng-ren
2010-04-01
To comparatively analyse the objective characteristics of different syndrome types of qi-disturbance-induced blood stasis syndrome (QDBS) in the pathogenetic evolution of unstable angina coronary heart disease (UA-CHD). Seventy-eight patients with UA-CHD of QDBS were differentiated into 2 groups: 55 in the qi-deficiency-induced blood-stasis syndrome group (A) and 23 in the qi-stagnation-induced blood-stasis syndrome group (B). The comparative analysis was carried out by comparing their blood pressure, glucose and lipid metabolism, coagulation function, thyroid function and inflammatory reaction changes, etc. In the pathogenetic process of qi-disturbance-induced blood stasis, the age of onset and the levels of HbA1c, TSH, PT and APTT differed significantly between the two groups (P < 0.05). Levels of TNF-alpha and LN were higher, and levels of sIgA lower, in patients than in healthy subjects (P < 0.05). The inflammatory immune reaction may play an important role in the pathogenetic process of blood-stasis syndrome, and the functional disturbance of hypothalamic, pituitary and endocrine secretion induced by emotional stress is possibly the essence of qi-stagnation-induced blood stasis syndrome.
Consolidation of lunar regolith: Microwave versus direct solar heating
NASA Technical Reports Server (NTRS)
Kunitzer, J.; Strenski, D. G.; Yankee, S. J.; Pletka, B. J.
1991-01-01
The production of construction materials on the lunar surface will require an appropriate fabrication technique. Two processing methods considered suitable for producing dense, consolidated products such as bricks are direct solar heating and microwave heating. An analysis was performed to compare the two processes in terms of the power and time required to fabricate bricks of various sizes. The regolith was taken to be a mare basalt with an overall density of 60 pct. of theoretical. Densification was assumed to take place by vitrification, since this process requires moderate amounts of energy and time while still producing dense products. Microwave heating was shown to be significantly faster than solar furnace heating for rapid production of realistic-size bricks.
Study on electrochemically deposited Mg metal
NASA Astrophysics Data System (ADS)
Matsui, Masaki
An electrodeposition process of magnesium metal from a Grignard reagent based electrolyte was studied in comparison with lithium. The electrodeposition of magnesium was performed at various current densities. The obtained magnesium deposits did not show dendritic morphologies, while all the lithium deposits showed dendritic products. Two different crystal growth modes in the electrodeposition process of magnesium metal were confirmed by observation using a scanning electron microscope (SEM) and a crystallographic analysis using X-ray diffraction (XRD). An electrochemical study of the deposition/dissolution process of the magnesium showed a remarkable dependency of the overpotential of magnesium deposition on the electrolyte concentration compared with lithium. This result suggests that the dependency of the overpotential on the electrolyte concentration prevents locally concentrated current, resulting in very uniform deposits.
Power processing methodology. [computerized design of spacecraft electric power systems
NASA Technical Reports Server (NTRS)
Fegley, K. A.; Hansen, I. G.; Hayden, J. H.
1974-01-01
Discussion of the interim results of a program to investigate the feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems. The object of the total program is to develop a flexible engineering tool which will allow the power processor designer to effectively and rapidly assess and analyze the tradeoffs available by providing, in one comprehensive program, a mathematical model, an analysis of expected performance, simulation, and a comparative evaluation with alternative designs. This requires an understanding of electrical power source characteristics and the effects of load control, protection, and total system interaction.
[Bayesian statistics in medicine -- part II: main applications and inference].
Montomoli, C; Nichelatti, M
2008-01-01
Bayesian statistics is not only used when one is dealing with 2-way tables; it can also be used for inferential purposes. Using the basic concepts presented in the first part, this paper aims to give a simple overview of Bayesian methods by introducing their foundation (Bayes' theorem) and then applying this rule to a very simple practical example; whenever possible, the elementary processes at the basis of the analysis are compared to those of frequentist (classical) statistical analysis. Bayesian reasoning is naturally connected to medical activity, since it appears to be quite similar to a diagnostic process.
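As a concrete illustration of the diagnostic use of Bayes' theorem described above, the following minimal Python sketch computes the post-test probability of disease from prevalence, sensitivity, and specificity; all numerical values are hypothetical and chosen only to show the updating rule.

    # Minimal sketch of Bayes' theorem for a diagnostic test.
    # Prevalence, sensitivity, and specificity are hypothetical values.

    def posterior_probability(prevalence, sensitivity, specificity):
        """P(disease | positive test) via Bayes' theorem."""
        p_pos_given_disease = sensitivity
        p_pos_given_healthy = 1.0 - specificity
        p_pos = (p_pos_given_disease * prevalence
                 + p_pos_given_healthy * (1.0 - prevalence))
        return p_pos_given_disease * prevalence / p_pos

    # Example: a rare condition (1% prevalence) and a fairly good test.
    print(posterior_probability(prevalence=0.01, sensitivity=0.95, specificity=0.90))
    # ~0.088: a positive result still leaves under 10% posterior probability.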
Multicriteria decision analysis: Overview and implications for environmental decision making
Hermans, Caroline M.; Erickson, Jon D.; Messner, Frank; Ring, Irene
2007-01-01
Environmental decision making involving multiple stakeholders can benefit from the use of a formal process to structure stakeholder interactions, leading to more successful outcomes than traditional discursive decision processes. There are many tools available to handle complex decision making. Here we illustrate the use of a multicriteria decision analysis (MCDA) outranking tool (PROMETHEE) to facilitate decision making at the watershed scale, involving multiple stakeholders, multiple criteria, and multiple objectives. We compare various MCDA methods and their theoretical underpinnings, examining methods that most realistically model complex decision problems in ways that are understandable and transparent to stakeholders.
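To make the outranking idea concrete, the following minimal Python sketch computes PROMETHEE-style leaving, entering, and net flows for hypothetical alternatives and weights (not the watershed data of the study), assuming the simplest "usual" preference function (1 if better, 0 otherwise).

    import numpy as np

    scores = np.array([  # rows: alternatives, cols: criteria (higher is better)
        [7.0, 3.0, 5.0],
        [5.0, 6.0, 4.0],
        [6.0, 5.0, 6.0],
    ])
    weights = np.array([0.5, 0.3, 0.2])  # hypothetical criterion weights

    n = len(scores)
    pi = np.zeros((n, n))  # aggregated preference of alternative a over b
    for a in range(n):
        for b in range(n):
            if a != b:
                pi[a, b] = np.sum(weights * (scores[a] > scores[b]))

    phi_plus = pi.sum(axis=1) / (n - 1)   # leaving flow
    phi_minus = pi.sum(axis=0) / (n - 1)  # entering flow
    print(phi_plus - phi_minus)           # net flows: highest is preferred (PROMETHEE II)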
Generic Modeling of a Life Support System for Process Technology Comparison
NASA Technical Reports Server (NTRS)
Ferrall, J. F.; Seshan, P. K.; Rohatgi, N. K.; Ganapathi, G. B.
1993-01-01
This paper describes a simulation model called the Life Support Systems Analysis Simulation Tool (LiSSA-ST), the spreadsheet program called the Life Support Systems Analysis Trade Tool (LiSSA-TT), and the Generic Modular Flow Schematic (GMFS) modeling technique. Results of using the LiSSA-ST and the LiSSA-TT will be presented for comparing life support system and process technology options for a Lunar Base with a crew size of 4 and mission lengths of 90 and 600 days. System configurations to minimize the life support system weight and power are explored.
NASA Technical Reports Server (NTRS)
Tornatore, Vincenza
2013-01-01
The main activities carried out at the PMD (Politecnico di Milano DIIAR) IVS Analysis Center during 2012 are briefly highlighted, and future plans for 2013 are sketched out. We principally continued to process European VLBI sessions, using different approaches to evaluate possible differences due to various processing choices. VLBI solutions were also compared with GPS solutions, as well as with solutions calculated at co-located sites. Concerning the observational aspect, several tests were performed to identify the most suitable method to achieve the highest possible accuracy in the determination of GNSS (Global Navigation Satellite System) satellite positions using the VLBI technique.
Ben-David, Boaz M; Nguyen, Linh L T; van Lieshout, Pascal H H M
2011-03-01
The color word Stroop test is the most common tool used to assess selective attention in persons with traumatic brain injury (TBI). A larger Stroop effect for TBI patients, as compared to controls, is generally interpreted as reflecting a decrease in selective attention. Alternatively, it has been suggested that this increase in Stroop effects is influenced by group differences in generalized speed of processing (SOP). The current study describes an overview and meta-analysis of 10 studies, where persons with TBI (N = 324) were compared to matched controls (N = 501) on the Stroop task. The findings confirmed that Stroop interference was significantly larger for TBI groups (p = .008). However, these differences may be strongly biased by TBI-related slowdown in generalized SOP (r² = .81 in a Brinley analysis). We also found that TBI-related changes in sensory processing may affect group differences. Mainly, a TBI-related increase in the latency difference between reading and naming the font color of a color-neutral word (r² = .96) was linked to Stroop effects. Our results suggest that, in using Stroop, it seems prudent to control for both sensory factors and SOP to differentiate potential changes in selective attention from other changes following TBI.
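The Brinley analysis mentioned in the abstract is essentially a regression of the patient group's mean latencies on the controls' mean latencies across conditions; a slope above 1 with a high r² is consistent with generalized slowing. A minimal Python sketch with hypothetical latencies:

    import numpy as np

    control_rt = np.array([520.0, 610.0, 700.0, 830.0])   # ms, per condition
    tbi_rt = np.array([640.0, 780.0, 900.0, 1090.0])      # hypothetical values

    slope, intercept = np.polyfit(control_rt, tbi_rt, 1)
    r2 = np.corrcoef(control_rt, tbi_rt)[0, 1] ** 2
    print(f"slope={slope:.2f}, intercept={intercept:.1f}, r2={r2:.2f}")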
Analysis of 3D printing parameters of gears for hybrid manufacturing
NASA Astrophysics Data System (ADS)
Budzik, Grzegorz; Przeszlowski, Łukasz; Wieczorowski, Michal; Rzucidlo, Arkadiusz; Gapinski, Bartosz; Krolczyk, Grzegorz
2018-05-01
The paper deals with the analysis and selection of parameters for the rapid prototyping of gears by selective sintering of metal powders. The results show a wide spectrum of applications of RP systems in the manufacturing processes of machine elements, based on an analysis of the market in terms of the application of additive manufacturing technology in different sectors of industry; considerable growth of these methods can be observed over the past years. The characteristic errors of the printed model with respect to the ideal one are pointed out for each technique. Special attention is paid to the method of preparation of the numerical data (CAD/STL/RP). Moreover, an analysis of the manufacturing processes of gear-type elements is presented. The tested gears were modeled with different allowances for final machining and made by DMLS. Metallographic analysis and strength tests on prepared specimens were performed and used to compare the real properties of the material with the nominal ones. To improve the surface quality after sintering, the gears were subjected to final machining, and the geometry of the gears after the hybrid manufacturing method was analyzed. The manufacturing process was defined in a traditional way as well as with the aid of modern manufacturing techniques. The methodology and results obtained can be applied to machine elements other than gears, and they contribute to a general theory of production processes for rapid prototyping methods as well as to their design and implementation.
An Improved Spectral Analysis Method for Fatigue Damage Assessment of Details in Liquid Cargo Tanks
NASA Astrophysics Data System (ADS)
Zhao, Peng-yuan; Huang, Xiao-ping
2018-03-01
The traditional spectral analysis method, which is based on a linear system assumption, introduces errors in calculating the fatigue damage of details in liquid cargo tanks, because the relationship between the dynamic stress and the ship acceleration is nonlinear. An improved spectral analysis method for the assessment of fatigue damage in a detail of a liquid cargo tank is proposed in this paper. Based on the assumptions that the wave process can be simulated by summing sinusoidal waves at different frequencies and that the stress process can be simulated by summing the stress processes induced by these sinusoidal waves, the stress power spectral density (PSD) is calculated by expanding the stress processes induced by the sinusoidal waves into Fourier series and adding the amplitudes of the harmonic components with the same frequency. This analysis method takes the nonlinear relationship into consideration, and the fatigue damage is then calculated from the PSD of stress. Taking an independent tank in an LNG carrier as an example, the improved spectral analysis method is shown to be much more accurate than the traditional one by comparing the calculated damage results with those of the time-domain method. The proposed spectral analysis method is thus more accurate in calculating the fatigue damage of details in ship liquid cargo tanks.
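A minimal Python sketch of the summation idea in this abstract: each sinusoidal wave component induces a (here deliberately nonlinear, hypothetical) stress response, the responses are expanded via the FFT, and harmonic amplitudes sharing a frequency are added before forming the stress PSD. This illustrates the construction only, not the paper's full method.

    import numpy as np

    fs, T = 8.0, 512.0                       # sampling rate (Hz), duration (s)
    t = np.arange(0, T, 1.0 / fs)
    freqs = np.fft.rfftfreq(t.size, 1.0 / fs)
    df = freqs[1] - freqs[0]

    wave_components = [(1.0, 0.08), (0.6, 0.12), (0.4, 0.20)]  # (amplitude m, Hz)
    total_amp = np.zeros_like(freqs)

    for amp, f in wave_components:
        eta = amp * np.sin(2 * np.pi * f * t)       # one sinusoidal wave
        stress = 50.0 * eta + 8.0 * eta ** 2        # hypothetical nonlinear response (MPa)
        harmonics = 2.0 * np.abs(np.fft.rfft(stress)) / t.size
        total_amp += harmonics                      # add same-frequency amplitudes

    psd = total_amp ** 2 / (2.0 * df)               # one-sided stress PSD (MPa^2/Hz)
    print(freqs[np.argmax(psd[1:]) + 1])            # dominant stress frequency (skip DC)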
Pavlov, A N; Pavlova, O N; Abdurashitov, A S; Sindeeva, O A; Semyachkina-Glushkovskaya, O V; Kurths, J
2018-01-01
The scaling properties of complex processes may be highly influenced by the presence of various artifacts in experimental recordings. Their removal produces changes in the singularity spectra and the Hölder exponents as compared with the original artifacts-free data, and these changes are significantly different for positively correlated and anti-correlated signals. While signals with power-law correlations are nearly insensitive to the loss of significant parts of data, the removal of fragments of anti-correlated signals is more crucial for further data analysis. In this work, we study the ability of characterizing scaling features of chaotic and stochastic processes with distinct correlation properties using a wavelet-based multifractal analysis, and discuss differences between the effect of missed data for synchronous and asynchronous oscillatory regimes. We show that even an extreme data loss allows characterizing physiological processes such as the cerebral blood flow dynamics.
Cinque, Kathy; Jayasuriya, Niranjali
2010-12-01
To ensure the protection of drinking water, an understanding of the catchment processes which can affect water quality is important, as it enables targeted catchment management actions to be implemented. In this study, factor analysis (FA) and the comparison of event mean concentrations (EMCs) with baseline values were used to assess the relationships between water quality parameters and to link those parameters to processes within an agricultural drinking water catchment. FA found that 55% of the variance in the water quality data could be explained by the first factor, which was dominated by parameters usually associated with erosion. Inclusion of pathogenic indicators in an additional FA showed that Enterococcus and Clostridium perfringens (C. perfringens) were also related to the erosion factor. Analysis of the EMCs found that most parameters were significantly higher during periods of rainfall runoff. This study shows that the most dominant processes in an agricultural catchment are surface runoff and erosion. It also shows that it is these processes which mobilise pathogenic indicators and which are therefore most likely to influence the transport of pathogens. Catchment management efforts need to focus on reducing the effect of these processes on water quality.
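For reference, an event mean concentration is simply the flow-weighted average concentration over a runoff event; a minimal Python sketch with hypothetical flow and concentration series:

    import numpy as np

    flow = np.array([0.2, 0.8, 1.6, 1.1, 0.5, 0.3])        # m^3/s during the event
    conc = np.array([12.0, 45.0, 80.0, 60.0, 30.0, 18.0])  # mg/L, hypothetical

    emc = np.sum(conc * flow) / np.sum(flow)  # equal time steps, so dt cancels
    baseline = 15.0                           # hypothetical dry-weather value
    print(f"EMC = {emc:.1f} mg/L vs baseline {baseline} mg/L")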
Ghosh, Sujoy; Vivar, Juan; Nelson, Christopher P; Willenborg, Christina; Segrè, Ayellet V; Mäkinen, Ville-Petteri; Nikpay, Majid; Erdmann, Jeannette; Blankenberg, Stefan; O'Donnell, Christopher; März, Winfried; Laaksonen, Reijo; Stewart, Alexandre F R; Epstein, Stephen E; Shah, Svati H; Granger, Christopher B; Hazen, Stanley L; Kathiresan, Sekar; Reilly, Muredach P; Yang, Xia; Quertermous, Thomas; Samani, Nilesh J; Schunkert, Heribert; Assimes, Themistocles L; McPherson, Ruth
2015-07-01
Genome-wide association studies have identified multiple genetic variants affecting the risk of coronary artery disease (CAD). However, individually these explain only a small fraction of the heritability of CAD and for most, the causal biological mechanisms remain unclear. We sought to obtain further insights into potential causal processes of CAD by integrating large-scale GWA data with expertly curated databases of core human pathways and functional networks. Using pathways (gene sets) from Reactome, we carried out a 2-stage gene set enrichment analysis strategy. From a meta-analyzed discovery cohort of 7 CAD genome-wide association study data sets (9889 cases/11 089 controls), nominally significant gene sets were tested for replication in a meta-analysis of 9 additional studies (15 502 cases/55 730 controls) from the Coronary ARtery DIsease Genome wide Replication and Meta-analysis (CARDIoGRAM) Consortium. A total of 32 of 639 Reactome pathways tested showed convincing association with CAD (replication P<0.05). These pathways resided in 9 of 21 core biological processes represented in Reactome, and included pathways relevant to extracellular matrix (ECM) integrity, innate immunity, axon guidance, and signaling by PDGF (platelet-derived growth factor), NOTCH, and the transforming growth factor-β/SMAD receptor complex. Many of these pathways had strengths of association comparable to those observed in lipid transport pathways. Network analysis of unique genes within the replicated pathways further revealed several interconnected functional and topologically interacting modules representing novel associations (eg, semaphorin-regulated axonal guidance pathway) besides confirming known processes (lipid metabolism). The connectivity in the observed networks was statistically significant compared with random networks (P<0.001). Network centrality analysis (degree and betweenness) further identified genes (eg, NCAM1, FYN, FURIN, etc) likely to play critical roles in the maintenance and functioning of several of the replicated pathways. These findings provide novel insights into how genetic variation, interpreted in the context of biological processes and functional interactions among genes, may help define the genetic architecture of CAD. © 2015 American Heart Association, Inc.
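A much-simplified stand-in for the pathway enrichment idea: an overrepresentation test asking whether GWAS-associated genes fall in a pathway more often than chance. The study itself used a two-stage gene set enrichment strategy; the hypergeometric test below is only a sketch, and all counts are hypothetical.

    from scipy.stats import hypergeom

    M = 20000   # genes in the background
    n = 120     # genes in the pathway
    N = 500     # disease-associated genes from GWAS
    k = 12      # overlap: associated genes that are in the pathway

    p_value = hypergeom.sf(k - 1, M, n, N)  # P(X >= k) under the null
    print(f"enrichment p = {p_value:.3g}")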
NASA Technical Reports Server (NTRS)
Head, J. W. (Editor)
1978-01-01
Developments reported at a meeting of principal investigators for NASA's planetary geology program are summarized. Topics covered include: constraints on solar system formation; asteroids, comets, and satellites; constraints on planetary interiors; volatiles and regoliths; instrument development techniques; planetary cartography; geological and geochemical constraints on planetary evolution; fluvial processes and channel formation; volcanic processes; eolian processes; radar studies of planetary surfaces; cratering as a process, landform, and dating method; and the Tharsis region of Mars. Activities at a planetary geology field conference on eolian processes are reported, and techniques recommended for the presentation and analysis of crater size-frequency data are included.
Real-time radar signal processing using GPGPU (general-purpose graphic processing unit)
NASA Astrophysics Data System (ADS)
Kong, Fanxing; Zhang, Yan Rockee; Cai, Jingxiao; Palmer, Robert D.
2016-05-01
This study introduces a practical approach to developing a real-time signal processing chain for general phased array radar on NVIDIA GPUs (Graphical Processing Units) using CUDA (Compute Unified Device Architecture) libraries such as cuBLAS and cuFFT, which are adopted from open source libraries and optimized for NVIDIA GPUs. The processed results are rigorously verified against those from the CPUs. Performance, benchmarked as computation time for various input data cube sizes, is compared across GPUs and CPUs. Through this analysis, it is demonstrated that GPGPU (General Purpose GPU) real-time processing of array radar data is possible with relatively low-cost commercial GPUs.
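One representative stage of such a processing chain is FFT-based pulse compression (matched filtering); on a GPU the identical structure maps onto cuFFT/cuBLAS primitives. The numpy sketch below is a CPU stand-in with hypothetical radar parameters, not code from the study.

    import numpy as np

    fs, tau, bw = 10e6, 20e-6, 2e6                     # sample rate, pulse width, bandwidth
    t = np.arange(0, tau, 1 / fs)
    chirp = np.exp(1j * np.pi * (bw / tau) * t ** 2)   # linear FM pulse

    echo = np.zeros(4096, dtype=complex)
    echo[1000:1000 + t.size] = 0.5 * chirp             # target echo at range bin 1000
    echo += 0.05 * (np.random.randn(4096) + 1j * np.random.randn(4096))

    nfft = 8192
    compressed = np.fft.ifft(np.fft.fft(echo, nfft) * np.conj(np.fft.fft(chirp, nfft)))
    print(np.argmax(np.abs(compressed[:4096])))        # ~1000: recovered range bin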
Deverka, Patricia A; Lavallee, Danielle C; Desai, Priyanka J; Armstrong, Joanne; Gorman, Mark; Hole-Curry, Leah; O’Leary, James; Ruffner, BW; Watkins, John; Veenstra, David L; Baker, Laurence H; Unger, Joseph M; Ramsey, Scott D
2013-01-01
Aims: The Center for Comparative Effectiveness Research in Cancer Genomics completed a 2-year stakeholder-guided process for the prioritization of genomic tests for comparative effectiveness research studies. We sought to evaluate the effectiveness of engagement procedures in achieving project goals and to identify opportunities for future improvements. Materials & methods: The evaluation included an online questionnaire, one-on-one telephone interviews and facilitated discussion. Responses to the online questionnaire were tabulated for descriptive purposes, while transcripts from key informant interviews were analyzed using a directed content analysis approach. Results: A total of 11 out of 13 stakeholders completed both the online questionnaire and interview process, while nine participated in the facilitated discussion. Eighty-nine percent of questionnaire items received overall ratings of agree or strongly agree; 11% of responses were rated as neutral, with the exception of a single rating of disagreement with an item regarding the clarity of how stakeholder input was incorporated into project decisions. Recommendations for future improvement included developing standard recruitment practices, role descriptions and processes for improved communication with clinical and comparative effectiveness research investigators. Conclusions: Evaluation of the stakeholder engagement process provided constructive feedback for future improvements and should be routinely conducted to ensure maximal effectiveness of stakeholder involvement. PMID:23459832
ERIC Educational Resources Information Center
Andic, Branko; Kadic, Srdan; Grujicic, Rade; Malidžan, Desanka
2018-01-01
This paper provides an overview of the attitudes of students and teachers toward the use of educational games in the teaching process. The study encompassed a didactic experiment, and adopted interviewing techniques and theoretical analysis. Likert distributions of attitudes to particular game types are presented in tables and the arithmetic means…
ERIC Educational Resources Information Center
Smyth, Emer; Gangl, Markus; Raffe, David; Hannan, Damian F.; McCoy, Selina
This project aimed to develop a more comprehensive conceptual framework of school-to-work transitions in different national contexts and apply this framework to the empirical analysis of transition processes across European countries. It drew on these two data sources: European Community Labor Force Survey and integrated databases on national…
ERIC Educational Resources Information Center
Wang, Hung-Yuan; Duh, Henry Been-Lirn; Li, Nai; Lin, Tzung-Jin; Tsai, Chin-Chung
2014-01-01
The purpose of this study is to investigate and compare students' collaborative inquiry learning behaviors and their behavior patterns in an augmented reality (AR) simulation system and a traditional 2D simulation system. Their inquiry and discussion processes were analyzed by content analysis and lag sequential analysis (LSA). Forty…
ERIC Educational Resources Information Center
Warren, Steven F.; Gilkerson, Jill; Richards, Jeffrey A.; Oller, D. Kimbrough; Xu, Dongxin; Yapanel, Umit; Gray, Sharmistha
2010-01-01
The study compared the vocal production and language learning environments of 26 young children with autism spectrum disorder (ASD) to 78 typically developing children using measures derived from automated vocal analysis. A digital language processor and audio-processing algorithms measured the amount of adult words to children and the amount of…
Geoffrey H. Donovan; Peter. Noordijk
2005-01-01
To determine the optimal suppression strategy for escaped wildfires, federal land managers are required to conduct a wildland fire situation analysis (WFSA). As part of the WFSA process, fire managers estimate final fire size and suppression costs. Estimates from 58 WFSAs conducted during the 2002 fire season are compared to actual outcomes. Results indicate that...
Analysis and quality control of carbohydrates in therapeutic proteins with fluorescence HPLC
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Kun; Huang, Jian; Center for Informational Biology, University of Electronic Science and Technology of China, Chengdu 610054
Conbercept is an Fc fusion protein with a very complicated carbohydrate profile, which must be carefully monitored throughout the manufacturing process. Here, we introduce an optimized fluorescence derivatization high-performance liquid chromatographic method for glycan mapping in conbercept. Compared with the conventional glycan analysis method, this method has much better resolution and higher reproducibility, making it well suited for product quality control.
ERIC Educational Resources Information Center
Ross, Sarah Gwen
2012-01-01
Response to intervention (RTI) is increasingly being used in educational settings to make high-stakes, special education decisions. Because of this, the accurate use and analysis of single-case designs to monitor intervention effectiveness has become important to the RTI process. Effect size methods for single-case designs provide a useful way to…
DNA Damage and Genetic Instability as Harbingers of Prostate Cancer
2013-01-01
incidence of prostate cancer as compared to placebo. Primary analysis of this trial indicated no statistically significant effect of selenium...Identification, isolation, staining, processing, and statistical analysis of slides for ERG and PTEN markers (aim 1) and interpretation of these results...participating in this study being conducted under Investigational New Drug #29829 from the Food and Drug Administration. STANDARD TREATMENT Patients
Task analysis exemplified: the process of resolving unfinished business.
Greenberg, L S; Foerster, F S
1996-06-01
The steps of a task-analytic research program designed to identify the in-session performances involved in resolving lingering bad feelings toward a significant other are described. A rational-empirical methodology of repeatedly cycling between rational conjecture and empirical observations is demonstrated as a method of developing an intervention manual and the components of client processes of resolution. A refined model of the change process developed by these procedures is validated by comparing 11 successful and 11 unsuccessful performances. Four performance components (intense expression of feeling, expression of need, shift in representation of the other, and self-validation or understanding of the other) were found to discriminate between resolution and nonresolution performances. These components were measured on 4 process measures: the Structural Analysis of Social Behavior, the Experiencing Scale, the Client's Emotional Arousal Scale, and a need scale.
Science with High Spatial Resolution Far-Infrared Data
NASA Technical Reports Server (NTRS)
Terebey, Susan (Editor); Mazzarella, Joseph M. (Editor)
1994-01-01
The goal of this workshop was to discuss new science and techniques relevant to high spatial resolution processing of far-infrared data, with particular focus on high resolution processing of IRAS data. Users of the maximum correlation method, maximum entropy, and other resolution enhancement algorithms applicable to far-infrared data gathered at the Infrared Processing and Analysis Center (IPAC) for two days in June 1993 to compare techniques and discuss new results. During a special session on the third day, interested astronomers were introduced to IRAS HIRES processing, which is IPAC's implementation of the maximum correlation method to the IRAS data. Topics discussed during the workshop included: (1) image reconstruction; (2) random noise; (3) imagery; (4) interacting galaxies; (5) spiral galaxies; (6) galactic dust and elliptical galaxies; (7) star formation in Seyfert galaxies; (8) wavelet analysis; and (9) supernova remnants.
Human factors process failure modes and effects analysis (HF PFMEA) software tool
NASA Technical Reports Server (NTRS)
Chandler, Faith T. (Inventor); Relvini, Kristine M. (Inventor); Shedd, Nathaneal P. (Inventor); Valentino, William D. (Inventor); Philippart, Monica F. (Inventor); Bessette, Colette I. (Inventor)
2011-01-01
Methods, computer-readable media, and systems for automatically performing Human Factors Process Failure Modes and Effects Analysis for a process are provided. At least one task involved in a process is identified, where the task includes at least one human activity. The human activity is described using at least one verb. A human error potentially resulting from the human activity is automatically identified; the human error is related to the verb used in describing the task. The likelihood of occurrence, detection, and correction of the human error is identified, as is the severity of the effect of the human error. From the likelihood of occurrence and the severity, the risk of potential harm is identified and compared with a risk threshold to determine the appropriateness of corrective measures.
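A minimal sketch of the risk-scoring step the abstract describes, with hypothetical scales and threshold (the patent does not publish specific values here):

    def risk_of_harm(p_occurrence, p_not_detected, p_not_corrected, severity):
        """Chance the error occurs and slips through, weighted by severity."""
        return p_occurrence * p_not_detected * p_not_corrected * severity

    RISK_THRESHOLD = 0.5  # hypothetical: above this, corrective measures are flagged

    # Task: "operator CONNECTS cable" -> potential error "connects wrong port"
    score = risk_of_harm(p_occurrence=0.1, p_not_detected=0.6,
                         p_not_corrected=0.8, severity=9)  # severity on a 1-10 scale
    print(score, "-> corrective measures" if score > RISK_THRESHOLD else "-> acceptable")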
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olimov, K., E-mail: olimov@uzsci.net; Glagolev, V. V.; Gulamov, K. G.
2014-12-15
The results of a comparative analysis of channels involving the inclusive production of deuterons and tritons in ¹⁶Op collisions at a projectile momentum of 3.25 GeV/c per nucleon are presented. The mechanisms governing proton, deuteron, and triton production in the fragmentation of oxygen nuclei are found to be independent. It is shown that the observed proton-multiplicity correlations are associated predominantly with the character of the primary event of a proton-nucleon collision in ¹⁶Op interactions. It is found that, in reactions involving triton production, the contributions of processes leading to an increase in the mean proton multiplicity (n → p + π⁻ and np → pn) and processes leading to its decrease (p → n + π⁺) compensate each other.
NASA Technical Reports Server (NTRS)
Poeschel, R. L.; Hawthorne, E. I.; Weisman, Y. C.; Frisman, M.; Benson, G. C.; Mcgrath, R. J.; Martinelli, R. M.; Linsenbardt, T. L.; Beattie, J. R.
1977-01-01
Several thrust system design concepts were evaluated and compared using the specifications of the most advanced 30 cm engineering model thruster as the technology base. Emphasis was placed on relatively high power missions (60 to 100 kW) such as a Halley's comet rendezvous. The extensions in thruster performance required for the Halley's comet mission were defined and alternative thrust system concepts were designed in sufficient detail for comparing mass, efficiency, reliability, structure, and thermal characteristics. Confirmation testing and analysis of thruster and power processing components were performed, and the feasibility of satisfying extended performance requirements was verified. A baseline design was selected from the alternatives considered, and the design analysis and documentation were refined. The baseline thrust system design features modular construction, conventional power processing, and a concentrator solar array concept and is designed to interface with the Space Shuttle.
Precise and fast spatial-frequency analysis using the iterative local Fourier transform.
Lee, Sukmock; Choi, Heejoo; Kim, Dae Wook
2016-09-19
The use of the discrete Fourier transform has decreased since the introduction of the fast Fourier transform (fFT), which is a numerically efficient computing process. This paper presents the iterative local Fourier transform (ilFT), a set of new processing algorithms that iteratively apply the discrete Fourier transform within a local and optimal frequency domain. The new technique achieves 2^10 times higher frequency resolution than the fFT within a comparable computation time. The method's superb computing efficiency, high resolution, spectrum zoom-in capability, and overall performance are evaluated and compared to other advanced high-resolution Fourier transform techniques, such as the fFT combined with several fitting methods. The effectiveness of the ilFT is demonstrated through the data analysis of a set of Talbot self-images (1280 × 1024 pixels) obtained with an experimental setup using a grating in a diverging beam produced by a coherent point source.
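The core idea, evaluating the DFT on a local frequency grid and iteratively shrinking the grid around the strongest peak, can be sketched in a few lines of Python. This illustrates the zoom-in principle only; the published ilFT's window and grid choices are not reproduced here.

    import numpy as np

    def local_dft_peak(x, fs, f_lo, f_hi, n_grid=1024, iterations=4):
        t = np.arange(x.size) / fs
        f_peak = f_lo
        for _ in range(iterations):
            freqs = np.linspace(f_lo, f_hi, n_grid)
            spectrum = np.abs(np.exp(-2j * np.pi * freqs[:, None] * t) @ x)
            f_peak = freqs[np.argmax(spectrum)]
            span = (f_hi - f_lo) / n_grid        # zoom in around the current peak
            f_lo, f_hi = f_peak - span, f_peak + span
        return f_peak

    fs = 1000.0
    x = np.sin(2 * np.pi * 123.4567 * np.arange(1024) / fs)
    print(local_dft_peak(x, fs, f_lo=0.0, f_hi=fs / 2))  # ~123.4567 Hz, far below the FFT bin width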
Kostic, Z G; Stefanovic, P L; Pavlović, P B
2000-07-10
Thermal plasmas may solve one of the biggest toxic waste disposal problems. The disposal of polychlorinated biphenyls (PCBs) is a long-standing problem which will get worse in the coming years, when 180,000 tons of PCB-containing wastes are expected to accumulate in Europe (Hot ions break down toxic chemicals, New Scientist, 16 April 1987, p. 24). The combustion of PCBs in ordinary incinerators (at temperature T approximately 1100 K, as measured near the inner wall of the combustion chamber (European Parliament and Council Directive on Incineration of Waste (COM/99/330), Europe Energy, 543, Sept. 17, 1999, 1-23)) can cause more problems than it solves, because highly toxic dioxins and dibenzofurans are formed if the combustion temperature is too low (T < 1400 K). The paper presents a thermodynamic consideration and comparative analysis of PCB decomposition processes in air or argon (+oxygen) thermal plasmas.
Diffusion Tensor Tractography Reveals Disrupted Structural Connectivity during Brain Aging
NASA Astrophysics Data System (ADS)
Lin, Lan; Tian, Miao; Wang, Qi; Wu, Shuicai
2017-10-01
Brain aging is one of the most crucial biological processes; it entails many physical, biological, chemical, and psychological changes and is also a major risk factor for most common neurodegenerative diseases. To improve the quality of life for the elderly, it is important to understand how the brain changes during the normal aging process. We compared diffusion tensor imaging (DTI)-based brain networks in a cohort of 75 healthy old subjects by using graph theory metrics to describe the anatomical networks and connectivity patterns, and network-based statistic (NBS) analysis was used to identify pairs of regions with altered structural connectivity. The NBS analysis revealed a significant network comprising nine distinct fiber bundles linking 10 different brain regions with altered white matter structure in the young-old group compared with the middle-aged group (p < .05, family-wise error-corrected). Our results might guide future studies and help to gain a better understanding of brain aging.
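A minimal Python sketch of the graph-theory step: brain regions become nodes, DTI fiber bundles become weighted edges, and standard connectivity metrics are computed. The regions and weights below are hypothetical placeholders, not the study's data.

    import networkx as nx

    G = nx.Graph()
    G.add_weighted_edges_from([
        ("precuneus", "posterior_cingulate", 0.9),
        ("precuneus", "superior_parietal", 0.7),
        ("posterior_cingulate", "hippocampus", 0.5),
        ("hippocampus", "thalamus", 0.6),
        ("thalamus", "superior_parietal", 0.4),
    ])

    print(dict(G.degree()))              # node degree
    print(nx.betweenness_centrality(G))  # hub-ness of each region
    print(nx.average_clustering(G))      # a network segregation measure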
Extended performance solar electric propulsion thrust system study. Volume 2: Baseline thrust system
NASA Technical Reports Server (NTRS)
Poeschel, R. L.; Hawthorne, E. I.
1977-01-01
Several thrust system design concepts were evaluated and compared using the specifications of the most advanced 30-cm engineering model thruster as the technology base. Emphasis was placed on relatively high-power missions (60 to 100 kW) such as a Halley's comet rendezvous. The extensions in thruster performance required for the Halley's comet mission were defined and alternative thrust system concepts were designed in sufficient detail for comparing mass, efficiency, reliability, structure, and thermal characteristics. Confirmation testing and analysis of thruster and power-processing components were performed, and the feasibility of satisfying extended performance requirements was verified. A baseline design was selected from the alternatives considered, and the design analysis and documentation were refined. The baseline thrust system design features modular construction, conventional power processing, and a concentrator solar array concept and is designed to interface with the Space Shuttle.
Process characteristics for microwave assisted hydrothermal carbonization of cellulose.
Zhang, Junting; An, Ying; Borrion, Aiduan; He, Wenzhi; Wang, Nan; Chen, Yirong; Li, Guangming
2018-07-01
The process characteristics of microwave assisted hydrothermal carbonization of cellulose were investigated, and a first-order kinetics model based on carbon concentration was developed. Chemical properties analysis showed that, compared to conventional hydrothermal carbonization, hydrochar with comparable energy properties can be obtained with a 5-10 times decrease in reaction time with the assistance of microwave heating. Results from the kinetics study were in good agreement with the experimental analysis; both illustrated that the predominant mechanism of the reaction depends on variations in the reaction rates of two co-existing pathways. In particular, the pyrolysis-like intramolecular dehydration reaction was shown to be the predominant mechanism for hydrochar generation at high temperatures. Finally, the enhancement effects of microwave heating were observed for both the soluble and solid pathways, suggesting microwave-assisted hydrothermal carbonization as a more attractive method for carbon-enriched hydrochar recovery. Copyright © 2018 Elsevier Ltd. All rights reserved.
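A first-order kinetics model based on carbon concentration has the form dC/dt = -kC, so ln(C0/C) = kt and k follows from a linear fit; a minimal Python sketch with hypothetical data:

    import numpy as np

    t = np.array([0.0, 10.0, 20.0, 40.0, 60.0])    # reaction time (min)
    C = np.array([50.0, 38.0, 29.5, 17.8, 10.9])   # carbon concentration (g/L), hypothetical

    k = np.polyfit(t, np.log(C[0] / C), 1)[0]      # slope of ln(C0/C) vs t
    print(f"k = {k:.4f} 1/min, half-life = {np.log(2) / k:.1f} min")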
Data of ERPs and spectral alpha power when attention is engaged on visual or verbal/auditory imagery
Villena-González, Mario; López, Vladimir; Rodríguez, Eugenio
2016-01-01
This article provides data from the statistical analysis of event-related brain potentials (ERPs) and spectral power from 20 participants during three attentional conditions. Specifically, the P1, N1 and P300 amplitudes of the ERP were compared when the participant's attention was oriented to an external task, to visual imagery, and to inner speech. The spectral power of the alpha band was also compared across these three attentional conditions. These data are related to the research article in which sensory processing of external information was compared during these three conditions, entitled "Orienting attention to visual or verbal/auditory imagery differentially impairs the processing of visual stimuli" (Villena-Gonzalez et al., 2016) [1]. PMID:27077090
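A minimal Python sketch of a spectral alpha-power estimate of the kind compared across the three attentional conditions; the EEG trace is synthetic (a noisy 10 Hz oscillation), not data from the study.

    import numpy as np
    from scipy.signal import welch

    fs = 250.0                                    # sampling rate (Hz)
    t = np.arange(0, 10, 1 / fs)
    eeg = 4e-6 * np.sin(2 * np.pi * 10 * t) + 2e-6 * np.random.randn(t.size)

    f, psd = welch(eeg, fs=fs, nperseg=512)
    alpha = (f >= 8) & (f <= 12)
    alpha_power = np.sum(psd[alpha]) * (f[1] - f[0])  # integrate PSD over 8-12 Hz
    print(f"alpha power = {alpha_power:.3e} V^2")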
Biointervention makes leather processing greener: an integrated cleansing and tanning system.
Thanikaivelan, Palanisamy; Rao, Jonnalagadda Raghava; Nair, Balachandran Unni; Ramasami, Thirumalachari
2003-06-01
The do-undo methods adopted in conventional leather processing generate huge amounts of pollutants. In other words, conventional methods employed in leather processing subject the skin/hide to wide variations in pH. Pretanning and tanning processes alone contribute more than 90% of the total pollution from leather processing, including a great deal of solid waste such as lime and chrome sludge. In the approach described here, hair and flesh removal as well as fiber opening have been achieved using biocatalysts at pH 8.0 for cow hides. This was followed by a pickle-free chrome tanning, which does not require a basification step. Hence, this tanning technique involves primarily three steps, namely dehairing, fiber opening, and tanning. It has been found that the extent of hair removal, the opening up of fiber bundles, and the penetration and distribution of chromium are comparable to those produced by traditional methods. This has been substantiated through scanning electron microscopy, stratigraphic chrome distribution analysis, and softness measurements. Performance of the leathers is shown to be on par with conventionally processed leathers through physical and hand evaluation; importantly, the softness of the leathers is numerically comparable to that of the control. The process also demonstrates a reduction in chemical oxygen demand load by 80%, total solids load by 85%, and chromium load by 80% compared to the conventional process, thereby leading toward zero discharge. The input-output audit shows that the biocatalytic three-step tanning process employs a very low amount of chemicals, thereby reducing the discharge by 90% compared to the conventional multistep processing. Furthermore, it is also demonstrated that the process is technoeconomically viable.
Event-Related Potentials of Bottom-Up and Top-Down Processing of Emotional Faces
Moradi, Afsane; Mehrinejad, Seyed Abolghasem; Ghadiri, Mohammad; Rezaei, Farzin
2017-01-01
Introduction: Emotional stimulus is processed automatically in a bottom-up way or can be processed voluntarily in a top-down way. Imaging studies have indicated that bottom-up and top-down processing are mediated through different neural systems. However, temporal differentiation of top-down versus bottom-up processing of facial emotional expressions has remained to be clarified. The present study aimed to explore the time course of these processes as indexed by the emotion-specific P100 and late positive potential (LPP) event-related potential (ERP) components in a group of healthy women. Methods: Fourteen female students of Alzahra University, Tehran, Iran aged 18–30 years, voluntarily participated in the study. The subjects completed 2 overt and covert emotional tasks during ERP acquisition. Results: The results indicated that fearful expressions significantly produced greater P100 amplitude compared to other expressions. Moreover, the P100 findings showed an interaction between emotion and processing conditions. Further analysis indicated that within the overt condition, fearful expressions elicited more P100 amplitude compared to other emotional expressions. Also, overt conditions created significantly more LPP latencies and amplitudes compared to covert conditions. Conclusion: Based on the results, early perceptual processing of fearful face expressions is enhanced in top-down way compared to bottom-up way. It also suggests that P100 may reflect an attentional bias toward fearful emotions. However, no such differentiation was observed within later processing stages of face expressions, as indexed by the ERP LPP component, in a top-down versus bottom-up way. Overall, this study provides a basis for further exploring of bottom-up and top-down processes underlying emotion and may be typically helpful for investigating the temporal characteristics associated with impaired emotional processing in psychiatric disorders. PMID:28446947
NASA Astrophysics Data System (ADS)
Ferreira, G. G.; Borges, E.; Braga, J. P.; Belchior, J. C.
Cluster structures are discussed in a nonrigid analysis, using a modified minima search method based on stochastic processes and classical dynamics simulations. The relaxation process is taken into account by considering the internal motion of the Cl2 molecule. Cluster structures are compared with previous works in which the Cl2 molecule is assumed to be rigid. The interactions are modeled using pair potentials: the Aziz and Lennard-Jones potentials for the Ar-Ar interaction, a Morse potential for the Cl-Cl interaction, and a fully spherical/anisotropic Morse-Spline-van der Waals (MSV) potential for the Ar-Cl interaction. As expected, all calculated energies are lower than those obtained in a rigid approximation; one reason may be attributed to the nonrigid contributions of the internal motion of the Cl2 molecule. Finally, the growing processes in molecular clusters are discussed, and it is pointed out that the growing mechanism can be affected by the nonrigid initial conditions of smaller clusters such as ArnCl2 (n ≤ 4 or 5), which are seeds for higher-order clusters.
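A minimal Python sketch of the pair-potential energy evaluation underlying such cluster-structure searches: Lennard-Jones and Morse terms summed over all atom pairs. The parameter values are generic placeholders, and Lennard-Jones is used as a stand-in for the paper's Aziz and MSV potentials.

    import numpy as np

    def lennard_jones(r, eps=0.0104, sigma=3.40):   # eV, angstrom (generic Ar values)
        return 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

    def morse(r, De=2.48, a=2.0, re=1.99):          # generic Cl2-like parameters
        return De * (1 - np.exp(-a * (r - re))) ** 2 - De

    def cluster_energy(positions, pair_potentials):
        """Sum pair energies; pair_potentials[(i, j)] selects the interaction."""
        E = 0.0
        for i in range(len(positions)):
            for j in range(i + 1, len(positions)):
                r = np.linalg.norm(positions[i] - positions[j])
                E += pair_potentials[(i, j)](r)
        return E

    # Toy Ar2Cl2 geometry: atoms 0-1 are Ar, atoms 2-3 are Cl
    pos = np.array([[0.0, 0.0, 0.0], [3.8, 0.0, 0.0],
                    [1.9, 3.0, 0.0], [1.9, 3.0, 1.99]])
    pots = {(0, 1): lennard_jones, (2, 3): morse}
    pots.update({p: lennard_jones for p in [(0, 2), (0, 3), (1, 2), (1, 3)]})  # stand-in for Ar-Cl MSV
    print(cluster_energy(pos, pots), "eV")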
Wall, Christopher E; Cozza, Steven; Riquelme, Cecilia A; McCombie, W Richard; Heimiller, Joseph K; Marr, Thomas G; Leinwand, Leslie A
2011-01-01
The infrequently feeding Burmese python (Python molurus) experiences significant and rapid postprandial cardiac hypertrophy followed by regression as digestion is completed. To begin to explore the molecular mechanisms of this response, we have sequenced and assembled the fasted and postfed Burmese python heart transcriptomes with Illumina technology using the chicken (Gallus gallus) genome as a reference. In addition, we have used RNA-seq analysis to identify differences in the expression of biological processes and signaling pathways between fasted, 1 day postfed (DPF), and 3 DPF hearts. Out of a combined transcriptome of ∼2,800 mRNAs, 464 genes were differentially expressed. Genes showing differential expression at 1 DPF compared with fasted were enriched for biological processes involved in metabolism and energetics, while genes showing differential expression at 3 DPF compared with fasted were enriched for processes involved in biogenesis, structural remodeling, and organization. Moreover, we present evidence for the activation of physiological and not pathological signaling pathways in this rapid, novel model of cardiac growth in pythons. Together, our data provide the first comprehensive gene expression profile for a reptile heart.
Jiménez, Juan E; Marco, Isaac; Suárez, Natalia; González, Desirée
This study had two purposes: examining the internal structure of the Test Estandarizado para la Evaluación Inicial de la Escritura con Teclado (TEVET; Spanish Keyboarding Writing Test), and analyzing the development of keyboarding skills in Spanish elementary school children with and without learning disabilities (LD) in writing. A group of 1,168 elementary school children carried out the following writing tasks: writing the alphabet in order from memory, allograph selection, word copying, writing dictated words with inconsistent spelling, writing pseudowords from dictation, and independent composition of a sentence. For this purpose, exploratory factor analysis of the TEVET was conducted; principal component analysis with a varimax rotation identified three factors with eigenvalues greater than 1.0. Based on the factorial analysis, we analyzed keyboarding skills across grades in Spanish elementary school children with and without LD (i.e., poor handwriters, poor spellers, and mixed writers, each compared with typically achieving writers). The results indicated that poor handwriters did not differ from typically achieving writers in the phonological processing, visual-orthographic processing, and sentence production components of keyboarding. The educational implications of the findings are discussed with regard to the acquisition of keyboarding skills in children with and without LD in transcription.
Espinoza, Manuel Antonio; Manca, Andrea; Claxton, Karl; Sculpher, Mark
2018-02-01
Evidence about cost-effectiveness is increasingly being used to inform decisions about the funding of new technologies that are usually implemented as guidelines from centralized decision-making bodies. However, there is also an increasing recognition for the role of patients in determining their preferred treatment option. This paper presents a method to estimate the value of implementing a choice-based decision process using the cost-effectiveness analysis toolbox. This value is estimated for 3 alternative scenarios. First, it compares centralized decisions, based on population average cost-effectiveness, against a decision process based on patient choice. Second, it compares centralized decision based on patients' subgroups versus an individual choice-based decision process. Third, it compares a centralized process based on average cost-effectiveness against a choice-based process where patients choose according to a different measure of outcome to that used by the centralized decision maker. The methods are applied to a case study for the management of acute coronary syndrome. It is concluded that implementing a choice-based process of treatment allocation may be an option in collectively funded health systems. However, its value will depend on the specific health problem and the social values considered relevant to the health system. Copyright © 2017 John Wiley & Sons, Ltd.
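The comparison the paper formalizes can be sketched numerically: a centralized decision funds the single option with the highest average net benefit, while a choice-based process lets each patient take the option that is best for them. All distributions and values below are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10000
    lam = 20000.0  # willingness to pay per QALY, hypothetical

    qaly = {"A": rng.normal(1.00, 0.30, n), "B": rng.normal(0.95, 0.10, n)}
    cost = {"A": 8000.0, "B": 5000.0}
    nb = {k: lam * qaly[k] - cost[k] for k in qaly}   # patient-level net benefit

    centralized = max(nb, key=lambda k: nb[k].mean()) # one option for everyone
    v_central = nb[centralized].mean()
    v_choice = np.maximum(nb["A"], nb["B"]).mean()    # each patient picks their best

    print(f"centralized funds {centralized}: {v_central:.0f} per patient")
    print(f"value of individual choice: {v_choice - v_central:.0f} per patient")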
Shin, Sun Kyoung; Kim, Woo-Il; Jeon, Tae-Wan; Kang, Young-Yeul; Jeong, Seong-Kyeong; Yeon, Jin-Mo; Somasundaram, Swarnalatha
2013-09-15
The Ministry of Environment, Republic of Korea (South Korea) is in the process of converting its current hazardous waste classification system to harmonize it with the international standard and to set up regulatory standards for toxic substances present in hazardous waste. In the present work, the concentrations and trends of 13 heavy metals, F(-), CN(-) and 19 PAHs present in the hazardous waste generated by various thermal processes (11 processes) in South Korea were analyzed, along with their leaching characteristics. In all thermal processes, the median concentrations of Cu (3.58-209,000 mg/kg), Ni (BDL-1560 mg/kg), Pb (7.22-5132.25 mg/kg) and Zn (83.02-31419 mg/kg) were comparatively higher than those of the other heavy metals. The iron and steel thermal process showed the highest median values of the heavy metals Cd (14.76 mg/kg), Cr (166.15 mg/kg) and Hg (2.38 mg/kg). Low molecular weight PAHs (BDL-37.59 mg/kg) were predominant in sludge and filter cake samples from most of the thermal processes. Comparatively, flue gas dust from most of the thermal processing units showed higher leaching of the heavy metals. Copyright © 2013 Elsevier B.V. All rights reserved.
Wang, Lin; Mao, Jiugeng; Zhao, Hejuan; Li, Min; Wei, Qishun; Zhou, Ying; Shao, Heping
2016-09-01
Rice straw (RS) is an important raw material for the preparation of Agaricus bisporus compost in China. In this study, the characteristics of the composting process were compared between RS and wheat straw (WS) for mushroom production. The results showed that the temperature in RS compost increased rapidly compared with WS compost, and the carbon (C)/nitrogen (N) ratio decreased quickly. The microbial changes during the Phase I and Phase II composting processes were monitored using denaturing gradient gel electrophoresis (DGGE) and phospholipid fatty acid (PLFA) analysis. Bacteria were the dominant species during the process of composting, and the bacterial community structure changed dramatically during heap composting according to the DGGE results. The bacterial community diversity of RS compost was abundant compared with WS compost at stages 4-5, but no distinct difference was observed after the controlled tunnel Phase II process. The total amount of PLFAs in RS compost, an indicator of microbial biomass, was higher than that in WS compost. Clustering by DGGE and principal component analysis of the PLFA compositions revealed differences in both the microbial population and the community structure between RS- and WS-based composts. Our data indicated that composting of RS resulted in improved degradation and assimilation of breakdown products by A. bisporus, and suggested that RS compost is as effective for sustaining A. bisporus mushroom growth as conventional WS compost.
Tariq, Saadia R; Shah, Munir H; Shaheen, Nazia
2009-09-30
Two tanning centres of Pakistan, namely Kasur and Mian Channun, were investigated with respect to their tanning processes (chrome and vegetable, respectively), and the effects of the tanning agents on the quality of soil in the vicinity of the tanneries were evaluated. Effluent and soil samples from 16 tanneries each in Kasur and Mian Channun were collected. The levels of selected metals (Na, K, Ca, Mg, Fe, Cr, Mn, Co, Cd, Ni, Pb and Zn) were determined by flame atomic absorption spectrophotometry under optimum analytical conditions. The data thus obtained were subjected to univariate and multivariate statistical analyses. Most of the metals exhibited considerably higher concentrations in the effluents and soils of Kasur compared with those of Mian Channun. It was observed that the soil of Kasur was highly contaminated by Na, K, Ca and Mg emanating from various processes of leather manufacture. Furthermore, Cr was present at much higher levels than its background concentration due to the adoption of chrome tanning. The levels of Cr determined in soil samples collected from the vicinity of Mian Channun tanneries were almost comparable to background levels. The soil of this city was found to be contaminated only by the metals originating from pre-tanning processes. The apportionment of the selected metals in the effluent and soil samples was determined by multivariate cluster analysis, which revealed significant differences between the chrome and vegetable tanning processes.
Schaefer, Alexander; Brach, Jennifer S.; Perera, Subashan; Sejdić, Ervin
2013-01-01
Background: The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self similarity over long time scales by a power law in the frequency spectrum S(f) = 1/f^β. The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. New Method: This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis is to complement previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. Results: The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Comparison with Existing Methods: Class dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis, as the most prevailing method in the literature, exhibited large estimation variances. Conclusions: The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. PMID:24200509
Titaley, Ivan A; Ogba, O Maduka; Chibwe, Leah; Hoh, Eunha; Cheong, Paul H-Y; Simonich, Staci L Massey
2018-03-16
Non-targeted analysis of environmental samples, using comprehensive two-dimensional gas chromatography coupled with time-of-flight mass spectrometry (GC × GC/ToF-MS), poses significant data analysis challenges due to the large number of possible analytes. Non-targeted data analysis of complex mixtures is prone to human bias and is laborious, particularly for comparative environmental samples such as contaminated soil pre- and post-bioremediation. To address this research bottleneck, we developed OCTpy, a Python™ script that acts as a data reduction filter to automate GC × GC/ToF-MS data analysis from LECO® ChromaTOF® software and facilitates selection of analytes of interest based on peak area comparison between comparative samples. We used data from polycyclic aromatic hydrocarbon (PAH) contaminated soil, pre- and post-bioremediation, to assess the effectiveness of OCTpy in facilitating the selection of analytes that have formed or degraded following treatment. Using datasets from the soil extracts pre- and post-bioremediation, OCTpy selected, on average, 18% of the initial suggested analytes generated by the LECO® ChromaTOF® software Statistical Compare feature. Based on this list, 63-100% of the candidate analytes identified by a highly trained individual were also selected by OCTpy. This process was accomplished in several minutes per sample, whereas manual data analysis took several hours per sample. OCTpy automates the analysis of complex mixtures of comparative samples, reduces the potential for human error during heavy data handling and decreases data analysis time by at least tenfold. Copyright © 2018 Elsevier B.V. All rights reserved.
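A minimal pandas sketch of the kind of data-reduction filter described: align peak tables from comparative samples and keep analytes whose areas changed substantially between pre- and post-bioremediation. Column names and fold-change thresholds are hypothetical; OCTpy itself operates on LECO ChromaTOF Statistical Compare exports.

    import pandas as pd

    pre = pd.DataFrame({"analyte": ["PAH-1", "PAH-2", "PAH-3", "PAH-4"],
                        "area": [1.2e6, 8.0e5, 3.0e5, 9.0e4]})
    post = pd.DataFrame({"analyte": ["PAH-1", "PAH-2", "PAH-3", "PAH-5"],
                         "area": [1.1e6, 9.0e4, 1.5e6, 7.0e5]})

    merged = pre.merge(post, on="analyte", how="outer",
                       suffixes=("_pre", "_post")).fillna(0.0)
    eps = 1.0  # avoids division by zero for analytes absent in one sample
    merged["fold_change"] = (merged["area_post"] + eps) / (merged["area_pre"] + eps)

    # Keep analytes that formed (>5x) or degraded (<0.2x) during treatment
    candidates = merged[(merged["fold_change"] > 5) | (merged["fold_change"] < 0.2)]
    print(candidates[["analyte", "fold_change"]])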
Schaefer, Alexander; Brach, Jennifer S; Perera, Subashan; Sejdić, Ervin
2014-01-30
The time evolution and complex interactions of many nonlinear systems, such as in the human body, result in fractal types of parameter outcomes that exhibit self similarity over long time scales by a power law in the frequency spectrum S(f)=1/f(β). The scaling exponent β is thus often interpreted as a "biomarker" of relative health and decline. This paper presents a thorough comparative numerical analysis of fractal characterization techniques with specific consideration given to experimentally measured gait stride interval time series. The ideal fractal signals generated in the numerical analysis are constrained under varying lengths and biases indicative of a range of physiologically conceivable fractal signals. This analysis is to complement previous investigations of fractal characteristics in healthy and pathological gait stride interval time series, with which this study is compared. The results of our analysis showed that the averaged wavelet coefficient method consistently yielded the most accurate results. Class dependent methods proved to be unsuitable for physiological time series. Detrended fluctuation analysis as most prevailing method in the literature exhibited large estimation variances. The comparative numerical analysis and experimental applications provide a thorough basis for determining an appropriate and robust method for measuring and comparing a physiologically meaningful biomarker, the spectral index β. In consideration of the constraints of application, we note the significant drawbacks of detrended fluctuation analysis and conclude that the averaged wavelet coefficient method can provide reasonable consistency and accuracy for characterizing these fractal time series. Copyright © 2013 Elsevier B.V. All rights reserved.
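The averaged wavelet coefficient idea favored by these two records can be sketched briefly: for a 1/f^β process, the mean absolute wavelet coefficient grows with scale as a power law whose log-log slope gives β. The synthetic signal below stands in for a stride-interval series; this is an illustrative reduction, not the papers' full benchmark.

    import numpy as np
    import pywt

    rng = np.random.default_rng(1)
    n, beta_true = 2 ** 14, 1.8
    freqs = np.fft.rfftfreq(n, 1.0)
    amp = np.where(freqs > 0, freqs, np.inf) ** (-beta_true / 2)
    phase = np.exp(2j * np.pi * rng.random(freqs.size))
    x = np.fft.irfft(amp * phase, n)                 # synthetic 1/f^beta noise

    coeffs = pywt.wavedec(x, "db3")                  # [cA_J, cD_J, ..., cD_1]
    details = coeffs[1:][::-1]                       # reorder to cD_1 ... cD_J
    awc = np.array([np.mean(np.abs(d)) for d in details])
    levels = np.arange(1, len(details) + 1)

    slope = np.polyfit(levels[2:-2], np.log2(awc[2:-2]), 1)[0]  # drop extreme scales
    print(f"estimated beta = {2 * slope:.2f} (true {beta_true})")  # mean|d_j| ~ 2^(j*beta/2)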
Bencala, Kenneth E.
1984-01-01
Solute transport in streams is determined by the interaction of physical and chemical processes. Data from an injection experiment for chloride and several cations indicate significant influence of solute-streambed processes on transport in a mountain stream. These data are interpreted in terms of transient storage processes for all tracers and sorption processes for the cations. Process parameter values are estimated with simulations based on coupled quasi-two-dimensional transport and first-order mass transfer sorption. Comparative simulations demonstrate the relative roles of the physical and chemical processes in determining solute transport. During the first 24 hours of the experiment, chloride concentrations were attenuated relative to expected plateau levels. Additional attenuation occurred for the sorbing cation strontium. The simulations account for these storage processes. Parameter values determined by calibration compare favorably with estimates from other studies in mountain streams. Without further calibration, the transport of potassium and lithium is adequately simulated using parameters determined in the chloride-strontium simulation and with measured cation distribution coefficients.
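For reference, the transient storage formulation referred to here is commonly written as a coupled stream/storage-zone pair; the following is a sketch of the standard form (notation assumed, not copied from the paper):

    \frac{\partial C}{\partial t} = -\frac{Q}{A}\frac{\partial C}{\partial x}
      + \frac{1}{A}\frac{\partial}{\partial x}\!\left(A D \frac{\partial C}{\partial x}\right)
      + \alpha\,(C_s - C),
    \qquad
    \frac{dC_s}{dt} = \alpha\,\frac{A}{A_s}\,(C - C_s)

where C and C_s are the in-stream and storage-zone concentrations, Q is discharge, A and A_s are the stream and storage cross-sectional areas, D is dispersion, and alpha is the exchange coefficient; for sorbing cations, an analogous first-order mass-transfer term toward an equilibrium set by the distribution coefficient is added.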
[Demographic processes in the countries of Eastern Europe 1945-1990].
Shchepin, O P; Vladimirova, L I
1990-01-01
An analysis is made of changes in the demographic processes of the countries of Eastern Europe over the period from 1945 to 1990, considering both general regularities and national peculiarities in the statics and dynamics of population movement. Positive tendencies in the demographic processes are pointed out, above all in infant mortality rates and mean life expectancy at birth, traced by decade for the Eastern European countries and reflecting the peculiarities of these changes as compared with developed countries.
Compiled visualization with the IPI method for analysing the liquid-liquid mixing process
NASA Astrophysics Data System (ADS)
Jasikova, Darina; Kotek, Michal; Kysela, Bohus; Sulc, Radek; Kopecky, Vaclav
2018-06-01
The article deals with research on the mixing process using visualization techniques and the IPI method. The size distribution and the evolution of the disintegration of the two liquid phases were studied. A methodology was proposed for visualization and image analysis of data acquired during the initial phase of the mixing process. The IPI method was used for a subsequent detailed study of the disintegrated droplets. The article describes the advantages of each method, presents their limits, and compares them.
On time-dependent diffusion coefficients arising from stochastic processes with memory
NASA Astrophysics Data System (ADS)
Carpio-Bernido, M. Victoria; Barredo, Wilson I.; Bernido, Christopher C.
2017-08-01
Time-dependent diffusion coefficients arise from anomalous diffusion encountered in many physical systems such as protein transport in cells. We compare these coefficients with those arising from analysis of stochastic processes with memory that go beyond fractional Brownian motion. Facilitated by the Hida white noise functional integral approach, diffusion propagators or probability density functions (pdf) are obtained and shown to be solutions of modified diffusion equations with time-dependent diffusion coefficients. This should be useful in the study of complex transport processes.
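For background, the standard anomalous-diffusion bookkeeping (a textbook relation, not the paper's white-noise derivation) connects a power-law mean-squared displacement to a time-dependent diffusion coefficient:

    \[
    \langle x^2(t)\rangle \propto t^{\alpha}
    \;\Longrightarrow\;
    D(t) \propto \alpha\,t^{\alpha-1},
    \qquad
    \frac{\partial p(x,t)}{\partial t} = D(t)\,\frac{\partial^2 p(x,t)}{\partial x^2},
    \]

with 0 < α < 1 giving subdiffusion, α > 1 superdiffusion, and α = 1 recovering ordinary Brownian motion with a constant coefficient.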
Multi-threaded Event Processing with DANA
DOE Office of Scientific and Technical Information (OSTI.GOV)
David Lawrence; Elliott Wolin
2007-05-14
The C++ data analysis framework DANA has been written to support the next generation of Nuclear Physics experiments at Jefferson Lab commensurate with the anticipated 12 GeV upgrade. The DANA framework was designed to allow multi-threaded event processing with a minimal impact on developers of reconstruction software. This document describes how DANA implements multi-threaded event processing and compares it to simply running multiple instances of a program. Also presented are relative reconstruction rates for Pentium 4-, Xeon-, and Opteron-based machines.
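The queue-based worker pattern that underlies such frameworks can be illustrated in a few lines; this is a Python analogue for exposition only, not DANA's C++ API:

    import queue, threading

    def reconstruct(event):
        return sum(event)            # placeholder for real reconstruction code

    def worker(events, results):
        while True:
            event = events.get()
            if event is None:        # sentinel shuts the worker down
                break
            results.put(reconstruct(event))

    events, results = queue.Queue(), queue.Queue()
    threads = [threading.Thread(target=worker, args=(events, results)) for _ in range(4)]
    for t in threads: t.start()
    for e in ([1, 2], [3, 4], [5, 6]): events.put(e)
    for _ in threads: events.put(None)
    for t in threads: t.join()       # results now holds one output per event

The point of the pattern, as in DANA, is that reconstruct() itself stays single-threaded from the developer's point of view.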
Lewis, Grace E. M.; Gross, Andrew J.; Kasprzyk‐Hordern, Barbara; Lubben, Anneke T.
2015-01-01
An electrochemical flow cell with a boron‐doped diamond dual‐plate microtrench electrode has been developed and demonstrated for hydroquinone flow injection electroanalysis in phosphate buffer pH 7. Using the electrochemical generator‐collector feedback detector improves the sensitivity by one order of magnitude (when compared to a single working electrode detector). The diffusion process is switched from an analyte consuming “external” process to an analyte regenerating “internal” process with benefits in selectivity and sensitivity. PMID:25735831
ERIC Educational Resources Information Center
Schulz, Wolfram
2005-01-01
The process of political socialisation of adolescents includes more than the acquisition of knowledge about society, citizenship and the political system. In a democracy, citizens are expected to participate actively in the political process. Active participation, however, requires citizens to believe in their own ability to influence the course…
Process research of non-Cz silicon material
NASA Technical Reports Server (NTRS)
Campbell, R. B.
1983-01-01
The simultaneous diffusion of liquid boron and liquid phosphorus dopants into N-type dendritic silicon web for solar cells was investigated. It is planned that the diffusion parameters required to achieve the desired P(+)NN(+) cell structure be determined and the resultant cell properties be compared to cells produced in a sequential diffusion process. A cost analysis of the simultaneous junction formation process is proposed.
van Mierlo, Pieter; Lie, Octavian; Staljanssens, Willeke; Coito, Ana; Vulliémoz, Serge
2018-04-26
We investigated the influence of processing steps in the estimation of multivariate directed functional connectivity during seizures recorded with intracranial EEG (iEEG) on seizure-onset zone (SOZ) localization. We studied the effect of (i) the number of nodes, (ii) time series normalization, (iii) the choice of multivariate time-varying connectivity measure: Adaptive Directed Transfer Function (ADTF) or Adaptive Partial Directed Coherence (APDC) and (iv) graph theory measure: outdegree or shortest path length. First, simulations were performed to quantify the influence of the various processing steps on the accuracy to localize the SOZ. Afterwards, the SOZ was estimated from a 113-electrode iEEG seizure recording and compared with the resection that rendered the patient seizure-free. The simulations revealed that ADTF is preferred over APDC to localize the SOZ from ictal iEEG recordings. Normalizing the time series before analysis resulted in an increase of 25-35% in correctly localized SOZ, while adding more nodes to the connectivity analysis led to a moderate decrease of 10%, when comparing 128 with 32 input nodes. The real-seizure connectivity estimates localized the SOZ inside the resection area using the ADTF coupled to outdegree or shortest path length. Our study showed that normalizing the time series is an important pre-processing step, while adding nodes to the analysis only marginally affected the SOZ localization. The study shows that directed multivariate Granger-based connectivity analysis is feasible with many input nodes (>100) and that normalization of the time series before connectivity analysis is preferred.
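Assuming a directed connectivity matrix has already been estimated (the ADTF computation itself is not shown), the two steps the study emphasizes, per-channel normalization and outdegree ranking, are short; the conventions below are illustrative:

    import numpy as np

    def zscore(timeseries):
        """Per-channel normalization, the pre-processing step found to matter most."""
        mu = timeseries.mean(axis=1, keepdims=True)
        sd = timeseries.std(axis=1, keepdims=True)
        return (timeseries - mu) / sd

    def outdegree_ranking(conn):
        """Rank nodes as SOZ candidates; conn[i, j] = influence of node i on node j."""
        out = conn.sum(axis=1) - np.diag(conn)   # total outgoing influence per node
        return np.argsort(out)[::-1]             # strongest drivers first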
Surface roughness analysis of SiO2 for PECVD, PVD and IBD on different substrates
NASA Astrophysics Data System (ADS)
Amirzada, Muhammad Rizwan; Tatzel, Andreas; Viereck, Volker; Hillmer, Hartmut
2016-02-01
This study compares the surface roughness of SiO2 thin layers deposited by three different processes (plasma-enhanced chemical vapor deposition, physical vapor deposition and ion beam deposition) on three different substrates (glass, Si and polyethylene naphthalate). Plasma-enhanced chemical vapor deposition (PECVD) processes using a wide range of deposition temperatures, from 80 to 300 °C, were applied and compared. It was observed that the nature of the substrate has little influence on the surface roughness of the grown layers. It was also observed that the surface roughness increases as the deposition temperature of the PECVD process increases, owing to the increase in surface diffusion length with rising substrate temperature. The layers deposited on Si wafers by the ion beam deposition (IBD) process were found to be smoother than those produced by the other two techniques. The layers deposited on glass substrates using PECVD revealed the highest surface roughness values in comparison with the other substrate materials and techniques. Different existing models describing the dynamics of clusters on surfaces are compared and discussed.
Bronchart, Filip; De Paepe, Michel; Dewulf, Jo; Schrevens, Eddie; Demeyer, Peter
2013-04-15
In Flanders and the Netherlands, greenhouse production systems produce economically important quantities of vegetables, fruit and ornamentals. Indoor environmental control has resulted in high primary energy use. Until now, research on saving primary energy in greenhouse systems has been based mainly on analysis of energy balances. However, according to thermodynamic theory, an analysis based on the concepts of exergy (free energy) and energy can yield new insights and primary energy savings. In this paper, we therefore analyse the exergy and energy of various processes, inputs and outputs of a general greenhouse system. A total-system analysis is then performed by linking the exergy analysis with a dynamic greenhouse climate growth simulation model. The exergy analysis indicates that some processes ("Sources") lie at the origin of several other processes, with both destroying the exergy of primary energy inputs. The exergy destruction of these Sources is caused primarily by heat and vapour loss. Their impact can be compensated by exergy input from heating, solar radiation, or both. If the exergy destruction of these Sources is reduced, the necessary compensation can also be reduced. This can be accomplished by insulating the greenhouse and making the building more airtight. Other necessary Sources, namely transpiration and loss of CO2, have a low exergy destruction compared to the other Sources. They are therefore the best candidates for "pump" technologies ("vapour heat pump" and "CO2 pump") designed to have a low primary energy use. The combination of these proposed technologies results in an exergy-efficient greenhouse with the highest primary energy savings. It can be concluded that exergy analysis adds information compared to energy analysis alone and supports the development of primary-energy-efficient greenhouse systems. Copyright © 2013 Elsevier Ltd. All rights reserved.
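The quantity driving this reasoning is the exergy content of a heat flow, given by the standard Carnot factor (a background relation, not a result of the paper):

    \[
    Ex_Q = Q\left(1 - \frac{T_0}{T}\right),
    \]

where Q is heat transferred at temperature T and T_0 is the ambient (dead-state) temperature. Heat lost near ambient temperature carries almost no exergy, which is why low-grade heat and vapour losses dominate the exergy destruction of the Sources.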
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choi, J.; Mazumder, J.
1996-12-31
Networking three fields of welding (thermal, microstructure, and stress) was attempted and produced a reliable model using a numerical method with the finite element analysis technique. Model prediction was compared with experimental data in order to validate the model. The effects of welding process parameters on these welding fields were analyzed and reported. The effort to correlate the residual stress and solidification was initiated, with some valuable results. The solidification process was simulated using the formulation based on the Hunt-Trivedi model. Based on the temperature history, solidification speed and primary dendrite arm spacing were predicted at given nodes of interest. Results show that the variation during solidification is usually within an order of magnitude. The temperature gradient was generally in the range of 10^4-10^5 K/m for the given welding conditions (welding power = 6 kW and welding speed = 3.3867 to 7.62 mm/s), while solidification speed appeared to slow from an order of 10^-1 to 10^-2 m/s during solidification. SEM images revealed that the primary dendrite arm spacing (PDAS) fell in the range of 10^1-10^2 µm. For grain growth at the heat affected zone (HAZ), Ashby's model was employed. The prediction was in agreement with experimental results. For the residual stress calculation, the same mesh generation used in the heat transfer analysis was applied to make the simulation consistent. The analysis consisted of a transient heat analysis followed by a thermal stress analysis. An experimentally measured strain history was compared with the simulated result. The relationship between microstructure and the stress/strain field of welding was also obtained. 64 refs., 18 figs., 9 tabs.
Flight Operations Analysis Tool
NASA Technical Reports Server (NTRS)
Easter, Robert; Herrell, Linda; Pomphrey, Richard; Chase, James; Wertz Chen, Julie; Smith, Jeffrey; Carter, Rebecca
2006-01-01
Flight Operations Analysis Tool (FLOAT) is a computer program that partly automates the process of assessing the benefits of planning spacecraft missions to incorporate various combinations of launch vehicles and payloads. Designed primarily for use by an experienced systems engineer, FLOAT makes it possible to perform a preliminary analysis of trade-offs and costs of a proposed mission in days, whereas previously, such an analysis typically lasted months. FLOAT surveys a variety of prior missions by querying data from authoritative NASA sources pertaining to 20 to 30 mission and interface parameters that define space missions. FLOAT provides automated, flexible means for comparing the parameters to determine compatibility or the lack thereof among payloads, spacecraft, and launch vehicles, and for displaying the results of such comparisons. Sparseness, typical of the data available for analysis, does not confound this software. FLOAT effects an iterative process that identifies modifications of parameters that could render compatible an otherwise incompatible mission set.
Analysis of cold worked holes for structural life extension
NASA Technical Reports Server (NTRS)
Wieland, David H.; Cutshall, Jon T.; Burnside, O. Hal; Cardinal, Joseph W.
1994-01-01
Cold working holes for improved fatigue life of fastener holes are widely used on aircraft. This paper presents methods used by the authors to determine the percent of cold working to be applied and to analyze fatigue crack growth of cold worked fastener holes. An elastic, perfectly-plastic analysis of a thick-walled tube is used to determine the stress field during the cold working process and the residual stress field after the process is completed. The results of the elastic/plastic analysis are used to determine the amount of cold working to apply to a hole. The residual stress field is then used to perform damage tolerance analysis of a crack growing out of a cold worked fastener hole. This analysis method is easily implemented in existing crack growth computer codes so that the cold worked holes can be used to extend the structural life of aircraft. Analytical results are compared to test data where appropriate.
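A classical entry point for such an analysis is the yield-onset pressure of a thick-walled tube under internal pressure (the textbook Tresca result, not the authors' full elastic, perfectly-plastic residual-stress solution):

    \[
    p_Y = \frac{\sigma_Y}{2}\left(1 - \frac{a^2}{b^2}\right),
    \]

with inner radius a, outer radius b and yield stress σ_Y. Cold working expands the hole at a pressure above p_Y, so that elastic unloading leaves a compressive residual hoop stress at the bore, which is what retards fatigue crack growth.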
Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul
2013-11-01
This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
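The two statistical building blocks, Monte Carlo propagation and standardized regression coefficients (SRC), can be sketched generically; the two-parameter model below is a stand-in, not the KDP crystallization model:

    import numpy as np

    rng = np.random.default_rng(0)

    def model(theta):
        kb, g = theta                # stand-ins for nucleation and growth constants
        return kb ** 1.5 + 2.0 * g   # placeholder for the process model output

    # Monte Carlo: sample uncertain inputs and propagate them through the model.
    samples = rng.normal(loc=[1.0, 0.5], scale=[0.1, 0.05], size=(2000, 2))
    outputs = np.array([model(t) for t in samples])

    # SRC: regress the standardized output on the standardized inputs.
    X = (samples - samples.mean(0)) / samples.std(0)
    y = (outputs - outputs.mean()) / outputs.std()
    src, *_ = np.linalg.lstsq(X, y, rcond=None)   # larger |SRC| = more influential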
Semi-automation of Doppler Spectrum Image Analysis for Grading Aortic Valve Stenosis Severity.
Niakšu, O; Balčiunaitė, G; Kizlaitis, R J; Treigys, P
2016-01-01
Doppler echocardiography analysis has become a gold standard in the modern diagnosis of heart diseases. In this paper, we propose a set of techniques for semi-automated parameter extraction for aortic valve stenosis severity grading. The main objective of the study is to create echocardiography image-processing techniques that minimize the manual image-processing work of clinicians and reduce human error rates. Aortic valve and left ventricular outflow tract spectrogram images were processed and analyzed. A novel method was developed to trace systoles and to extract diagnostically relevant features. The results of the introduced method were compared with the findings of the participating cardiologists. The experimental results showed that the accuracy of the proposed method is comparable to the manual measurements performed by medical professionals. Linear regression analysis of the calculated parameters against the measurements manually obtained by the cardiologists resulted in strongly correlated values: R2 for peak systolic velocity and mean pressure gradient both equal to 0.99, with means' differences of 0.02 m/s and 4.09 mmHg, respectively, and R2 for aortic valve area of 0.89 with a means' difference of 0.19 mm between the two methods. The introduced Doppler echocardiography image-processing method can be used as computer-aided assistance in aortic valve stenosis diagnostics. In future work, we intend to improve the precision of left ventricular outflow tract spectrogram measurements and apply data mining methods to propose a clinical decision support system for diagnosing aortic valve stenosis.
The effects of facial color and inversion on the N170 event-related potential (ERP) component.
Minami, T; Nakajima, K; Changvisommid, L; Nakauchi, S
2015-12-17
Faces are important for social interaction because much can be perceived from facial details, including a person's race, age, and mood. Recent studies have shown that both configural (e.g. face shape and inversion) and surface information (e.g. surface color and reflectance properties) are important for face perception. Therefore, the present study examined the effects of facial color and inverted face properties on event-related potential (ERP) responses, particularly the N170 component. Stimuli consisted of natural and bluish-colored faces. Faces were presented in both upright and upside down orientations. An ANOVA was used to analyze N170 amplitudes and verify the effects of the main independent variables. Analysis of N170 amplitude revealed the significant interactions between stimulus orientation and color. Subsequent analysis indicated that N170 was larger for bluish-colored faces than natural-colored faces, and N170 to natural-colored faces was larger in response to inverted stimulus as compared to upright stimulus. Additionally, a multivariate pattern analysis (MVPA) investigated face-processing dynamics without any prior assumptions. Results distinguished, above chance, both facial color and orientation from single-trial electroencephalogram (EEG) signals. Decoding performance for color classification of inverted faces was significantly diminished as compared to an upright orientation. This suggests that processing orientation is predominant over facial color. Taken together, the present findings elucidate the temporal and spatial distribution of orientation and color processing during face processing. Copyright © 2015 IBRO. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Vikram, K. Arun; Ratnam, Ch; Lakshmi, VVK; Kumar, A. Sunny; Ramakanth, RT
2018-02-01
Meta-heuristic multi-response optimization methods are widely used to solve multi-objective problems and obtain Pareto-optimal solutions. This work focuses on the optimal multi-response evaluation of process parameters in generating responses such as surface roughness (Ra), surface hardness (H) and tool vibration displacement amplitude (Vib) while performing tangential and orthogonal turn-mill operations on an A-axis Computer Numerical Control vertical milling center. Tool speed, feed rate and depth of cut are considered as process parameters, with brass machined under dry conditions using high-speed steel end milling cutters following a Taguchi design of experiments (DOE). A meta-heuristic, the dragonfly algorithm, is used to optimize the objectives 'Ra', 'H' and 'Vib' and identify the optimal multi-response process parameter combination. The results obtained from the multi-objective dragonfly algorithm (MODA) are then compared with another multi-response optimization technique, grey relational analysis (GRA).
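Grey relational analysis, the benchmark technique mentioned last, is compact enough to sketch directly; the distinguishing coefficient ζ = 0.5 is the conventional choice, and the response-matrix layout is an assumption:

    import numpy as np

    def grey_relational_grade(resp, larger_better, zeta=0.5):
        """resp: rows are experimental runs, columns are responses."""
        resp = np.asarray(resp, dtype=float)
        norm = np.empty_like(resp)
        for j in range(resp.shape[1]):               # normalize each response to [0, 1]
            lo, hi = resp[:, j].min(), resp[:, j].max()
            norm[:, j] = (resp[:, j] - lo) / (hi - lo) if larger_better[j] \
                         else (hi - resp[:, j]) / (hi - lo)
        delta = 1.0 - norm                            # deviation from the ideal value
        coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
        return coeff.mean(axis=1)                     # higher grade = better run overall

For the three responses here one would call grey_relational_grade(data, larger_better=[False, True, False]), since Ra and Vib are smaller-the-better while H is larger-the-better.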
Comparison of approaches for mobile document image analysis using server supported smartphones
NASA Astrophysics Data System (ADS)
Ozarslan, Suleyman; Eren, P. Erhan
2014-03-01
With the recent advances in mobile technologies, new capabilities are emerging, such as mobile document image analysis. However, mobile phones are still less powerful than servers, and they have some resource limitations. One approach to overcoming these limitations is performing the resource-intensive processes of the application on remote servers. In mobile document image analysis, the most resource-consuming process is the Optical Character Recognition (OCR) process, which is used to extract text from images captured by mobile phones. In this study, our goal is to compare the in-phone and remote server processing approaches for mobile document image analysis in order to explore their trade-offs. In the in-phone approach, all processes required for mobile document image analysis run on the mobile phone. In the remote server approach, the core OCR process runs on the remote server and the other processes run on the mobile phone. Results of the experiments show that the remote server approach is considerably faster than the in-phone approach in terms of OCR time, but adds extra delays such as network delay. Since compression and downscaling of images significantly reduce file sizes and extra delays, the remote server approach overall outperforms the in-phone approach in terms of the selected speed and correct recognition metrics, provided the gain in OCR time compensates for the extra delays. According to the results of the experiments, using the most preferable settings, the remote server approach performs better than the in-phone approach in terms of speed and acceptable correct recognition metrics.
NASA Technical Reports Server (NTRS)
Nairn, John A.
1992-01-01
A combined analytical and experimental study was conducted to analyze microcracking, microcrack-induced delamination, and longitudinal splitting in polymer matrix composites. Strain energy release rates, calculated by a variational analysis, were used in a failure criterion to predict microcracking. Predictions and test results were compared for static, fatigue, and cyclic thermal loading. The longitudinal splitting analysis accounted for the effects of fiber bridging. Test data are analyzed and compared for longitudinal splitting and delamination under mixed-mode loading. This study emphasizes the importance of using fracture mechanics analyses to understand the complex failure processes that govern composite strength and life.
Comparability of river suspended-sediment sampling and laboratory analysis methods
Groten, Joel T.; Johnson, Gregory D.
2018-03-06
Accurate measurements of suspended sediment, a leading water-quality impairment in many Minnesota rivers, are important for managing and protecting water resources; however, water-quality standards for suspended sediment in Minnesota are based on grab field sampling and total suspended solids (TSS) laboratory analysis methods that have underrepresented concentrations of suspended sediment in rivers compared to U.S. Geological Survey equal-width-increment or equal-discharge-increment (EWDI) field sampling and suspended sediment concentration (SSC) laboratory analysis methods. Because of this underrepresentation, the U.S. Geological Survey, in collaboration with the Minnesota Pollution Control Agency, collected concurrent grab and EWDI samples at eight sites to compare results obtained using different combinations of field sampling and laboratory analysis methods. Study results determined that grab field sampling and TSS laboratory analysis results were biased substantially low compared to EWDI sampling and SSC laboratory analysis results, respectively. Differences in both field sampling and laboratory analysis methods caused grab and TSS methods to be biased substantially low; the difference attributable to laboratory analysis methods was slightly greater than that attributable to field sampling methods. Sand-sized particles had a strong effect on the comparability of the field sampling and laboratory analysis methods: grab field sampling and TSS laboratory analysis methods fail to capture most of the sand being transported by the stream, and differences are smaller when samples collected by grab field sampling are compared on the basis of TSS and the fines fraction of SSC. Even though differences are present, the strong correlations between SSC and TSS concentrations provide the opportunity to develop site-specific relations to address transport processes not captured by grab field sampling and TSS laboratory analysis methods.
Silverman, Merav H.; Jedd, Kelly; Luciana, Monica
2015-01-01
Behavioral responses to, and the neural processing of, rewards change dramatically during adolescence and may contribute to observed increases in risk-taking during this developmental period. Functional MRI (fMRI) studies suggest differences between adolescents and adults in neural activation during reward processing, but findings are contradictory, and effects have been found in non-predicted directions. The current study uses an activation likelihood estimation (ALE) approach for quantitative meta-analysis of functional neuroimaging studies to: 1) confirm the network of brain regions involved in adolescents’ reward processing, 2) identify regions involved in specific stages (anticipation, outcome) and valence (positive, negative) of reward processing, and 3) identify differences in activation likelihood between adolescent and adult reward-related brain activation. Results reveal a subcortical network of brain regions involved in adolescent reward processing similar to that found in adults with major hubs including the ventral and dorsal striatum, insula, and posterior cingulate cortex (PCC). Contrast analyses find that adolescents exhibit greater likelihood of activation in the insula while processing anticipation relative to outcome and greater likelihood of activation in the putamen and amygdala during outcome relative to anticipation. While processing positive compared to negative valence, adolescents show increased likelihood for activation in the posterior cingulate cortex (PCC) and ventral striatum. Contrasting adolescent reward processing with the existing ALE of adult reward processing (Liu et al., 2011) reveals increased likelihood for activation in limbic, frontolimbic, and striatal regions in adolescents compared with adults. Unlike adolescents, adults also activate executive control regions of the frontal and parietal lobes. These findings support hypothesized elevations in motivated activity during adolescence. PMID:26254587
Micko, B; Lusceac, S A; Zimmermann, H; Rössler, E A
2013-02-21
We study the main (α-) and secondary (β-) relaxation in the plastically crystalline (PC) phase of cyanocyclohexane by various 2H nuclear magnetic resonance (NMR) methods (line-shape, spin-lattice relaxation, stimulated echo, and two-dimensional spectra) above and below the glass transition temperature T(g) = 134 K. Our results regarding the α-process demonstrate that molecular motion is not governed by the symmetry of the lattice. Rather it is similar to the one reported for structural glass formers and can be modeled by a reorientation proceeding via a distribution of small and large angular jumps. A solid-echo line-shape analysis regarding the β-process below T(g) yields again very similar results when compared to those of the structural glass formers ethanol and toluene. Hence we cannot confirm an intramolecular origin for the β-process in cyanocyclohexane. The fast β-process in the PC phase allows for the first time a detailed 2H NMR study of the process also at T > T(g): an additional minimum in the spin-lattice relaxation time reflecting the β-process is found. Furthermore the solid-echo spectra show a distinct deviation from the rigid limit Pake pattern, which allows a direct determination of the temperature dependent spatial restriction of the process. In Part II of this work, a quantitative analysis is carried out, where we demonstrate that within the model of a "wobbling in a cone" the mean cone angle increases above T(g) and the corresponding relaxation strength is compared to dielectric results.
Schenone, Mauro; Ziebarth, Sarah; Duncan, Jose; Stokes, Lea; Hernandez, Angela
2018-02-05
We aimed to investigate the proportion of documented ultrasound findings that were unsupported by stored ultrasound images in the obstetric ultrasound unit, before and after the implementation of a quality improvement process consisting of a checklist and feedback. The quality improvement process involved utilization of a checklist and feedback from physician to sonographer; the feedback was based on the physician's review of the report and images using the checklist. To assess the impact of this process, two groups were compared. Group 1 consisted of 58 ultrasound reports created prior to initiation of the process. Group 2 included 65 ultrasound reports created after process implementation. Each chart was reviewed by a physician and a sonographer. Findings considered unsupported by stored images by both reviewers were used for analysis, and the proportion of unsupported findings was compared between the two groups. Results are expressed as mean ± standard error, with p < .05 used to determine statistical significance. Univariate analysis of baseline characteristics and potential confounders showed no statistically significant difference between the groups. The mean proportion of unsupported findings in Group 1 was 5.1 ± 0.87, with Group 2 having a significantly lower proportion (2.6 ± 0.62; p = .018). These results suggest a significant decrease in the proportion of unsupported findings in ultrasound reports after quality improvement process implementation. Thus, we present a simple yet effective quality improvement process to reduce unsupported ultrasound findings.
NASA Astrophysics Data System (ADS)
Winiwarter, Susanne; Middleton, Brian; Jones, Barry; Courtney, Paul; Lindmark, Bo; Page, Ken M.; Clark, Alan; Landqvist, Claire
2015-09-01
We demonstrate here a novel use of statistical tools to study intra- and inter-site assay variability of five early drug metabolism and pharmacokinetics in vitro assays over time. Firstly, a tool for process control is presented. It shows the overall assay variability but allows also the following of changes due to assay adjustments and can additionally highlight other, potentially unexpected variations. Secondly, we define the minimum discriminatory difference/ratio to support projects to understand how experimental values measured at different sites at a given time can be compared. Such discriminatory values are calculated for 3 month periods and followed over time for each assay. Again assay modifications, especially assay harmonization efforts, can be noted. Both the process control tool and the variability estimates are based on the results of control compounds tested every time an assay is run. Variability estimates for a limited set of project compounds were computed as well and found to be comparable. This analysis reinforces the need to consider assay variability in decision making, compound ranking and in silico modeling.
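A minimal version of such a control-compound check might look as follows; this sketch assumes approximately log-normal assay values and mean ± 3 SD limits, not the authors' exact tool:

    import numpy as np

    def control_limits(history):
        """3-SD control limits computed on the log scale of past control results."""
        logs = np.log(np.asarray(history, dtype=float))
        mu, sd = logs.mean(), logs.std(ddof=1)
        return np.exp(mu - 3 * sd), np.exp(mu + 3 * sd)

    def out_of_control(history, new_value):
        lo, hi = control_limits(history)
        return not (lo <= new_value <= hi)   # True flags a run for investigation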
NASA Astrophysics Data System (ADS)
Li, Junye; Hu, Jinglei; Wang, Binyu; Sheng, Liang; Zhang, Xinming
2018-03-01
To investigate the effect of abrasive flow polishing on variable-diameter pipe parts, high-precision dispensing needles were taken as the research object and the polishing process was simulated numerically. The distributions of dynamic pressure and turbulence viscosity of the abrasive flow field inside the needle were analyzed under different volume-fraction conditions. The comparative analysis demonstrated the effectiveness of abrasive-grain polishing of high-precision dispensing needles: controlling the volume fraction of silicon carbide changes the viscosity characteristics of the abrasive flow during polishing, so that the polishing quality of the abrasive grains can be controlled.
Analysis of Electrowetting Dynamics with Level Set Method
NASA Astrophysics Data System (ADS)
Park, Jun Kwon; Hong, Jiwoo; Kang, Kwan Hyoung
2009-11-01
Electrowetting is a versatile tool for handling tiny droplets and forms a backbone of digital microfluidics. Numerical analysis is necessary to fully understand the dynamics of electrowetting, especially in designing electrowetting-based liquid lenses and reflective displays. We developed a numerical method to analyze general contact-line problems, incorporating dynamic contact angle models. The method was applied to the analysis of the spreading process of a sessile droplet under step input voltages in electrowetting. The result was compared with experimental data and with an analytical result based on the spectral method. It is shown that contact-line friction significantly affects the contact-line motion and the oscillation amplitude. The pinning process of the contact line was well represented by including the hysteresis effect in the contact angle models.
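For context, the equilibrium behaviour that such dynamic simulations extend is commonly described by the Young-Lippmann equation (a standard relation, not this paper's numerical model):

    \[
    \cos\theta(V) = \cos\theta_0 + \frac{\varepsilon_0\varepsilon_r}{2\gamma d}\,V^2,
    \]

where θ_0 is the contact angle at zero voltage, γ the surface tension of the droplet interface, and d and ε_r the thickness and relative permittivity of the dielectric layer under the droplet.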
Efficient Workflows for Curation of Heterogeneous Data Supporting Modeling of U-Nb Alloy Aging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ward, Logan Timothy; Hackenberg, Robert Errol
These are slides from a presentation summarizing a graduate research associate's summer project. The following topics are covered in these slides: data challenges in materials, aging in U-Nb Alloys, Building an Aging Model, Different Phase Trans. in U-Nb, the Challenge, Storing Materials Data, Example Data Source, Organizing Data: What is a Schema?, What does a "XML Schema" look like?, Our Data Schema: Nice and Simple, Storing Data: Materials Data Curation System (MDCS), Problem with MDCS: Slow Data Entry, Getting Literature into MDCS, Staging Data in Excel Document, Final Result: MDCS Records, Analyzing Image Data, Process for Making TTT Diagram, Bottleneck Number 1: Image Analysis, Fitting a TTP Boundary, Fitting a TTP Curve: Comparable Results, How Does it Compare to Our Data?, Image Analysis Workflow, Curating Hardness Records, Hardness Data: Two Key Decisions, Before Peak Age? - Automation, Interactive Viz, Which Transformation?, Microstructure-Informed Model, Tracking the Entire Process, General Problem with Property Models, Pinyon: Toolkit for Managing Model Creation, Tracking Individual Decisions, Jupyter: Docs and Code in One File, Hardness Analysis Workflow, Workflow for Aging Models, and conclusions.
Virtual reality measures in neuropsychological assessment: a meta-analytic review.
Neguț, Alexandra; Matu, Silviu-Andrei; Sava, Florin Alin; David, Daniel
2016-02-01
Virtual reality-based assessment is a new paradigm for neuropsychological evaluation that might provide a more ecological assessment than paper-and-pencil or computerized neuropsychological testing. Previous research has focused on the use of virtual reality in neuropsychological assessment, but no meta-analysis has focused on the sensitivity of virtual reality-based measures of cognitive processes in various populations. We found 18 studies that compared cognitive performance between clinical patients and healthy controls on virtual reality measures. Based on a random effects model, the results indicated a large effect size in favor of healthy controls (g = .95). For executive functions, memory and visuospatial analysis, subgroup analysis revealed moderate to large effect sizes, with superior performance by healthy controls. Participants' mean age, type of clinical condition, type of exploration within virtual reality environments, and the presence of distractors were significant moderators. Our findings support the sensitivity of virtual reality-based measures in detecting cognitive impairment. They highlight the possibility of using virtual reality measures for neuropsychological assessment in research applications, as well as in clinical practice.
An alternative respiratory sounds classification system utilizing artificial neural networks.
Oweis, Rami J; Abdulhay, Enas W; Khayal, Amer; Awad, Areen
2015-01-01
Computerized lung sound analysis involves recording lung sounds via an electronic device, followed by computer analysis and classification based on specific signal characteristics such as non-linearity and non-stationarity caused by air turbulence. An automatic analysis is necessary to avoid dependence on expert skills. This work revolves around exploiting autocorrelation in the feature extraction stage. All process stages were implemented in MATLAB. The classification was performed comparatively using both the artificial neural network (ANN) and adaptive neuro-fuzzy inference system (ANFIS) toolboxes. The methods were applied to 10 different respiratory sounds for classification. The ANN outperformed the ANFIS system, returning superior performance parameters: its accuracy, specificity, and sensitivity were 98.6%, 100%, and 97.8%, respectively, showing superiority to many recent approaches. The proposed method is an efficient, fast tool for the intended purpose, as manifested in the performance parameters, specifically accuracy, specificity, and sensitivity. Furthermore, utilizing the autocorrelation function for feature extraction in such applications results in enhanced performance and avoids undesired computational complexities compared to other techniques.
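The autocorrelation-based feature extraction stage can be sketched as follows; this is a Python illustration of the general idea rather than the authors' MATLAB implementation, and the number of lags kept is an assumption:

    import numpy as np

    def autocorr_features(frame, n_lags=32):
        """Normalized autocorrelation of a lung-sound frame, truncated to n_lags."""
        x = np.asarray(frame, dtype=float) - np.mean(frame)
        acf = np.correlate(x, x, mode="full")[len(x) - 1:]   # non-negative lags only
        return acf[:n_lags] / acf[0]                          # normalize by lag-0 energy

The resulting fixed-length vector would then be fed to the ANN (or ANFIS) classifier.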
Statistical analysis for validating ACO-KNN algorithm as feature selection in sentiment analysis
NASA Astrophysics Data System (ADS)
Ahmad, Siti Rohaidah; Yusop, Nurhafizah Moziyana Mohd; Bakar, Azuraliza Abu; Yaakub, Mohd Ridzwan
2017-10-01
This research paper proposes a hybrid of the ant colony optimization (ACO) and k-nearest neighbor (KNN) algorithms as a feature selection method for selecting relevant features from customer review datasets. Information gain (IG), the genetic algorithm (GA), and rough set attribute reduction (RSAR) were used as baseline algorithms in a performance comparison with the proposed algorithm. This paper also discusses the significance test used to evaluate the performance differences between the ACO-KNN, IG-GA, and IG-RSAR algorithms. The study evaluated the performance of the ACO-KNN algorithm using precision, recall, and F-score, validated with parametric statistical significance tests. The evaluation statistically confirmed that the ACO-KNN algorithm improved significantly over the baseline algorithms. In addition, the experimental results showed that ACO-KNN can be used as a feature selection technique in sentiment analysis to obtain a high-quality, optimal feature subset that represents the actual data in customer review datasets.
All-inkjet-printed thin-film transistors: manufacturing process reliability by root cause analysis
Sowade, Enrico; Ramon, Eloi; Mitra, Kalyan Yoti; Martínez-Domingo, Carme; Pedró, Marta; Pallarès, Jofre; Loffredo, Fausta; Villani, Fulvia; Gomes, Henrique L.; Terés, Lluís; Baumann, Reinhard R.
2016-01-01
We report on the detailed electrical investigation of all-inkjet-printed thin-film transistor (TFT) arrays, focusing on TFT failures and their origins. The TFT arrays were manufactured on flexible polymer substrates in ambient conditions, without the need for a cleanroom environment or inert atmosphere, and at a maximum temperature of 150 °C. Alternative manufacturing processes for electronic devices, such as inkjet printing, suffer from lower accuracy compared to traditional microelectronic manufacturing methods. Furthermore, printing methods usually do not allow the manufacturing of electronic devices with high yield (a high number of functional devices); in general, the manufacturing yield is much lower than with established conventional methods based on lithography. The focus of this contribution is therefore a comprehensive analysis of defective TFTs printed by inkjet technology. Based on root cause analysis, we categorize the failures and discuss the reasons for the defects. This procedure identifies failure origins and allows optimization of the manufacturing process, finally resulting in a yield improvement. PMID:27649784
Dyrlund, Thomas F; Poulsen, Ebbe T; Scavenius, Carsten; Sanggaard, Kristian W; Enghild, Jan J
2012-09-01
Data processing and analysis of proteomics data are challenging and time consuming. In this paper, we present MS Data Miner (MDM) (http://sourceforge.net/p/msdataminer), a freely available web-based software solution aimed at minimizing the time required for the analysis, validation, data comparison, and presentation of data files generated in MS software, including Mascot (Matrix Science), Mascot Distiller (Matrix Science), and ProteinPilot (AB Sciex). The program was developed to significantly decrease the time required to process large proteomic data sets for publication. This open-source system includes a spectra validation system and an automatic screenshot generation tool for Mascot-assigned spectra. In addition, a Gene Ontology term analysis function and a tool for generating comparative Excel data reports are included. We illustrate the benefits of MDM during a proteomics study comprising more than 200 LC-MS/MS analyses recorded on an AB Sciex TripleTOF 5600, identifying more than 3000 unique proteins and 3.5 million peptides. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Simões, Bárbara Dos Santos; Cardoso, Letícia de Oliveira; Benseñor, Isabela Judith Martins; Schmidt, Maria Inês; Duncan, Bruce Bartholow; Luft, Vivian Cristine; Molina, Maria Del Carmen Bisi; Barreto, Sandhi Maria; Levy, Renata Bertazzi; Giatti, Luana
2018-03-05
The objective of the study was to estimate the contribution of ultra-processed foods to total caloric intake and investigate whether it differs according to socioeconomic position. We analyzed baseline data from the Brazilian Longitudinal Study of Adult Health (ELSA-Brasil 2008-2010; N = 14,378) and data on dietary intake using a food frequency questionnaire, assigning foods to three categories: unprocessed or minimally processed foods and processed culinary ingredients, processed foods, and ultra-processed foods. We measured the associations between socioeconomic position (education, per capita household income, and occupational social class) and the percentage of caloric contribution of ultra-processed foods, using generalized linear regression models adjusted for age and sex. Unprocessed or minimally processed foods and processed culinary ingredients contributed 65.7% of total caloric intake, followed by ultra-processed foods (22.7%). After adjustments, the percentage of caloric contribution of ultra-processed foods was 20% lower among participants with incomplete elementary school when compared to postgraduates. Compared to individuals from the upper income class, the caloric contribution of ultra-processed foods was 10%, 15% and 20% lower among those in the three lowest income classes, respectively. The caloric contribution of ultra-processed foods was also 7%, 12%, 12%, and 17% lower among participants in the lowest occupational social classes compared to those from the highest social class. The results suggest that the caloric contribution of ultra-processed foods is higher among individuals from high socioeconomic positions, with a dose-response relationship in the associations.
Lei, Tianli; Chen, Shifeng; Wang, Kai; Zhang, Dandan; Dong, Lin; Lv, Chongning; Wang, Jing; Lu, Jincai
2018-02-01
Bupleuri Radix is a commonly used herb in the clinic, and both raw and vinegar-baked Bupleuri Radix are documented in the Pharmacopoeia of the People's Republic of China. According to the theories of traditional Chinese medicine, Bupleuri Radix possesses different therapeutic effects before and after processing; however, the chemical mechanism of this processing is still unknown. In this study, ultra-high-performance liquid chromatography with quadrupole time-of-flight mass spectrometry coupled with multivariate statistical analysis, including principal component analysis and orthogonal partial least squares-discriminant analysis, was developed to holistically compare raw and vinegar-baked Bupleuri Radix for the first time. As a result, 50 peaks were detected in the raw and processed Bupleuri Radix, respectively, and a total of 49 compounds were identified. Saikosaponin a, saikosaponin d, saikosaponin b3, saikosaponin e, saikosaponin c, saikosaponin b2, saikosaponin b1, 4''-O-acetyl-saikosaponin d, hyperoside and 3',4'-dimethoxyquercetin were identified as potential markers distinguishing raw and vinegar-baked Bupleuri Radix. The method was successfully applied to the global analysis of raw and vinegar-processed samples. Furthermore, the underlying hepatoprotective mechanism of Bupleuri Radix was predicted and related to the changes in chemical profiling. Copyright © 2017 John Wiley & Sons, Ltd.
Arocha, Mariana A; Basilio, Juan; Llopis, Jaume; Di Bella, Enrico; Roig, Miguel; Ardu, Stefano; Mayoral, Juan R
2014-07-01
The aim of this study was to determine, using a spectrophotometer, the colour stainability of two indirect CAD/CAM processed composites in comparison with two conventionally laboratory-processed composites after 4 weeks of immersion in staining solutions (coffee, black tea and red wine), with distilled water as the control. Two indirect CAD/CAM composites (Lava Ultimate and Paradigm MZ100) and two conventionally laboratory-processed composites (SR Adoro and Premise Indirect) of shade A2 were selected (160 disc samples). Colour stainability was measured after 4 weeks of immersion in the three staining solutions and distilled water, with specimen colour measured each week by means of a spectrophotometer (CIE L*a*b* system). Statistical analysis was carried out using repeated-measures ANOVA and Tukey's HSD test to evaluate differences in ΔE00 between groups; the interactions among composites, staining solutions and immersion time were also evaluated. All materials showed significant discoloration (p<0.01) compared to the control group. The highest ΔE00 was observed with red wine, whereas black tea showed the lowest. The indirect laboratory-processed resin composites showed the highest colour stability, and the CAD/CAM processed composites immersed in staining solutions showed lower colour stability than the conventionally laboratory-processed resin composites. The demand for CAD/CAM restorations has been increasing; however, the colour stainability of such materials has been insufficiently studied, and no previous work had compared CAD/CAM processed composites with laboratory-processed indirect composites over long immersion periods in staining solutions. Copyright © 2014 Elsevier Ltd. All rights reserved.
Global analysis of the yeast lipidome by quantitative shotgun mass spectrometry.
Ejsing, Christer S; Sampaio, Julio L; Surendranath, Vineeth; Duchoslav, Eva; Ekroos, Kim; Klemm, Robin W; Simons, Kai; Shevchenko, Andrej
2009-02-17
Although the transcriptome, proteome, and interactome of several eukaryotic model organisms have been described in detail, lipidomes remain relatively uncharacterized. Using Saccharomyces cerevisiae as an example, we demonstrate that automated shotgun lipidomics analysis enabled lipidome-wide absolute quantification of individual molecular lipid species by streamlined processing of a single sample of only 2 million yeast cells. By comparative lipidomics, we achieved the absolute quantification of 250 molecular lipid species covering 21 major lipid classes. This analysis provided approximately 95% coverage of the yeast lipidome achieved with 125-fold improvement in sensitivity compared with previous approaches. Comparative lipidomics demonstrated that growth temperature and defects in lipid biosynthesis induce ripple effects throughout the molecular composition of the yeast lipidome. This work serves as a resource for molecular characterization of eukaryotic lipidomes, and establishes shotgun lipidomics as a powerful platform for complementing biochemical studies and other systems-level approaches.
The design of an m-Health monitoring system based on a cloud computing platform
NASA Astrophysics Data System (ADS)
Xu, Boyi; Xu, Lida; Cai, Hongming; Jiang, Lihong; Luo, Yang; Gu, Yizhi
2017-01-01
Compared to traditional medical services provided within hospitals, m-Health monitoring systems (MHMSs) face more challenges in personalised health data processing. To achieve personalised and high-quality health monitoring by means of new technologies, such as mobile network and cloud computing, in this paper, a framework of an m-Health monitoring system based on a cloud computing platform (Cloud-MHMS) is designed to implement pervasive health monitoring. Furthermore, the modules of the framework, which are Cloud Storage and Multiple Tenants Access Control Layer, Healthcare Data Annotation Layer, and Healthcare Data Analysis Layer, are discussed. In the data storage layer, a multiple tenant access method is designed to protect patient privacy. In the data annotation layer, linked open data are adopted to augment health data interoperability semantically. In the data analysis layer, the process mining algorithm and similarity calculating method are implemented to support personalised treatment plan selection. These three modules cooperate to implement the core functions in the process of health monitoring, which are data storage, data processing, and data analysis. Finally, we study the application of our architecture in the monitoring of antimicrobial drug usage to demonstrate the usability of our method in personal healthcare analysis.
Rodríguez, Luis F; Li, Changying; Khanna, Madhu; Spaulding, Aslihan D; Lin, Tao; Eckhoff, Steven R
2010-07-01
An engineering economic model, which is mass balanced and compositionally driven, was developed to compare the conventional corn dry-grind process and the pre-fractionation process called quick germ-quick fiber (QQ). In this model, documented in a companion article, the distillers dried grains with solubles (DDGS) price was linked with its protein and fiber content as well as with the long-term average relationship with the corn price. The detailed economic analysis showed that a QQ plant retrofitted from a conventional dry-grind ethanol plant reduces the manufacturing cost of ethanol by 13.5 cents/gallon and has a net present value nearly $4 million greater than the conventional dry-grind plant at an interest rate of 4% over 15 years. Ethanol and feedstock price sensitivity analysis showed that the QQ plant gains more profit from ethanol price increases than the conventional dry-grind ethanol plant. An optimistic analysis of the QQ process suggests that the greater value of the modified DDGS would provide greater resistance to fluctuations in corn price for QQ facilities. This model can be used to provide decision support for ethanol producers. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
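The ranking criterion itself is a plain net-present-value comparison; the sketch below uses illustrative cash flows, not the paper's cost data:

    def npv(rate, cashflows):
        """cashflows[0] is the initial (negative) outlay at year 0."""
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

    conventional = npv(0.04, [-60e6] + [7.0e6] * 15)   # hypothetical dry-grind plant
    quick_germ   = npv(0.04, [-62e6] + [7.6e6] * 15)   # hypothetical QQ retrofit
    print(quick_germ - conventional)                   # positive favors the retrofit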
The search for cognitive terminology: an analysis of comparative psychology journal titles.
Whissell, Cynthia; Abramson, Charles I; Barber, Kelsey R
2013-03-01
This research examines the employment of cognitive or mentalist words in the titles of articles from three comparative psychology journals (Journal of Comparative Psychology, International Journal of Comparative Psychology, Journal of Experimental Psychology: Animal Behavior Processes; 8,572 titles, >100,000 words). The Dictionary of Affect in Language, coupled with a word search of titles, was employed to demonstrate cognitive creep. The use of cognitive terminology increased over time (1940-2010) and the increase was especially notable in comparison to the use of behavioral words, highlighting a progressively cognitivist approach to comparative research. Problems associated with the use of cognitive terminology in this domain include a lack of operationalization and a lack of portability. There were stylistic differences among journals including an increased use of words rated as pleasant and concrete across years for Journal of Comparative Psychology, and a greater use of emotionally unpleasant and concrete words in Journal of Experimental Psychology: Animal Behavior Processes.
Siéssere, S; de Albuquerque Lima, N; Semprini, M; de Sousa, L G; Paulo Mardegan Issa, J; Aparecida Caldeira Monteiro, S; Cecílio Hallak Regalo, S
2009-11-01
The masseter and temporal muscles of patients with maxillary and mandibular osteoporosis were submitted to electromyographic analysis and compared with a control group; the individuals with osteoporosis did not show significantly lower masticatory cycle performance and efficiency compared to the control group during the proposed mastications. The study aimed to examine electromyographically the masseter and temporal muscles of patients with maxillary and mandibular osteoporosis and compare these patients with control patients. Sixty individuals of both genders, with an average age of 53.0 +/- 5 years, took part in the study, distributed in two groups of 30 individuals each: (1) individuals with osteoporosis and (2) control patients, assessed during habitual and non-habitual mastication. The electromyographic apparatus used was a Myosystem-BR1 (DataHomins Technology Ltda.) with five acquisition channels and active differential electrodes. Statistical analysis of the results was performed using SPSS version 15.0 (Chicago, IL, USA). The Student's t test indicated no significant differences (p > 0.05) between the normalized values of the ensemble average obtained in masticatory cycles in the two groups. Based on these results, it was concluded that individuals with osteoporosis did not show significantly lower masticatory cycle performance and efficiency compared to control subjects during habitual and non-habitual mastication. This result is important because it demonstrates the functionality of the complex physiological process of mastication in individuals with osteoporosis at the bones that compose the face.
Inter-laboratory comparison of the in vivo comet assay including three image analysis systems.
Plappert-Helbig, Ulla; Guérard, Melanie
2015-12-01
To compare the extent of potential inter-laboratory variability and the influence of different comet image analysis systems, in vivo comet experiments were conducted using the genotoxicants ethyl methanesulfonate and methyl methanesulfonate. Tissue samples from the same animals were processed and analyzed, including independent slide evaluation by image analysis, in two laboratories with extensive experience in performing the comet assay. The analysis revealed low inter-laboratory experimental variability. Neither the use of different image analysis systems nor the DNA staining procedure (propidium iodide vs. SYBR® Gold) considerably impacted the results or the sensitivity of the assay. In addition, relatively high stability of the staining intensity of propidium iodide-stained slides was found in slides that were refrigerated for over 3 months. In conclusion, following a thoroughly defined protocol and standardized routine procedures ensures that the comet assay is robust and generates comparable results between different laboratories. © 2015 Wiley Periodicals, Inc.
Baumeister, A A; Bacharach, V R; Baumeister, A A
1997-11-01
Controversy about the amount and nature of funding for mental retardation research has persisted since the creation of NICHD. An issue that has aroused considerable debate, within the mental retardation research community as well as beyond, is the distribution of funds between large group research grants, such as the program project (PO1), and the individual grant (RO1). Currently, within the Mental Retardation and Developmental Disabilities Branch, more money is allocated to the PO1 mechanism than to the RO1. We compared the two types of grants, focusing on success rates, productivity, costs, impact, publication practices, and outcomes, and conducted a comparative analysis of biomedical and behavioral research. Other related issues were also considered, including review processes and cost-effectiveness.
Wave power potential in Malaysian territorial waters
NASA Astrophysics Data System (ADS)
Asmida Mohd Nasir, Nor; Maulud, Khairul Nizam Abdul
2016-06-01
To date, Malaysia has used renewable energy technologies such as biomass, solar and hydro energy for power generation and co-generation in the palm oil industry and for the generation of electricity, yet it still lags behind countries that have started to harness waves for similar production. Wave power is renewable energy (RE) transported by ocean waves. It is very eco-friendly and readily accessible. This paper presents an assessment of wave power potential in Malaysian territorial waters, including the waters of Sabah and Sarawak. In this research, data from the Malaysia Meteorology Department (MetMalaysia) is used, supported by satellite imagery obtained from the National Aeronautics and Space Administration (NASA) and the Malaysia Remote Sensing Agency (ARSM) within the time range of 1992 until 2007. Two types of analyses were conducted: mask analysis and comparative analysis. Mask analysis of a research area is conducted to filter out restricted and sensitive areas, while comparative analysis is conducted to determine the areas with the greatest potential for wave power generation. The four comparative analyses carried out were wave power analysis, comparative analysis of wave energy power with the sea topography, hot-spot area analysis and comparative analysis of wave energy with the wind speed. These four analyses underwent clipping processes using a Geographic Information System (GIS) to obtain the final result. At the end of this research, the most suitable areas to develop a wave energy converter were found to be the waters of Terengganu and Sarawak. It was also concluded that the average potential energy that can be generated in Malaysian territorial waters is between 2.8 kW/m and 8.6 kW/m.
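For orientation, the quoted kW/m figures can be read against the standard deep-water wave-energy-flux relation (a textbook formula, not taken from the paper), which gives the power per metre of wave crest from the significant wave height H_s (in metres) and the energy period T_e (in seconds):

    % Deep-water wave energy flux per metre of wave crest
    P = \frac{\rho g^{2}}{64\pi} H_s^{2} T_e
      \approx 0.49\, H_s^{2} T_e \quad [\mathrm{kW/m}]

A sea state of H_s = 1 m and T_e = 6 s, for example, yields roughly 3 kW/m, consistent with the lower end of the 2.8-8.6 kW/m range reported above.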
A Meta-Analysis and Review of Holistic Face Processing
Richler, Jennifer J.; Gauthier, Isabel
2014-01-01
The concept of holistic processing is a cornerstone of face recognition research, yet central questions related to holistic processing remain unanswered, and debates have thus far failed to reach a resolution despite accumulating empirical evidence. We argue that a considerable source of confusion in this literature stems from a methodological problem. Specifically, two different measures of holistic processing based on the composite paradigm (complete design and partial design) are used in the literature, but they often lead to qualitatively different results. First, we present a comprehensive review of the work that directly compares the two designs, and which clearly favors the complete design over the partial design. Second, we report a meta-analysis of holistic face processing according to both designs, and use this as further evidence for one design over the other. The meta-analysis effect size of holistic processing in the complete design is nearly three times that of the partial design. Effect sizes were not correlated between measures, consistent with the suggestion that they do not measure the same thing. Our meta-analysis also examines the correlation between conditions in the complete design of the composite task, and suggests that in an individual differences context, little is gained by including a misaligned baseline. Finally, we offer a comprehensive review of the state of knowledge about holistic processing based on evidence gathered from the measure we favor based on the first sections of our review—the complete design—and outline outstanding research questions in that new context. PMID:24956123
Qualitative and quantitative interpretation of SEM image using digital image processing.
Saladra, Dawid; Kopernik, Magdalena
2016-10-01
The aim of this study is the improvement of qualitative and quantitative analysis of scanning electron microscope micrographs through the development of a computer program that enables automatic crack analysis of scanning electron microscopy (SEM) micrographs. Micromechanical tests of pneumatic ventricular assist devices result in a large number of micrographs; therefore, the analysis must be automatic. Tests for athrombogenic titanium nitride/gold coatings deposited on polymeric substrates (Bionate II) are performed. These tests include microshear, microtension and fatigue analysis. Anisotropic surface defects observed in the SEM micrographs require support for qualitative and quantitative interpretation. Improvement of qualitative analysis of scanning electron microscope images was achieved by a set of computational tools that includes binarization, simplified expanding, expanding, simple image statistic thresholding, the Laplacian 1 and Laplacian 2 filters, Otsu thresholding and reverse binarization. Several modifications of the known image processing techniques and combinations of the selected techniques were applied. The introduced quantitative analysis of digital scanning electron microscope images enables computation of stereological parameters such as area, crack angle, crack length, and total crack length per unit area. This study also compares the functionality of the developed digital image processing program with existing applications. The described pre- and postprocessing may be helpful in scanning electron microscopy and transmission electron microscopy surface investigations. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
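A minimal sketch (not the authors' program) of this kind of pipeline, using Otsu binarization and skeletonization to estimate stereological crack parameters; the pixel size argument is a placeholder:

    # Hypothetical crack quantification for an SEM micrograph: dark cracks
    # are segmented by Otsu thresholding, then thinned to one-pixel-wide
    # skeletons whose pixel count approximates total crack length.
    import numpy as np
    from skimage import io, filters, morphology, measure

    def crack_stats(path, pixel_size_um=1.0):
        img = io.imread(path, as_gray=True)
        cracks = img < filters.threshold_otsu(img)  # cracks darker than coating
        skeleton = morphology.skeletonize(cracks)
        total_length_um = skeleton.sum() * pixel_size_um
        area_um2 = img.size * pixel_size_um ** 2
        return {
            "crack_area_fraction": float(cracks.mean()),
            "crack_count": int(measure.label(cracks).max()),
            "length_per_area": total_length_um / area_um2,
        }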
Direct microscopic image and measurement of the atomization process of a port fuel injector
NASA Astrophysics Data System (ADS)
Esmail, Mohamed; Kawahara, Nobuyuki; Tomita, Eiji; Sumida, Mamoru
2010-07-01
The main objective of this study is to observe and investigate the phenomena of atomization, i.e. the fuel break-up process very close to the nozzle exit of a practical port fuel injector (PFI). In order to achieve this objective, direct microscopic images of the atomization process were obtained using an ultra-high-speed video camera that could record 102 frames at rates of up to 1 Mfps, coupled with a long-distance microscope and Barlow lens. The experiments were carried out using a PFI in a closed chamber at atmospheric pressure. Time-series images of the spray behaviour were obtained with a high temporal resolution using backlighting. The direct microscopic images of a liquid column break-up were compared with experimental results from laser-induced exciplex fluorescence (LIEF), and the wavelength obtained from the experimental results was compared with that predicted from the Kelvin-Helmholtz break-up model. The droplet size diameters from a ligament break-up were compared with results predicted from Weber's analysis. Furthermore, experimental results of the mean droplet diameter from a direct microscopic image were compared with the results obtained from phase Doppler anemometry (PDA) experiments. Three conclusions were obtained from this study. The atomization processes and detailed characterizations of the break-up of a liquid column were identified; the direct microscopic image results were in good agreement with the results obtained from LIEF, and the experimental results of the wavelength were in good agreement with those from the Kelvin-Helmholtz break-up model. The break-up process of liquid ligaments into droplets was investigated, and Weber's analysis of the predicted droplet diameter from ligament break-up was found to be applicable only at larger wavelengths. Finally, the direct microscopic image method and PDA method give qualitatively similar trends for droplet size distribution and quantitatively similar values of Sauter mean diameter.
Face-elicited ERPs and affective attitude: brain electric microstate and tomography analyses.
Pizzagalli, D; Lehmann, D; Koenig, T; Regard, M; Pascual-Marqui, R D
2000-03-01
Although behavioral studies have demonstrated that normative affective traits modulate the processing of facial and emotionally charged stimuli, direct electrophysiological evidence for this modulation is still lacking. Event-related potential (ERP) data associated with personal, traitlike approach- or withdrawal-related attitude (assessed post-recording and 14 months later) were investigated in 18 subjects during task-free (i.e. unrequested, spontaneous) emotional evaluation of faces. Temporal and spatial aspects of 27-channel ERP were analyzed with microstate analysis and low resolution electromagnetic tomography (LORETA), a new method to compute three-dimensional cortical current density implemented in the Talairach brain atlas. Microstate analysis showed group differences 132-196 and 196-272 ms poststimulus, with right-shifted electric gravity centers for subjects with negative affective attitude. During these personality-modulated, face-elicited microstates (reliably identifiable across subjects), LORETA revealed activation of bilateral occipito-temporal regions, reportedly associated with facial configuration extraction processes. Negative compared to positive affective attitude showed higher activity in right temporal regions; positive compared to negative attitude showed higher activity in left temporo-parieto-occipital regions. These temporal and spatial aspects suggest that the subject groups differed in brain activity at early, automatic, stimulus-related face processing steps when structural face encoding (configuration extraction) occurs. In sum, the brain functional microstates associated with affect-related personality features modulate brain mechanisms during face processing already at early information processing stages.
Siemann, Julia; Herrmann, Manfred; Galashan, Daniela
2018-01-25
The present study examined whether feature-based cueing affects early or late stages of flanker conflict processing using EEG and fMRI. Feature cues either directed participants' attention to the upcoming colour of the target or were neutral. Validity-specific modulations during interference processing were investigated using the N200 event-related potential (ERP) component and BOLD signal differences. Additionally, both data sets were integrated using an fMRI-constrained source analysis. Finally, the results were compared with a previous study in which spatial instead of feature-based cueing was applied to an otherwise identical flanker task. Feature-based and spatial attention recruited a common fronto-parietal network during conflict processing. Irrespective of attention type (feature-based; spatial), this network responded to focussed attention (valid cueing) as well as context updating (invalid cueing), hinting at domain-general mechanisms. However, spatially and non-spatially directed attention also demonstrated domain-specific activation patterns for conflict processing that were observable in distinct EEG and fMRI data patterns as well as in the respective source analyses. Conflict-specific activity in visual brain regions was comparable between both attention types. We assume that the distinction between spatially and non-spatially directed attention types primarily applies to temporal differences (domain-specific dynamics) between signals originating in the same brain regions (domain-general localization).
Just, Sarah; Toschkoff, Gregor; Funke, Adrian; Djuric, Dejan; Scharrer, Georg; Khinast, Johannes; Knop, Klaus; Kleinebudde, Peter
2013-03-01
Coating of solid dosage forms is an important unit operation in the pharmaceutical industry. In recent years, numerical simulations of drug manufacturing processes have been gaining interest as process analytical technology tools. The discrete element method (DEM) in particular is suitable to model tablet-coating processes. For the development of accurate simulations, information on the material properties of the tablets is required. In this study, the mechanical parameters Young's modulus, coefficient of restitution (CoR), and coefficients of friction (CoF) of gastrointestinal therapeutic systems (GITS) and of active-coated GITS were measured experimentally. The dynamic angle of repose of these tablets in a drum coater was investigated to revise the CoF. The resulting values were used as input data in DEM simulations to compare simulation and experiment. A mean value of Young's modulus of 31.9 MPa was determined by the uniaxial compression test. The CoR was found to be 0.78. For both tablet-steel and tablet-tablet friction, active-coated GITS showed a higher CoF compared with GITS. According to the values of the dynamic angle of repose, the CoF was adjusted to obtain consistent tablet motion in the simulation and in the experiment. On the basis of this experimental characterization, mechanical parameters are integrated into DEM simulation programs to perform numerical analysis of coating processes.
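The abstract does not detail the measurement procedures; as one illustration, a coefficient of restitution such as the reported 0.78 can be inferred from a simple drop test using the standard relation e = sqrt(h1/h0) (an assumed method, not necessarily the one used in the study):

    # Drop-test estimate of the coefficient of restitution: a tablet
    # dropped from height h0 that rebounds to h1 has e = sqrt(h1 / h0).
    import math

    def restitution_from_drop(h0_mm, h1_mm):
        return math.sqrt(h1_mm / h0_mm)

    e = restitution_from_drop(100.0, 61.0)  # ~0.78: rebound to ~61% of h0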
Do compensation processes impair mental health? A meta-analysis.
Elbers, Nieke A; Hulst, Liesbeth; Cuijpers, Pim; Akkermans, Arno J; Bruinvels, David J
2013-05-01
Victims who are involved in a compensation process generally have more health complaints compared to victims who are not involved in a compensation process. Previous research regarding the effect of compensation processes has concentrated on the effect on physical health. This meta-analysis focuses on the effect of compensation processes on mental health. Prospective cohort studies addressing compensation and mental health after traffic accidents, occupational accidents or medical errors were identified using PubMed, EMBASE, PsycInfo, CINAHL, and the Cochrane Library. Relevant studies published between January 1966 and 10 June 2011 were selected for inclusion. Ten studies were included. The first finding was that the compensation group already had more mental health complaints at baseline compared to the non-compensation group (standardised mean difference (SMD)=-0.38; 95% confidence interval (CI) -0.66 to -0.10; p=.01). The second finding was that mental health between baseline and post measurement improved less in the compensation group compared to the non-compensation group (SMD=-0.35; 95% CI -0.70 to -0.01; p=.05). However, the quality of evidence was limited, mainly because of low quality study design and heterogeneity. Being involved in a compensation process is associated with more mental health complaints, but three-quarters of the difference appeared to be already present at baseline. The findings of this study should be interpreted with caution because of the limited quality of evidence. The difference at baseline may be explained by a selection bias or more anger and blame about the accident in the compensation group. The difference between baseline and follow-up may be explained by secondary gain and secondary victimisation. Future research should involve assessment of exposure to compensation processes, should analyse and correct for baseline differences, and could examine the effect of time, compensation scheme design, and claim settlement on (mental) health. Copyright © 2011 Elsevier Ltd. All rights reserved.
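For orientation, a minimal sketch of the standardized-mean-difference arithmetic behind pooled estimates such as SMD = -0.38, shown with Hedges' small-sample correction (the paper's exact meta-analytic weighting scheme is not reproduced here):

    # Standardized mean difference (Hedges' g) and its 95% CI for one
    # study comparing a compensation and a non-compensation group.
    import math

    def hedges_g(m1, sd1, n1, m2, sd2, n2):
        sp = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2)
                       / (n1 + n2 - 2))          # pooled standard deviation
        d = (m1 - m2) / sp
        g = d * (1 - 3 / (4 * (n1 + n2) - 9))    # small-sample correction
        se = math.sqrt((n1 + n2) / (n1 * n2) + g**2 / (2 * (n1 + n2)))
        return g, (g - 1.96 * se, g + 1.96 * se)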
Energy efficiency analysis of reactor for torrefaction of biomass with direct heating
NASA Astrophysics Data System (ADS)
Kuzmina, J. S.; Director, L. B.; Shevchenko, A. L.; Zaichenko, V. M.
2016-11-01
This paper presents an energy analysis of a reactor for torrefaction with direct heating of granulated biomass by exhaust gases. Various schemes of gas flow through the reactor zones are presented, and a comparative evaluation of the specific energy consumption for the considered schemes is performed. It is shown that one of the most expensive processes in torrefaction technology is the recycling of pyrolysis gases.
ERIC Educational Resources Information Center
Ostrow, Korinn S.; Wang, Yan; Heffernan, Neil T.
2017-01-01
Data is flexible in that it is molded not only by the features and variables available to a researcher for analysis and interpretation, but also by how those features and variables are recorded and processed prior to evaluation. "Big Data" from online learning platforms and intelligent tutoring systems is no different. The work presented…
Estrogenic modulation of auditory processing: a vertebrate comparison
Caras, Melissa L.
2013-01-01
Sex-steroid hormones are well-known regulators of vocal motor behavior in several organisms. A large body of evidence now indicates that these same hormones modulate processing at multiple levels of the ascending auditory pathway. The goal of this review is to provide a comparative analysis of the role of estrogens in vertebrate auditory function. Four major conclusions can be drawn from the literature: First, estrogens may influence the development of the mammalian auditory system. Second, estrogenic signaling protects the mammalian auditory system from noise- and age-related damage. Third, estrogens optimize auditory processing during periods of reproductive readiness in multiple vertebrate lineages. Finally, brain-derived estrogens can act locally to enhance auditory response properties in at least one avian species. This comparative examination may lead to a better appreciation of the role of estrogens in the processing of natural vocalizations and may provide useful insights toward alleviating auditory dysfunctions emanating from hormonal imbalances. PMID:23911849
A superior edge preserving filter with a systematic analysis
NASA Technical Reports Server (NTRS)
Holladay, Kenneth W.; Rickman, Doug
1991-01-01
A new, adaptive, edge preserving filter for use in image processing is presented. It has superior performance when compared to other filters. Termed the contiguous K-average, it aggregates pixels by examining all pixels contiguous to an existing cluster and adding the pixel closest to the mean of the existing cluster. The process is iterated until K pixels have been accumulated. Rather than simply comparing the visual results of processing with this operator to other filters, approaches were developed which allow quantitative evaluation of how well a filter performs. Particular attention is given to the standard deviation of noise within a feature and the stability of imagery under iterative processing. Demonstrations illustrate the performance of several filters in discriminating against noise and retaining edges, the effect of filtering as a preprocessing step, and the utility of the contiguous K-average filter when used with remote sensing data.
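From the description above, a minimal sketch of the contiguous K-average; 8-connectivity and a grayscale float image are assumptions:

    # Contiguous K-average: grow a cluster from each pixel by repeatedly
    # absorbing the contiguous pixel whose value is closest to the
    # cluster's running mean, then output that mean for the pixel.
    import numpy as np

    def contiguous_k_average(img, k=8):
        h, w = img.shape
        out = np.empty((h, w), dtype=float)
        nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
                (0, 1), (1, -1), (1, 0), (1, 1)]
        for y in range(h):
            for x in range(w):
                cluster, total = {(y, x)}, float(img[y, x])
                while len(cluster) < k:
                    mean = total / len(cluster)
                    # pixels contiguous to the current cluster
                    cand = {(cy + dy, cx + dx)
                            for cy, cx in cluster for dy, dx in nbrs
                            if 0 <= cy + dy < h and 0 <= cx + dx < w} - cluster
                    if not cand:
                        break
                    best = min(cand, key=lambda p: abs(float(img[p]) - mean))
                    cluster.add(best)
                    total += float(img[best])
                out[y, x] = total / len(cluster)
        return out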
Nomura, Aline Tsuma Gaedke; Pruinelli, Lisiane; da Silva, Marcos Barragan; Lucena, Amália de Fátima; Almeida, Miriam de Abreu
2018-03-01
Hospital accreditation is a strategy for the pursuit of quality of care and safety for patients and professionals. Targeted educational interventions could help support this process. This study aimed to evaluate the quality of electronic nursing records during the hospital accreditation process. A retrospective study comparing 112 nursing records during the hospital accreditation process was conducted. Educational interventions were implemented, and records were evaluated preintervention and postintervention. Mann-Whitney and chi-square tests were used for data analysis. Results showed that there was a significant improvement in the nursing documentation quality postintervention. When comparing records preintervention and postintervention, results showed a statistically significant difference (P < .001) between the two periods. The comparison between items showed that most scores were significant. Findings indicated that educational interventions performed by nurses led to a positive change that improved nursing documentation and, consequently, better care practices.
NASA Astrophysics Data System (ADS)
Chung, T. W.; Chen, C. K.; Hsu, S. H.
2017-11-01
Protein concentration processes using filter membranes have a significant advantage in energy saving compared to traditional drying processes. However, fouling over a large membrane area and frequent membrane cleaning increase the energy consumption and operating cost of a membrane-based protein concentration process. In this study, membrane filtration for protein concentration was conducted and compared with recent protein concentration technology, and the operating factors for the process were analyzed. The separation mechanism of membrane filtration is based on the size difference between the membrane pores and the particles of the filtered material. Darcy's law was applied to relate flux, TMP (transmembrane pressure) and resistance. The effects of membrane pore size, pH value and TMP on the steady-state flux (Jst) and protein rejection (R) were studied. It was observed that Jst increases with decreasing membrane pore size, Jst increases with increasing TMP, and R increases with decreasing solution pH value. Compared to the other variables, the pH value is the most significant variable for separation between protein and water.
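A minimal sketch of the Darcy's-law relation invoked above; splitting the total resistance into membrane and fouling terms in series is an illustrative assumption:

    # Darcy's law for membrane filtration: J = TMP / (mu * R_total),
    # with resistances (1/m) taken in series.
    def permeate_flux(tmp_pa, mu_pa_s, r_membrane, r_fouling=0.0):
        """Flux in m/s, i.e. m^3 per m^2 of membrane per second."""
        return tmp_pa / (mu_pa_s * (r_membrane + r_fouling))

    # Hypothetical numbers: 1 bar TMP, water at 20 C, R_m = 1e12 1/m
    # -> J = 1e-4 m/s, i.e. 360 L/(m^2 h).
    j = permeate_flux(1e5, 1e-3, 1e12)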
Using Dispersed Modes During Model Correlation
NASA Technical Reports Server (NTRS)
Stewart, Eric C.; Hathcock, Megan L.
2017-01-01
The model correlation process for the modal characteristics of a launch vehicle is well established. After a test, parameters within the nominal model are adjusted to reflect structural dynamics revealed during testing. However, a full model correlation process for a complex structure can take months of man-hours and many computational resources. If the analyst only has weeks, or even days, of time in which to correlate the nominal model to the experimental results, then the traditional correlation process is not suitable. This paper describes using model dispersions to assist the model correlation process and decrease the overall cost of the process. The process creates thousands of model dispersions from the nominal model prior to the test and then compares each of them to the test data. Using mode shape and frequency error metrics, one dispersion is selected as the best match to the test data. This dispersion is further improved by using a commercial model correlation software. In the three examples shown in this paper, this dispersion based model correlation process performs well when compared to models correlated using traditional techniques and saves time in the post-test analysis.
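The paper does not spell out its error formulas; a common choice for the mode-shape metric in model correlation is the Modal Assurance Criterion (MAC), sketched below together with a signed relative frequency error:

    # Hypothetical metrics for ranking model dispersions against test
    # data: MAC (1.0 = identical shapes, 0.0 = orthogonal) and a signed
    # relative frequency error.
    import numpy as np

    def mac(phi_analysis, phi_test):
        num = np.abs(phi_analysis @ phi_test) ** 2
        return num / ((phi_analysis @ phi_analysis) * (phi_test @ phi_test))

    def freq_error(f_analysis, f_test):
        return (f_analysis - f_test) / f_test

One could then rank the thousands of dispersions by, for example, the mean MAC over matched mode pairs combined with a penalty on frequency errors.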
Comparative Analysis on Nonlinear Models for RON Gasoline Blending Using Neural Networks
NASA Astrophysics Data System (ADS)
Aguilera, R. Carreño; Yu, Wen; Rodríguez, J. C. Tovar; Mosqueda, M. Elena Acevedo; Ortiz, M. Patiño; Juarez, J. J. Medel; Bautista, D. Pacheco
The blending process, being nonlinear, is difficult to model, since it may change significantly depending on the components and the process variables of each refinery. Different components can be blended depending on the existing stock, and the chemical characteristics of each component change dynamically; components are blended until the specification required by the customer is met for the various properties. One of the most relevant properties is the octane number, which is difficult to control in line (without component storage). Since each refinery process is quite different, a generic gasoline blending model is not useful when in-line blending is to be done in a specific process. A mathematical gasoline blending model is presented in this paper for a given process, described in state space as a basic gasoline blending process description. The objective is to adjust the parameters allowing the blending model to follow a signal in its trajectory, representing the model with the extreme learning machine neural-network method and with the nonlinear autoregressive-moving average (NARMA) neural-network method, so that a comparative study can be developed.
Metcalfe, David; Rockey, Chris; Jefferson, Bruce; Judd, Simon; Jarvis, Peter
2015-12-15
This investigation aimed to compare the disinfection by-product formation potentials (DBPFPs) of three UK surface waters (1 upland reservoir and 2 lowland rivers) with differing characteristics treated by (a) a full scale conventional process and (b) pilot scale processes using a novel suspended ion exchange (SIX) process and inline coagulation (ILCA) followed by ceramic membrane filtration (CMF). Liquid chromatography-organic carbon detection analysis highlighted clear differences between the organic fractions removed by coagulation and suspended ion exchange. Pretreatments which combined SIX and coagulation resulted in significant reductions in dissolved organic carbon (DOC), UV absorbance (UVA), trihalomethane and haloacetic acid formation potential (THMFP, HAAFP), in comparison with the SIX or coagulation process alone. Further experiments showed that in addition to greater overall DOC removal, the processes also reduced the concentration of brominated DBPs and selectively removed organic compounds with high DBPFP. The SIX/ILCA/CMF process resulted in additional removals of DOC, UVA, THMFP, HAAFP and brominated DBPs of 50, 62, 62, 62% and 47% respectively compared with conventional treatment. Copyright © 2015. Published by Elsevier Ltd.
Environmental impact of mushroom compost production.
Leiva, Francisco; Saenz-Díez, Juan-Carlos; Martínez, Eduardo; Jiménez, Emilio; Blanco, Julio
2016-09-01
This research analyses the environmental impact of the production of Agaricus bisporus compost packages. The composting process is the intermediate stage of the mushroom production process, subsequent to the mycelium cultivation stage and prior to the fruiting bodies cultivation stage. A full life cycle assessment model of the Agaricus bisporus composting process has been developed through the identification and analysis of the inputs, outputs and energy consumption of the activities involved in the production process. The study is based on data collected from a plant during a 1 year campaign, thereby obtaining accurate information used to analyse the environmental impact of the process. A global analysis of the main stages shows that the process with the greatest impact in most categories is the compost batch preparation process, owing to the increased consumption of energy resources by the machinery that mixes the raw materials to create the batch. Within the in-tunnel composting stage, the activity with the greatest impact in almost all categories studied is the initial stage of composting, owing to its higher energy consumption compared to the other stages. © 2015 Society of Chemical Industry.
Non-Destructive Analysis of FSW Welds using Ultrasonic Signal Analysis
NASA Astrophysics Data System (ADS)
Pavan Kumar, T.; Prabhakar Reddy, P.
2017-08-01
Friction stir welding is an evolving metal joining technique, mostly used for joining materials which cannot easily be joined by other available welding techniques; it can also be used for welding dissimilar materials. The strength of the weld joint is determined by the way in which the materials mix with each other; since no filler material is used in the welding process, this intermixing is of significant importance. The complication with the friction stir welding process is that many process parameters affect the intermixing, such as tool geometry, tool rotation speed and traverse speed. In this study an attempt is made to compare the material flow and weld quality of various weldments by changing these parameters. Ultrasonic signal analysis is used to characterize the microstructure of the weldments; the use of ultrasonic waves is a non-destructive, accurate and fast way of characterizing microstructure. In this method the relationship between the measured ultrasonic parameters and the microstructure is evaluated using background echo and backscattered signal processing techniques. The ultrasonic velocity and attenuation measurements depend on the elastic modulus, and any change in the microstructure is reflected in the ultrasonic velocity. An insight into material flow is essential to determine the quality of the weld. Hence experiments were conducted to weld dissimilar aluminum alloys, and the weldments were characterized using ultrasonic signal processing as well as scanning electron microscopy. A good correlation was observed between the ultrasonic signal processing results and the scanning electron microscopy of the observed precipitates. Tensile tests and hardness tests were conducted on the weldments and compared to determine the weld quality.
Moral panic, moral regulation, and the civilizing process.
Hier, Sean
2016-09-01
This article compares two analytical frameworks ostensibly formulated to widen the focus of moral panic studies. The comparative analysis suggests that attempts to conceptualize moral panics in terms of decivilizing processes have neither substantively supplemented the explanatory gains made by conceptualizing moral panic as a form of moral regulation nor provided a viable alternative framework that better explains the dynamics of contemporary moral panics. The article concludes that Elias's meta-theory of the civilizing process potentially provides explanatory resources to investigate a possible historical-structural shift towards the so-called age of (a)moral panic; the analytical demands of such a project, however, require a sufficiently different line of inquiry than the one encouraged by both the regulatory and decivilizing perspectives on moral panic. © London School of Economics and Political Science 2016.
Janve, Bhaskar; Yang, Wade; Sims, Charles
2015-06-01
Power ultrasound reduces the traditional corn steeping time from 18 to 1.5 h during tortilla chip dough (masa) processing. This study sought to examine consumer (n = 99) acceptability and quality of tortilla chips made from masa produced by traditional compared with ultrasonic methods. Overall appearance, flavor, and texture acceptability scores were evaluated using a 9-point hedonic scale. The baked chips (process intermediate) before frying and the fried chips (finished product) were analyzed using a texture analyzer and machine vision. Texture values were determined with the 3-point bend test, using breaking force gradient (BFG), peak breaking force (PBF), and breaking distance (BD), and fracturing properties were determined with the crisp fracture support rig, using fracture force gradient (FFG), peak fracture force (PFF), and fracture distance (FD). Machine vision evaluated the total surface area, lightness (L), color difference (ΔE), hue (°h), and chroma (C*). The results were evaluated by analysis of variance, and means were separated using Tukey's test. Machine vision values of L and °h were higher (P < 0.05) and ΔE was lower (P < 0.05) for fried chips, and L and °h were significantly (P < 0.05) higher for baked chips, produced by ultrasonication as compared to traditional processing. Baked chip texture for ultrasonication was significantly higher (P < 0.05) in BFG, BD, PFF, and FD, and fried tortilla chip texture was significantly higher (P < 0.05) in BFG and PFF for ultrasonication than for traditional processing. However, these instrumental differences were not detected in the sensory analysis, supporting the possibility of power ultrasound as a potential tortilla chip processing aid. © 2015 Institute of Food Technologists®
NASA Technical Reports Server (NTRS)
Whitlow, W., Jr.; Bennett, R. M.
1982-01-01
Since the aerodynamic theory is nonlinear, the method requires the coupling of two iterative processes - an aerodynamic analysis and a structural analysis. A full potential analysis code, FLO22, is combined with a linear structural analysis to yield aerodynamic load distributions on and deflections of elastic wings. This method was used to analyze an aeroelastically-scaled wind tunnel model of a proposed executive-jet transport wing and an aeroelastic research wing. The results are compared with the corresponding rigid-wing analyses, and some effects of elasticity on the aerodynamic loading are noted.
The practice of quality-associated costing: application to transfusion manufacturing processes.
Trenchard, P M; Dixon, R
1997-01-01
This article applies the new method of quality-associated costing (QAC) to the mixture of processes that create red cell and plasma products from whole blood donations. The article compares QAC with two commonly encountered but arbitrary models and illustrates the invalidity of clinical cost-benefit analysis based on these models. The first, an "isolated" cost model, seeks to allocate each whole process cost to only one product class. The other is a "shared" cost model, and it seeks to allocate an approximately equal share of all process costs to all associated products.
Exploring the role of auditory analysis in atypical compared to typical language development.
Grube, Manon; Cooper, Freya E; Kumar, Sukhbinder; Kelly, Tom; Griffiths, Timothy D
2014-02-01
The relationship between auditory processing and language skills has been debated for decades. Previous findings have been inconsistent, both in typically developing and impaired subjects, including those with dyslexia or specific language impairment. Whether correlations between auditory and language skills are consistent between different populations has hardly been addressed at all. The present work presents an exploratory approach of testing for patterns of correlations in a range of measures of auditory processing. In a recent study, we reported findings from a large cohort of eleven-year-olds on a range of auditory measures, and the data supported a specific role for the processing of short sequences in pitch and time in typical language development. Here we tested whether a group of individuals with dyslexic traits (DT group; n = 28) from the same year group would show the same pattern of correlations between auditory and language skills as the typically developing group (TD group; n = 173). Regarding the raw scores, the DT group showed a significantly poorer performance on the language but not the auditory measures, including measures of pitch, time and rhythm, and timbre (modulation). In terms of correlations, there was a tendency towards decreased correlations between short-sequence processing and language skills, contrasted by a significant increase in correlation for basic, single-sound processing, in particular in the domain of modulation. The data support the notion that the fundamental relationship between auditory and language skills might differ in atypical compared to typical language development, with the implication that merging data or drawing inferences between populations might be problematic. Further examination of the relationship between both basic sound feature analysis and music-like sound analysis and language skills in impaired populations might allow the development of appropriate training strategies. These might include types of musical training to augment language skills via their common bases in sound sequence analysis. Copyright © 2013 The Authors. Published by Elsevier B.V. All rights reserved.
Simulation based optimization on automated fibre placement process
NASA Astrophysics Data System (ADS)
Lei, Shi
2018-02-01
In this paper, a software-simulation-based method (Autodesk TruPlan & TruFiber) is proposed to optimize the automated fibre placement (AFP) process. Different types of manufacturability analysis are introduced to predict potential defects. Advanced fibre path generation algorithms are compared with respect to geometrically different parts. Major manufacturing data have been taken into consideration prior to tool path generation to achieve a high manufacturing success rate.
1999-03-01
Budget (Oficina Central de Presupuesto [OCEPRE]), which is the presidential agency with overall responsibility to formulate the national budget...Budget (Oficina Central de Presupuesto, OCEPRE), and they receive special treatment in the Venezuelan budgetary process. The OCEPRE is the...the Central Office of Budget (Oficina Central de Presupuesto, OCEPRE). This occurs when funds for weapons acquisitions come from the ordinary budget
2010-01-01
Comparative Effectiveness Research, or other efforts to determine best practices and to develop guidelines based on meta-analysis and evidence-based medicine. An...authoritative reviews or other evidence-based medicine sources, but they have been made unambiguous and computable - a process which sounds...best practice recommendation created through an evidence-based medicine (EBM) development process. The lifecycle envisions four stages of refinement
Neural bases of antisocial behavior: a voxel-based meta-analysis
Inokuchi, Ryota; Nakao, Tomohiro; Yamasue, Hidenori
2014-01-01
Individuals with antisocial behavior place a great physical and economic burden on society. Deficits in emotional processing have been recognized as a fundamental cause of antisocial behavior. Emerging evidence also highlights a significant contribution of attention allocation deficits to such behavior. A comprehensive literature search identified 12 studies that were eligible for inclusion in the meta-analysis, which compared 291 individuals with antisocial problems and 247 controls. Signed Differential Mapping revealed that compared with controls, gray matter volume (GMV) in subjects with antisocial behavior was reduced in the right lentiform nucleus (P < 0.0001), left insula (P = 0.0002) and left frontopolar cortex (FPC) (P = 0.0006), and was increased in the right fusiform gyrus (P < 0.0001), right inferior parietal lobule (P = 0.0003), right superior parietal lobule (P = 0.0004), right cingulate gyrus (P = 0.0004) and the right postcentral gyrus (P = 0.0004). Given the well-known contributions of limbic and paralimbic areas to emotional processing, the observed reductions in GMV in these regions might represent neural correlates of disturbance in emotional processing underlying antisocial behavior. Previous studies have suggested an FPC role in attention allocation during emotional processing. Therefore, GMV deviations in this area may constitute a neural basis of deficits in attention allocation linked with antisocial behavior. PMID:23926170
Cona, Giorgia; Bisiacchi, Patrizia Silvia; Sartori, Giuseppe; Scarpazza, Cristina
2016-05-17
Remembering to execute pre-defined intentions at the appropriate time in the future is typically referred to as Prospective Memory (PM). Studies of PM showed that distinct cognitive processes underlie the execution of delayed intentions depending on whether the cue associated with such intentions is focal to ongoing activity processing or not (i.e., cue focality). The present activation likelihood estimation (ALE) meta-analysis revealed several differences in brain activity as a function of focality of the PM cue. The retrieval of intention is supported mainly by left anterior prefrontal cortex (Brodmann Area, BA 10) in nonfocal tasks, and by cerebellum and ventral parietal regions in focal tasks. Furthermore, the precuneus showed increased activation during the maintenance phase of intentions compared to the retrieval phase in nonfocal tasks, whereas the inferior parietal lobule showed increased activation during the retrieval of intention compared to maintenance phase in the focal tasks. Finally, the retrieval of intention relies more on the activity in anterior cingulate cortex for nonfocal tasks, and on posterior cingulate cortex for focal tasks. Such focality-related pattern of activations suggests that prospective remembering is mediated mainly by top-down and stimulus-independent processes in nonfocal tasks, whereas by more automatic, bottom-up, processes in focal tasks.
Fabrication of lead zirconate titanate actuator via suspension polymerization casting
NASA Astrophysics Data System (ADS)
Miao, Weiguo
2000-10-01
The research presented herein has focused on the fabrication of a lead zirconate titanate (PZT) telescopic actuator via suspension polymerization casting (SPC). Two systems were studied: an acrylamide-based hydrogel, and an acrylate-based nonaqueous system. Analytical tools such as thermomechanical analysis (TMA), differential scanning calorimetry (DSC), chemorheology, thermogravimetric analysis (TGA), and differential thermal analysis (DTA) were used to investigate the polymerization and burnout processes. The acrylamide hydrogel polymerization casting process used hydroxymethyl acrylamide (HMAM) monofunctional monomer with methylenebisacrylamide (MBAM) difunctional monomer, or used methacrylamide (MAM) as the monofunctional monomer. High solid loading PZT slurries with low viscosities were obtained by optimizing the amounts of dispersant and PZT powders. The overall activation energy of gelation was calculated to be 60-76 kJ/mol for the monomer solution; this energy increased to 91 kJ/mol with the addition of PZT powder. The results show that the PZT powder has a retardation effect on gelation. Although several PZT tubes were made using the acrylamide-based system, the demolding and drying difficulties made this process unsuitable for building internal structures, such as the telescopic actuator. The acrylate-based system was used successfully to build the telescopic actuator. Efforts were made to study the influence of composition and experimental conditions on the polymerization process. Temperature was found to have the largest impact on polymerization. To adjust the polymerization temperature and time, initiator and/or catalyst were used. PZT powder has a catalytic effect on the polymerization process. Compared with acrylamide systems, acrylate provided a strong polymer network to support the ceramic green body. This high strength is beneficial for the demolding process, but it can easily cause cracks during the burnout process. To solve the burnout issue, non-reactive decalin was used as a solvent to lower the stress inside the green body. The addition of decalin had no large impact on the polymerization process. With 15 wt% decalin in the monomer solution, the burnout problem was successfully solved. The burnout process was monitored by TGA/DTA and TMA. A 51 vol% PZT filled acrylate slurry was cast into a mold made by stereolithography (SLA), and after curing, the telescopic actuator was removed from the mold. This indirect SLA method provides an efficient way to build ceramic parts. PZT samples were sintered at 1275°C for 4 hours, with density over 98%. SEM analysis showed that the sample made by SPC has a uniform microstructure, which may be beneficial to the electric properties. The sample made by polymerization has a d33 value of about 680 pm/V, which is better than the literature value (580 pm/V). The electric tests showed this telescopic actuator produced a maximum deflection of 24.7 μm at 250 kV/m, in line with theoretical calculations. Compared with actuators made by other methods, the actuator made by SPC provides a comparable structural factor (187.5). The distortion in actuators is caused by fabrication and sintering.
Feasibility basis for use of new solid household waste processing equipment
NASA Astrophysics Data System (ADS)
Vertakova, Y. V.; Zvyagintsev, G. L.; Babich, T. N.; Polozhentseva, Y. S.
2017-10-01
An economic efficiency assessment is given for an innovative organizational project: a solid household waste (SHW) processing enterprise. A distinctive feature of this project is the use of a new mining-and-chemical waste depolymerization technology, in which fuel and resource production takes place in tubular portion modules that are patented and have been validated under laboratory conditions. The main approaches to SHW processing in the world, including Russia, are described; their advantages and disadvantages are revealed, and a comparative analysis resulting in a technology prioritization is carried out. In organizing such an enterprise, it was shown that its functioning results not only in SHW processing but also in environmentally friendly production based on secondary raw materials, whose products can be sold to generate income. The main investment and current expenses necessary for implementing the proposed project are defined, which allows an economic assessment of the innovative enterprise's efficiency.
Rachid, G; El Fadel, M
2013-08-15
This paper presents a SWOT analysis of SEA systems in the Middle East North Africa region through a comparative examination of the status, application and structure of existing systems based on country-specific legal, institutional and procedural frameworks. The analysis is coupled with the multi-attribute decision making method (MADM) within an analytical framework that involves both performance analysis based on predefined evaluation criteria and countries' self-assessment of their SEA system through open-ended surveys. The results show a heterogeneous status with generally delayed progress, characterized by varied levels of weakness embedded in the legal and administrative frameworks and poor integration with the decision-making process. Capitalizing on available opportunities, the paper highlights measures to enhance the development and enactment of SEA in the region. Copyright © 2013 Elsevier Ltd. All rights reserved.
Elite and Status Attainment Models of Inequality of Opportunity
ERIC Educational Resources Information Center
Myles, John F.; Sørensen, Aage B.
1975-01-01
With changes in method, analysis of the process of attainment of various occupations and sub-sets of occupations such as elites can bring about the desired comparability between elite and status attainment studies of equality of opportunity. (Author/AM)
Analysis of ROC on chest direct digital radiography (DR) after image processing in diagnosis of SARS
NASA Astrophysics Data System (ADS)
Lv, Guozheng; Lan, Rihui; Zeng, Qingsi; Zheng, Zhong
2004-05-01
The Severe Acute Respiratory Syndrome (SARS, also called infectious atypical pneumonia), which initially broke out in late 2002, has threatened the public's health seriously. Confirming which patients have contracted SARS has become an urgent diagnostic issue. This paper evaluates the importance of image processing in the early-stage diagnosis of SARS. Receiver operating characteristic (ROC) analysis was employed to compare the diagnostic value of DR images before and after image processing with Symphony software (supplied by E-Com Technology Ltd.); DR image studies of 72 confirmed or suspected SARS patients were reviewed. All images taken from the studied patients were processed by Symphony, and both the original and processed images were taken into ROC analysis, from which the ROC parameters for each group of images were produced: for processed images, a = 1.9745, b = 1.4275, SA = 0.8714; for original images, a = 0.9066, b = 0.8310, SA = 0.7572 (a - intercept, b - slope, SA - area under the curve). The result shows a significant difference between the original and processed images (P < 0.01). In summary, images processed by Symphony are superior to the original ones in detecting opacity lesions, increasing the accuracy of SARS diagnosis.
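The quoted parameters are consistent with the binormal ROC model, under which the area under the curve is Φ(a/√(1+b²)); the sketch below (an illustration, not the authors' code) reproduces both SA values from the quoted a and b:

    # Binormal ROC model: AUC = Phi(a / sqrt(1 + b^2)).
    from scipy.stats import norm

    def binormal_auc(a, b):
        return norm.cdf(a / (1 + b**2) ** 0.5)

    print(binormal_auc(1.9745, 1.4275))  # ~0.8714, processed images
    print(binormal_auc(0.9066, 0.8310))  # ~0.7572, original images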
Valladares Linares, R; Li, Z; Yangali-Quintanilla, V; Ghaffour, N; Amy, G; Leiknes, T; Vrouwenvelder, J S
2016-01-01
In recent years, forward osmosis (FO) hybrid membrane systems have been investigated as an alternative to conventional high-pressure membrane processes (i.e. reverse osmosis (RO)) for seawater desalination and wastewater treatment and recovery. Nevertheless, their economic advantage in comparison to conventional processes for seawater desalination and municipal wastewater treatment has not been clearly addressed. This work presents a detailed economic analysis on capital and operational expenses (CAPEX and OPEX) for: i) a hybrid forward osmosis - low-pressure reverse osmosis (FO-LPRO) process, ii) a conventional seawater reverse osmosis (SWRO) desalination process, and iii) a membrane bioreactor - reverse osmosis - advanced oxidation process (MBR-RO-AOP) for wastewater treatment and reuse. The most important variables affecting economic feasibility are obtained through a sensitivity analysis of a hybrid FO-LPRO system. The main parameters taken into account for the life cycle costs are the water quality characteristics (similar feed water and similar water produced), production capacity of 100,000 m(3) d(-1) of potable water, energy consumption, materials, maintenance, operation, RO and FO module costs, and chemicals. Compared to SWRO, the FO-LPRO systems have a 21% higher CAPEX and a 56% lower OPEX due to savings in energy consumption and fouling control. In terms of the total water cost per cubic meter of water produced, the hybrid FO-LPRO desalination system has a 16% cost reduction compared to the benchmark for desalination, mainly SWRO. Compared to the MBR-RO-AOP, the FO-LPRO systems have a 7% lower CAPEX and 9% higher OPEX, resulting in no significant cost reduction per m(3) produced by FO-LPRO. Hybrid FO-LPRO membrane systems are shown to have an economic advantage compared to current available technology for desalination, and comparable costs with a wastewater treatment and recovery system. Based on development on FO membrane modules, packing density, and water permeability, the total water cost could be further reduced. Copyright © 2015 Elsevier Ltd. All rights reserved.
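To make the CAPEX/OPEX trade-off concrete, a toy amortization sketch; the base costs and plant lifetime below are hypothetical placeholders, not the paper's figures:

    # Cost per cubic meter = amortized CAPEX plus yearly OPEX, divided
    # by yearly production; a higher CAPEX with a much lower OPEX can
    # still reduce the total water cost, as reported for FO-LPRO.
    def water_cost(capex_total, opex_per_year, capacity_m3_day,
                   lifetime_years=25):
        annual_capex = capex_total / lifetime_years  # simple amortization
        return (annual_capex + opex_per_year) / (capacity_m3_day * 365)

    swro = water_cost(150e6, 20e6, 100_000)
    fo_lpro = water_cost(150e6 * 1.21, 20e6 * 0.44, 100_000)  # +21% CAPEX, -56% OPEX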
USAF solar thermal applications overview
NASA Technical Reports Server (NTRS)
Hauger, J. S.; Simpson, J. A.
1981-01-01
Process heat applications were compared to solar thermal technologies. The generic process heat applications were analyzed for solar thermal technology utilization using SERI's PROSYS/ECONOMAT model in an end-use matching analysis, and a separate analysis was made for solar ponds. Solar technologies appear attractive in a large number of applications. Low temperature applications at sites with high insolation and high fuel costs were found to be most attractive. No one solar thermal technology emerges as a clearly universal or preferred technology; however, solar ponds offer a potential high payoff in a few selected applications. It was shown that troughs and flat plate systems are cost effective in a large number of applications.
3-D interactive visualisation tools for HI spectral line imaging
NASA Astrophysics Data System (ADS)
van der Hulst, J. M.; Punzo, D.; Roerdink, J. B. T. M.
2017-06-01
Upcoming HI surveys will deliver such large datasets that automated processing using the full 3-D information to find and characterize HI objects is unavoidable. Full 3-D visualization is an essential tool for enabling qualitative and quantitative inspection and analysis of the 3-D data, which is often complex in nature. Here we present SlicerAstro, an open-source extension of 3DSlicer, a multi-platform open source software package for visualization and medical image processing, which we developed for the inspection and analysis of HI spectral line data. We describe its initial capabilities, including 3-D filtering, 3-D selection and comparative modelling.
The Integration Process of Very Thin Mirror Shells with a Particular Regard to Simbol-X
NASA Astrophysics Data System (ADS)
Basso, S.; Pareschi, G.; Tagliaferri, G.; Mazzoleni, F.; Valtolina, R.; Citterio, O.; Conconi, P.
2009-05-01
The optics of Simbol-X are very thin compared to previous X-ray missions (like XMM); therefore their shells are floppy and unable to maintain the correct shape. To avoid deformation of these very thin X-ray optics during the integration process, we adopt two stiffening rings with good roundness. In this article the procedure used for the first three prototypes of the Simbol-X optics is presented, with a description of the problems involved and an analysis of the degradation of performance during integration. This analysis has been performed with the UV vertical bench measurements at INAF-OAB.
NASA Technical Reports Server (NTRS)
Klemas, V. (Principal Investigator); Bartlett, D.; Rogers, R.; Reed, L.
1974-01-01
The author has identified the following significant results. Analysis of ERTS-1 color composite images using analog processing equipment confirmed that all the major wetlands plant species were distinguishable at ERTS-1 scale. Furthermore, human alterations of the coastal zone were easily recognized, since such alterations typically involve removal of vegetative cover, resulting in a change of spectral signature. The superior spectral resolution of the CCTs as compared with single band or composite imagery provided good discrimination through digital analysis of the CCTs, with the added advantage of rapid production of thematic maps and data.
Bayesian estimation of self-similarity exponent
NASA Astrophysics Data System (ADS)
Makarava, Natallia; Benmehdi, Sabah; Holschneider, Matthias
2011-08-01
In this study we propose a Bayesian approach to the estimation of the Hurst exponent in terms of linear mixed models. Our method is applicable even to unevenly sampled signals and signals with gaps. We test the method using artificial fractional Brownian motion of different lengths and compare it with the detrended fluctuation analysis technique. The estimation of the Hurst exponent of a Rosenblatt process is shown as an example of an H-self-similar process with a non-Gaussian distribution. Additionally, we perform an analysis with real data, the Dow Jones Industrial Average closing values, and analyze the temporal variation of its Hurst exponent.
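For the comparison named above, a minimal sketch of first-order detrended fluctuation analysis (the benchmark method the authors compare against); the scale choices are illustrative:

    # DFA-1: integrate the series, detrend it linearly in windows of
    # size n, and fit log F(n) vs log n; the slope estimates the Hurst
    # exponent for stationary, fractional-Gaussian-noise-like input.
    import numpy as np

    def dfa_hurst(x, scales=(8, 16, 32, 64, 128)):
        y = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))  # profile
        flucts = []
        for n in scales:
            m = len(y) // n
            t = np.arange(n)
            f2 = 0.0
            for i in range(m):
                seg = y[i * n:(i + 1) * n]
                coef = np.polyfit(t, seg, 1)        # local linear trend
                f2 += np.mean((seg - np.polyval(coef, t)) ** 2)
            flucts.append(np.sqrt(f2 / m))
        slope, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
        return slope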
Mathematical Analysis and Optimization of Infiltration Processes
NASA Technical Reports Server (NTRS)
Chang, H.-C.; Gottlieb, D.; Marion, M.; Sheldon, B. W.
1997-01-01
A variety of infiltration techniques can be used to fabricate solid materials, particularly composites. In general these processes can be described with at least one time-dependent partial differential equation describing the evolution of the solid phase, coupled to one or more partial differential equations describing mass transport through a porous structure. This paper presents a detailed mathematical analysis of a relatively simple set of equations which is used to describe chemical vapor infiltration. The results demonstrate that the process is controlled by only two parameters, alpha and beta. The optimization problem associated with minimizing the infiltration time is also considered. Allowing alpha and beta to vary with time leads to significant reductions in the infiltration time, compared with the conventional case where alpha and beta are treated as constants.
Information Theory Broadens the Spectrum of Molecular Ecology and Evolution.
Sherwin, W B; Chao, A; Jost, L; Smouse, P E
2017-12-01
Information or entropy analysis of diversity is used extensively in community ecology, and has recently been exploited for prediction and analysis in molecular ecology and evolution. Information measures belong to a spectrum (or q profile) of measures whose contrasting properties provide a rich summary of diversity, including allelic richness (q=0), Shannon information (q=1), and heterozygosity (q=2). We present the merits of information measures for describing and forecasting molecular variation within and among groups, comparing forecasts with data, and evaluating underlying processes such as dispersal. Importantly, information measures directly link causal processes and divergence outcomes, have straightforward relationship to allele frequency differences (including monotonicity that q=2 lacks), and show additivity across hierarchical layers such as ecology, behaviour, cellular processes, and nongenetic inheritance. Copyright © 2017 Elsevier Ltd. All rights reserved.
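A minimal sketch of that q profile using Hill numbers (effective numbers of types): q=0 gives allelic richness, q=1 the exponential of Shannon information, and q=2 the inverse Simpson concentration, the effective-number counterpart of heterozygosity. The allele frequencies below are hypothetical:

    # Hill number of order q for a frequency vector p.
    import numpy as np

    def hill_number(p, q):
        p = np.asarray(p, dtype=float)
        p = p[p > 0] / p.sum()
        if np.isclose(q, 1.0):
            return float(np.exp(-np.sum(p * np.log(p))))  # q -> 1 limit
        return float(np.sum(p ** q) ** (1.0 / (1.0 - q)))

    freqs = [0.5, 0.3, 0.2]  # hypothetical allele frequencies
    profile = {q: hill_number(freqs, q) for q in (0, 1, 2)}  # 3.0, ~2.80, ~2.63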
Comparative kinetic analysis on thermal degradation of some cephalosporins using TG and DSC data
2013-01-01
Background The thermal decomposition of cephalexin, cefadroxil and cefoperazone under non-isothermal conditions was studied using the TG and DSC methods. For TG, a hyphenated technique including evolved gas analysis (EGA) was used. Results The kinetic analysis was performed using the TG and DSC data in air for the first step of the cephalosporins' decomposition at four heating rates. Both the TG and DSC data were processed, according to an appropriate strategy, with the following kinetic methods: Kissinger-Akahira-Sunose, Friedman, and NPK, in order to obtain realistic kinetic parameters even though the decomposition process is a complex one. The EGA data offer some valuable indications about a possible decomposition mechanism. The obtained data indicate a rather good agreement between the activation energy values obtained by the different methods, whereas the EGA data and the chemical structures give a possible explanation of the observed differences in thermal stability. A complete kinetic analysis needs a data-processing strategy using two or more methods, and the kinetic methods must also be applied to the different types of experimental data (TG and DSC). Conclusion The simultaneous use of DSC and TG data for the kinetic analysis, coupled with evolved gas analysis (EGA), provided a more complete picture of the degradation of the three cephalosporins. It was possible to estimate kinetic parameters by using three different kinetic methods, which allowed us to compare the Ea values obtained from different experimental data, TG and DSC. The thermal degradation being a complex process, both the differential and integral methods based on the single-step hypothesis are inadequate for obtaining reliable kinetic parameters. Only the modified NPK method allowed an objective separation of the influence of temperature and conversion on the reaction rate and, at the same time, ascertained the existence of two simultaneous steps. PMID:23594763
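The Kissinger-Akahira-Sunose method named above reduces to a linear fit: at a fixed conversion, ln(beta/T^2) plotted against 1/T has slope -Ea/R. A hedged Python sketch with illustrative temperatures, not the paper's measurements:

    import numpy as np

    R = 8.314                                        # J/(mol K)
    # temperatures (K) at a fixed conversion for four heating rates;
    # illustrative values, not the paper's data
    beta_rates = np.array([5.0, 10.0, 15.0, 20.0])   # K/min
    T_alpha = np.array([478.0, 487.0, 492.0, 496.0])

    # KAS: ln(beta/T^2) = const - Ea/(R*T) -> Ea from the slope
    slope, _ = np.polyfit(1.0 / T_alpha, np.log(beta_rates / T_alpha**2), 1)
    print(f"apparent activation energy ~ {-slope * R / 1000:.0f} kJ/mol")

Repeating the fit at several conversions gives the isoconversional Ea profile whose variation signals a multi-step mechanism, as the abstract notes.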
Lean energy analysis of CNC lathe
NASA Astrophysics Data System (ADS)
Liana, N. A.; Amsyar, N.; Hilmy, I.; Yusof, MD
2018-01-01
The industrial sector in Malaysia is one of the main sectors with a high percentage of energy demand compared to other sectors, and this may lead to future power shortages and increased production costs. Suitable initiatives, such as improving the machining system, should be implemented by the industrial sector to address these issues. In the past, the majority of energy analyses in industry focused on lighting, HVAC and office usage; the future trend is to include the manufacturing process in the energy analysis as well. A study of Lean Energy Analysis of a machining process is presented. Improving the energy efficiency of a lathe machine by tuning the cutting parameters of the turning process is discussed. The energy consumption of a lathe machine was analyzed in order to identify the effect of cutting parameters on energy consumption. It was found that the combination of parameters in the third run (spindle speed: 1065 rpm, depth of cut: 1.5 mm, feed rate: 0.3 mm/rev) was the most preferred and ideal for the turning process, as it consumed the least energy.
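The energy effect of the cutting parameters can be illustrated with simple arithmetic: cutting time for a pass is the cut length divided by feed rate times spindle speed, and energy is average power times time. In the Python sketch below, the cut length and power draw are assumptions for illustration; only the three parameter settings come from the study.

    # rough energy estimate for one turning pass
    spindle_speed = 1065              # rpm (preferred setting from the study)
    feed_rate = 0.3                   # mm/rev
    depth_of_cut = 1.5                # mm (affects power draw, assumed constant)
    cut_length = 120.0                # mm, assumed
    avg_power_kw = 2.2                # kW, assumed machine power draw

    cutting_time_min = cut_length / (feed_rate * spindle_speed)
    energy_wh = avg_power_kw * 1000 * cutting_time_min / 60.0
    print(f"time {cutting_time_min:.2f} min, energy {energy_wh:.1f} Wh")

Higher feed and speed shorten the cut and can lower total energy even if instantaneous power rises, which is the trade-off the study's parameter runs probe.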
Independent component analysis algorithm FPGA design to perform real-time blind source separation
NASA Astrophysics Data System (ADS)
Meyer-Baese, Uwe; Odom, Crispin; Botella, Guillermo; Meyer-Baese, Anke
2015-05-01
The conditions that arise in the Cocktail Party Problem prevail across many fields, creating a need for Blind Source Separation (BSS). These fields include array processing, communications, medical signal processing, speech processing, wireless communication, audio, acoustics and biomedical engineering. The concept of the cocktail party problem and BSS led to the development of Independent Component Analysis (ICA) algorithms. ICA proves useful for applications needing real-time signal processing. The goal of this research was to perform an extensive study of the ability and efficiency of ICA algorithms to perform blind source separation of mixed signals, in software and in hardware with a Field Programmable Gate Array (FPGA). The Algebraic ICA (A-ICA), Fast ICA, and Equivariant Adaptive Separation via Independence (EASI) ICA algorithms were examined and compared. The best algorithm, judged as the one requiring the least complexity and fewest resources while effectively separating mixed sources, was the EASI algorithm. The EASI ICA was implemented on an FPGA to analyze its performance in real time.
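For reference, the serial EASI update has a compact form: with y = Wx, the unmixing matrix is updated by the relative gradient W <- W - mu*(y y^T - I + g(y) y^T - y g(y)^T) W. The following Python sketch applies it to a synthetic two-channel mixture; the step size, nonlinearity (a cubic, commonly used for sub-Gaussian sources), and test signals are our assumptions, not the paper's FPGA design.

    import numpy as np

    def easi_separate(X, mu=0.002, g=lambda y: y**3):
        # minimal serial EASI update; X has shape (channels, samples)
        n = X.shape[0]
        W = np.eye(n)
        for x in X.T:
            y = W @ x
            gy = g(y)
            # relative-gradient update; g encodes the assumed source model
            W -= mu * (np.outer(y, y) - np.eye(n)
                       + np.outer(gy, y) - np.outer(y, gy)) @ W
        return W

    # two independent sub-Gaussian sources, instantaneous mixture
    t = np.linspace(0, 1, 20000)
    S = np.vstack([np.sign(np.sin(2 * np.pi * 13 * t)),
                   np.sin(2 * np.pi * 7 * t)])
    A = np.array([[1.0, 0.6], [0.4, 1.0]])
    W = easi_separate(A @ S)
    print(np.round(W @ A, 2))  # ideally close to a scaled permutation matrix

The update uses only outer products and matrix multiplies, which is one reason EASI maps well onto FPGA resources.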
Flame analysis using image processing techniques
NASA Astrophysics Data System (ADS)
Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng
2018-04-01
This paper presents image processing techniques using fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics are important in industry for extracting relevant information from flame images. Experiments were carried out in a model industrial burner at different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermoacoustic oscillations and background noise affect flame stability. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. The power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to automatically determine flame stability. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
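The PSD-based stability check described above can be illustrated with a Welch estimate on a synthetic luminosity trace; the frame rate, oscillation frequency, and noise level below are assumptions, not the experimental values.

    import numpy as np
    from scipy.signal import welch

    fs = 125.0                                       # assumed frame rate, Hz
    t = np.arange(0, 20, 1 / fs)
    luminosity = (1.0 + 0.3 * np.sin(2 * np.pi * 17 * t)
                  + 0.1 * np.random.randn(t.size))

    f, psd = welch(luminosity - luminosity.mean(), fs=fs, nperseg=512)
    print(f"dominant oscillation near {f[np.argmax(psd)]:.1f} Hz")
    # a sharp, strong peak suggests a coherent thermoacoustic oscillation;
    # a flat, broadband PSD suggests a steadier flame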
Fast algorithm for spectral processing with application to on-line welding quality assurance
NASA Astrophysics Data System (ADS)
Mirapeix, J.; Cobo, A.; Jaúregui, C.; López-Higuera, J. M.
2006-10-01
A new technique is presented in this paper for the analysis of welding process emission spectra to accurately estimate the plasma electron temperature in real time. The estimation of the electron temperature of the plasma, through the analysis of the emission lines from multiple atomic species, may be used to monitor possible perturbations during the welding process. Unlike traditional techniques, which usually involve fitting peaks to Voigt functions using the recursive Levenberg-Marquardt method, sub-pixel algorithms are used to more accurately estimate the central wavelength of the peaks. Three different sub-pixel algorithms are analysed and compared, and it is shown that the LPO (linear phase operator) sub-pixel algorithm is the better solution within the proposed system. Experimental tests during TIG welding, using a fibre optic to capture the arc light together with a low-cost CCD-based spectrometer, show that some typical defects associated with perturbations in the electron temperature can be easily detected and identified with this technique. A typical processing time for multiple-peak analysis is less than 20 ms running on a conventional PC.
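Once line centers are located, the electron temperature typically follows from a Boltzmann plot: ln(I*lambda/(g*A)) versus upper-level energy is linear with slope -1/(k_B*T_e). The Python sketch below demonstrates the fit on synthetic line intensities; the line data are placeholders, not values from the paper, and this is the standard multi-line method rather than the authors' exact pipeline.

    import numpy as np

    k_B = 8.617e-5                                   # Boltzmann constant, eV/K

    # placeholder line data: upper-level energy, g_k*A_ki, wavelength
    E_upper = np.array([3.83, 4.35, 4.98, 5.46])     # eV
    g_A = np.array([2.8e8, 1.9e8, 6.1e7, 2.4e7])     # s^-1
    lam = np.array([518.4, 481.0, 426.6, 404.4])     # nm
    T_true = 9000.0                                  # K, used to synthesize data
    I = g_A / lam * np.exp(-E_upper / (k_B * T_true))

    # Boltzmann plot: slope of ln(I*lam/(g*A)) vs E_upper is -1/(k_B*T_e)
    slope, _ = np.polyfit(E_upper, np.log(I * lam / g_A), 1)
    print(f"estimated T_e ~ {-1.0 / (k_B * slope):.0f} K")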
Tan, Eric C. D.; Snowden-Swan, Lesley J.; Talmadge, Michael; ...
2016-09-27
This paper presents a comparative techno-economic analysis (TEA) of five conversion pathways from biomass to gasoline-, jet-, and diesel-range hydrocarbons via indirect liquefaction, with a specific focus on pathways utilizing oxygenated intermediates. The four emerging pathways of interest are compared with one conventional pathway (Fischer-Tropsch) for the production of the hydrocarbon blendstocks. The processing steps of the four emerging pathways include biomass-to-syngas via indirect gasification, syngas clean-up, conversion of syngas to alcohols/oxygenates, followed by conversion of alcohols/oxygenates to hydrocarbon blendstocks via dehydration, oligomerization, and hydrogenation. Conversion of biomass-derived syngas to oxygenated intermediates occurs via three different pathways, producing: (i) mixed alcohols over a MoS2 catalyst, (ii) mixed oxygenates (a mixture of C2+ oxygenated compounds, predominantly ethanol, acetic acid, acetaldehyde, ethyl acetate) using an Rh-based catalyst, and (iii) ethanol from syngas fermentation. This is followed by the conversion of oxygenates/alcohols to fuel-range olefins in two approaches: (i) mixed alcohols/ethanol to a 1-butanol-rich mixture via the Guerbet reaction, followed by alcohol dehydration, oligomerization, and hydrogenation, and (ii) mixed oxygenates/ethanol to an isobutene-rich mixture followed by oligomerization and hydrogenation. The design features a processing capacity of 2000 tonnes/day (2205 short tons) of dry biomass. The minimum fuel selling prices (MFSPs) for the four developing pathways range from $3.40 to $5.04 per gasoline-gallon equivalent (GGE), in 2011 US dollars. Sensitivity studies show that MFSPs can be improved with co-product credits and are comparable to the commercial Fischer-Tropsch benchmark ($3.58/GGE). Altogether, this comparative TEA study documents potential economics for the developmental biofuel pathways via mixed oxygenates.
Pichai, Saravanan; Rajesh, M; Reddy, Naveen; Adusumilli, Gopinath; Reddy, Jayaprakash; Joshi, Bhavana
2014-09-01
Skeletal maturation is an integral part of an individual's pattern of growth and development and is a continuous process. Peak growth velocity in standing height is the most valid representation of the rate of overall skeletal growth. Ossification changes of the hand wrist and cervical vertebrae are reliable indicators of an individual's growth status. The objective of this study was to compare skeletal maturation as measured by hand wrist bone analysis and cervical vertebral analysis. Hand wrist radiographs and lateral cephalograms were taken of 72 subjects, male and female, aged between 7 and 16 years, from among the patients visiting the Department of Orthodontics and Dentofacial Orthopedics, R.V. Dental College and Hospital. The 9 stages were reduced to 5 stages for comparison with the cervical vertebral maturation staging of Baccetti et al. The Bjork, Grave and Brown stages were reduced to six intervals for comparison with the cervical vertebral maturational index (CVMI) staging of Hassel and Farman. These measurements were then compared with the hand wrist bone analysis, and the results were statistically analyzed using the Mann-Whitney test. There was no significant difference between the hand wrist analysis and the two different cervical vertebral analyses for assessing skeletal maturation. There was also no significant difference between the two cervical vertebral analyses, but the CVMI method, being a visual method, is less time-consuming. Vertebral analysis on a lateral cephalogram is as valid as hand wrist bone analysis, with the advantage of reducing the radiation exposure of growing subjects.
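The statistical comparison reported above is a standard two-sample rank test. A hedged Python sketch with fabricated stage assignments, not the study's data:

    import numpy as np
    from scipy.stats import mannwhitneyu

    # fabricated maturation stages (1-6) assigned by the two methods
    hand_wrist = np.array([1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6, 6])
    cvmi       = np.array([1, 2, 3, 3, 3, 4, 4, 4, 5, 6, 6, 6])

    stat, p = mannwhitneyu(hand_wrist, cvmi, alternative="two-sided")
    print(f"U = {stat}, p = {p:.3f}")  # p > 0.05 -> no significant difference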
Petzold, Thomas; Hertzschuch, Diana; Elchlep, Frank; Eberlein-Gonska, Maria
2014-01-01
Process management (PM) is a valuable method for the systematic analysis and structural optimisation of the quality and safety of clinical treatment. PM requires high motivation and willingness to implement change from both employees and management. Definition of quality indicators is required to systematically measure the quality of the specified processes. One way to obtain comparable quality results is to use the quality indicators of the external quality assurance in accordance with Sect. 137 SGB V, a method which the Federal Joint Committee (GBA) and the institutions commissioned by the GBA have employed and consistently enhanced for more than ten years. Information on the quality of inpatient treatment is available for 30 defined subjects throughout Germany. The combination of specified processes with quality indicators is beneficial for informing employees. A process-based indicator dashboard provides essential information about the treatment process, which can be used for process analysis. Through continuous review of these indicator results, deviations can be identified and errors remedied quickly. If due consideration is given to these indicators, they can be used for benchmarking to identify potential process improvements. Copyright © 2014. Published by Elsevier GmbH.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tan, Eric; Snowden-Swan, Lesley J.; Talmadge, Michael
This paper presents a comparative techno-economic analysis of five conversion pathways from biomass to gasoline-, jet-, and diesel-range hydrocarbons via indirect liquefaction, with a specific focus on pathways utilizing oxygenated intermediates (derived either via thermochemical or biochemical conversion steps). The four emerging pathways of interest are compared with one conventional pathway (Fischer-Tropsch) for the production of the hydrocarbon blendstocks. The processing steps of the four emerging pathways include: biomass-to-syngas via indirect gasification, gas cleanup, conversion of syngas to alcohols/oxygenates, followed by conversion of alcohols/oxygenates to hydrocarbon blendstocks via dehydration, oligomerization, and hydrogenation. We show that the emerging pathways via oxygenated intermediates have the potential to be cost competitive with the conventional Fischer-Tropsch process. The evaluated pathways and the benchmark process generally exhibit similar fuel yields and carbon conversion efficiencies. The resulting minimum fuel selling prices are comparable to the benchmark at approximately $3.60 per gallon-gasoline equivalent, with the potential for two new pathways to be more economically competitive. Additionally, co-product values can play an important role in the economics of the processes with oxygenated intermediates derived via syngas fermentation. Major cost drivers for the integrated processes are tied to achievable fuel yields and the conversion efficiency of the intermediate steps, i.e., the production of oxygenates/alcohols from syngas and the conversion of oxygenates/alcohols to hydrocarbon fuels.
Analysis of the combustion and pyrolysis of dried sewage sludge by TGA and MS.
Magdziarz, Aneta; Werle, Sebastian
2014-01-01
In this study, the combustion and pyrolysis processes of three sewage sludges were investigated. The sewage sludges came from three wastewater treatment plants. Proximate and ultimate analyses were performed. The thermal behaviour of the studied sewage sludges was investigated by thermogravimetric analysis with mass spectrometry (TGA-MS). The samples were heated from ambient temperature to 800 °C at a constant rate of 10 °C/min in air (combustion process) and argon flows (pyrolysis process). The thermal profiles, presented in the form of TG/DTG curves, were comparable for the studied sludges. All TG/DTG curves were divided into three stages. The main decomposition of sewage sludge during the combustion process took place in the range 180-580 °C with ca. 70% mass loss. The pyrolysis process occurred at lower temperatures but with less mass loss. The evolved gaseous products (H2, CH4, CO2, H2O) from the decomposition of sewage sludge were identified on-line. Copyright © 2013 Elsevier Ltd. All rights reserved.
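The stage-wise mass-loss figures quoted above come from the TG curve and its derivative (DTG). A minimal Python sketch on a synthetic two-step TG trace; the temperatures and step sizes are illustrative only, not the measured sludge data.

    import numpy as np

    T = np.linspace(25, 800, 776)                    # deg C
    tg = (100
          - 35 / (1 + np.exp(-(T - 300) / 20))       # first decomposition step
          - 35 / (1 + np.exp(-(T - 480) / 25)))      # second step, % mass left

    dtg = np.gradient(tg, T)                         # %/deg C
    window = (T >= 180) & (T <= 580)                 # main decomposition range
    loss = tg[window][0] - tg[window][-1]
    print(f"mass loss 180-580 C: {loss:.1f}%, DTG peak at {T[np.argmin(dtg)]:.0f} C")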
Han, Tae Hee; Kim, Moon Jung; Kim, Shinyoung; Kim, Hyun Ok; Lee, Mi Ae; Choi, Ji Seon; Hur, Mina; St John, Andrew
2013-05-01
Failure modes and effects analysis (FMEA) is a risk management tool used by the manufacturing industry but now being applied in laboratories. Teams from six South Korean blood banks used this tool to map their manual and automated blood grouping processes and determine the risk priority numbers (RPNs) as a total measure of error risk. The RPNs determined by each of the teams consistently showed that the use of automation dramatically reduced the RPN compared to manual processes. In addition, FMEA showed where the major risks occur in each of the manual processes and where attention should be prioritized to improve the process. Despite no previous experience with FMEA, the teams found the technique relatively easy to use and the subjectivity associated with assigning risk numbers did not affect the validity of the data. FMEA should become a routine technique for improving processes in laboratories. © 2012 American Association of Blood Banks.
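The RPN arithmetic behind the comparison is simply severity x occurrence x detectability, each scored on a 1-10 scale. A short Python sketch with invented failure modes, not the study's worksheets:

    # RPN = severity x occurrence x detectability, each scored 1-10;
    # the failure modes and scores below are invented for illustration
    failure_modes = [
        ("sample mislabeled at receipt",   9, 4, 5),
        ("manual transcription of result", 7, 5, 6),
        ("automated analyzer result",      7, 2, 2),
    ]

    for name, s, o, d in sorted(failure_modes, key=lambda m: -(m[1] * m[2] * m[3])):
        print(f"RPN {s * o * d:4d}  {name}")
    # automation typically lowers occurrence and improves detection,
    # which is why the automated step scores a far lower RPN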
Effect of food processing on plant DNA degradation and PCR-based GMO analysis: a review.
Gryson, Nicolas
2010-03-01
The applicability of a DNA-based method for GMO detection and quantification depends on the quality and quantity of the DNA. Important food-processing conditions, for example temperature and pH, may lead to degradation of the DNA, rendering PCR analysis impossible or GMO quantification unreliable. This review discusses the effect of several food processes on DNA degradation and subsequent GMO detection and quantification. The data show that, although many of these processes do indeed lead to the fragmentation of DNA, amplification of the DNA may still be possible. The length and composition of the amplicon may, however, affect the result, as may the method of extraction used. Also, many different techniques are used to describe the behaviour of DNA in food processing, which occasionally makes it difficult to compare research results. Further research should be aimed at defining ingredients in terms of their DNA quality and PCR amplification ability, and at elaborating matrix-specific certified reference materials.
The Effects of Pre-processing Strategies for Pediatric Cochlear Implant Recipients
Rakszawski, Bernadette; Wright, Rose; Cadieux, Jamie H.; Davidson, Lisa S.; Brenner, Christine
2016-01-01
Background Cochlear implants (CIs) have been shown to improve children's speech recognition over traditional amplification when severe to profound sensorineural hearing loss is present. Despite improvements, understanding speech at low-level intensities or in the presence of background noise remains difficult. In an effort to improve speech understanding in challenging environments, Cochlear Ltd. offers pre-processing strategies that apply various algorithms prior to mapping the signal to the internal array. Two of these strategies are Autosensitivity Control™ (ASC) and Adaptive Dynamic Range Optimization (ADRO®). Based on previous research, the manufacturer's default pre-processing strategy for pediatric everyday programs combines ASC+ADRO®. Purpose The purpose of this study is to compare pediatric speech perception performance across various pre-processing strategies while applying a specific programming protocol that utilizes increased threshold (T) levels to ensure access to very low-level sounds. Research Design This was a prospective, cross-sectional, observational study. Participants completed speech perception tasks in four pre-processing conditions: no pre-processing, ADRO®, ASC, and ASC+ADRO®. Study Sample Eleven pediatric Cochlear Ltd. cochlear implant users were recruited: six bilateral, one unilateral, and four bimodal. Intervention Four programs, with the participants' everyday map, were loaded into the processor with a different pre-processing strategy applied in each of the four positions: no pre-processing, ADRO®, ASC, and ASC+ADRO®. Data Collection and Analysis Participants repeated CNC words presented at 50 and 70 dB SPL in quiet and HINT sentences presented adaptively with competing R-Space noise at 60 and 70 dB SPL. Each measure was completed as participants listened with each of the four pre-processing strategies listed above. Test order and condition were randomized. A repeated-measures analysis of variance (ANOVA) was used to compare pre-processing strategies across the group data. Critical differences were utilized to determine significant score differences between pre-processing strategies for individual participants. Results For CNC words presented at 50 dB SPL, the group data revealed significantly better scores using ASC+ADRO® compared to all other pre-processing conditions, while ASC resulted in poorer scores compared to ADRO® and ASC+ADRO®. Group data for HINT sentences presented in 70 dB SPL of R-Space noise revealed significantly improved scores using ASC and ASC+ADRO® compared to no pre-processing, with ASC+ADRO® scores being better than ADRO® alone. Group data for CNC words presented at 70 dB SPL and adaptive HINT sentences presented in 60 dB SPL of R-Space noise showed no significant difference among conditions. Individual data showed that the pre-processing strategy yielding the best scores varied across measures and participants. Conclusions Group data reveal an advantage for ASC+ADRO® when speech is presented at lower levels and in higher levels of background noise. Individual data revealed that the optimal pre-processing strategy varied among participants, indicating that a variety of pre-processing strategies should be explored for each CI user considering his or her performance in challenging listening environments. PMID:26905529
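The group analysis described above is a one-way repeated-measures ANOVA with pre-processing condition as the within-subject factor. A hedged Python sketch of that design using statsmodels, with fabricated scores:

    import numpy as np
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    rng = np.random.default_rng(0)
    conditions = ["none", "ADRO", "ASC", "ASC+ADRO"]
    rows = [{"subject": s, "condition": c,
             "score": 60 + 5 * conditions.index(c) + rng.normal(0, 3)}
            for s in range(11) for c in conditions]
    df = pd.DataFrame(rows)

    # one score per subject per condition -> balanced repeated-measures design
    res = AnovaRM(df, depvar="score", subject="subject",
                  within=["condition"]).fit()
    print(res.anova_table)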
Muirhead, David; Aoun, Patricia; Powell, Michael; Juncker, Flemming; Mollerup, Jens
2010-08-01
The need for higher efficiency, maximum quality, and faster turnaround time is a continuous focus for anatomic pathology laboratories and drives changes in work scheduling, instrumentation, and management control systems. To determine the costs of generating routine, special, and immunohistochemical microscopic slides in a large, academic anatomic pathology laboratory using a top-down approach. The Pathology Economic Model Tool was used to analyze workflow processes at The Nebraska Medical Center's anatomic pathology laboratory. Data from the analysis were used to generate complete cost estimates, which included not only materials, consumables, and instrumentation but also specific labor and overhead components for each of the laboratory's subareas. The cost data generated by the Pathology Economic Model Tool were compared with the cost estimates generated using relative value units. Despite the use of automated systems for different processes, the workflow in the laboratory was found to be relatively labor intensive. The effect of labor and overhead on per-slide costs was significantly underestimated by traditional relative-value unit calculations when compared with the Pathology Economic Model Tool. Specific workflow defects with significant contributions to the cost per slide were identified. The cost of providing routine, special, and immunohistochemical slides may be significantly underestimated by traditional methods that rely on relative value units. Furthermore, a comprehensive analysis may identify specific workflow processes requiring improvement.
Effect of relevance on amygdala activation and association with the ventral striatum.
Ousdal, Olga Therese; Reckless, Greg E; Server, Andres; Andreassen, Ole A; Jensen, Jimmy
2012-08-01
While the amygdala historically has been implicated in emotional stimuli processing, recent data suggest a general role in parceling out the relevance of stimuli, regardless of their emotional properties. Using functional magnetic resonance imaging, we tested the relevance hypothesis by investigating human amygdala responses to emotionally neutral stimuli while manipulating their relevance. The task was operationalized as highly relevant if a subsequent opportunity to respond for a reward depended on response accuracy of the task, and less relevant if the reward opportunity was independent of task performance. A region of interest analysis revealed bilateral amygdala activations in response to the high relevance condition compared to the low relevance condition. An exploratory whole-brain analysis yielded robust similar results in bilateral ventral striatum. A subsequent functional connectivity analysis demonstrated increased connectivity between amygdala and ventral striatum for the highly relevant stimuli compared to the less relevant stimuli. These findings suggest that the amygdala's processing profile goes beyond detection of emotions per se, and directly support the proposed role in relevance detection. In addition, the findings suggest a close relationship between amygdala and ventral striatal activity when processing relevant stimuli. Thus, the results may indicate that human amygdala modulates ventral striatum activity and subsequent behaviors beyond that observed for emotional cues, to encompass a broader range of relevant stimuli. Copyright © 2012 Elsevier Inc. All rights reserved.
Electrophoresis gel image processing and analysis using the KODAK 1D software.
Pizzonia, J
2001-06-01
The present article reports on the performance of the KODAK 1D Image Analysis Software for the acquisition of information from electrophoresis experiments and highlights the utility of several mathematical functions for subsequent image processing, analysis, and presentation. Digital images of Coomassie-stained polyacrylamide protein gels containing molecular weight standards and ethidium bromide stained agarose gels containing DNA mass standards are acquired using the KODAK Electrophoresis Documentation and Analysis System 290 (EDAS 290). The KODAK 1D software is used to optimize lane and band identification using features such as isomolecular weight lines. Mathematical functions for mass standard representation are presented, and two methods for estimation of unknown band mass are compared. Given the progressive transition of electrophoresis data acquisition and daily reporting in peer-reviewed journals to digital formats ranging from 8-bit systems such as EDAS 290 to more expensive 16-bit systems, the utility of algorithms such as Gaussian modeling, which can correct geometric aberrations such as clipping due to signal saturation common at lower bit depth levels, is discussed. Finally, image-processing tools that can facilitate image preparation for presentation are demonstrated.
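The Gaussian-modeling correction for clipped (saturated) bands mentioned above can be illustrated by fitting a Gaussian only to the unsaturated pixels of a band profile; the fitted amplitude then estimates the true peak height. A Python sketch with synthetic densitometry data, not the KODAK 1D algorithm itself:

    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian(x, amp, mu, sigma):
        return amp * np.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

    x = np.arange(60)
    true_band = gaussian(x, 4000, 30, 5)             # synthetic band profile
    profile = np.minimum(true_band, 3000)            # clipping from saturation
    mask = profile < 3000                            # fit only unsaturated pixels

    popt, _ = curve_fit(gaussian, x[mask], profile[mask], p0=[3000, 30, 4])
    print(f"recovered peak height ~ {popt[0]:.0f} (signal clipped at 3000)")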
Feature Extraction of Electronic Nose Signals Using QPSO-Based Multiple KFDA Signal Processing
Wen, Tailai; Huang, Daoyu; Lu, Kun; Deng, Changjian; Zeng, Tanyue; Yu, Song; He, Zhiyi
2018-01-01
The aim of this research was to enhance the classification accuracy of an electronic nose (E-nose) in different detecting applications. During the learning process of the E-nose to predict the types of different odors, the prediction accuracy was not quite satisfying because the raw features extracted from sensors’ responses were regarded as the input of a classifier without any feature extraction processing. Therefore, in order to obtain more useful information and improve the E-nose’s classification accuracy, in this paper, a Weighted Kernels Fisher Discriminant Analysis (WKFDA) combined with Quantum-behaved Particle Swarm Optimization (QPSO), i.e., QWKFDA, was presented to reprocess the original feature matrix. In addition, we have also compared the proposed method with quite a few previously existing ones including Principal Component Analysis (PCA), Locality Preserving Projections (LPP), Fisher Discriminant Analysis (FDA) and Kernels Fisher Discriminant Analysis (KFDA). Experimental results proved that QWKFDA is an effective feature extraction method for E-nose in predicting the types of wound infection and inflammable gases, which shared much higher classification accuracy than those of the contrast methods. PMID:29382146
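Two of the contrast methods named above, PCA and (linear) FDA, can be sketched as feature extractors ahead of a simple classifier. The snippet below uses scikit-learn on synthetic stand-in sensor data; it illustrates why a supervised projection typically beats an unsupervised one, and is not an implementation of the proposed QWKFDA.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.pipeline import make_pipeline

    rng = np.random.default_rng(1)
    n, d = 300, 16                                   # samples x sensor features
    y = rng.integers(0, 3, n)                        # three odor classes
    X = rng.normal(0, 1, (n, d)) + y[:, None] * rng.normal(0.5, 0.1, d)

    for name, extractor in [("PCA", PCA(n_components=2)),
                            ("FDA", LinearDiscriminantAnalysis(n_components=2))]:
        pipe = make_pipeline(extractor, KNeighborsClassifier(5))
        acc = cross_val_score(pipe, X, y, cv=5).mean()
        print(f"{name}: cross-validated accuracy {acc:.2f}")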
Improving wavelet denoising based on an in-depth analysis of the camera color processing
NASA Astrophysics Data System (ADS)
Seybold, Tamara; Plichta, Mathias; Stechele, Walter
2015-02-01
While denoising is an extensively studied task in signal processing research, most denoising methods are designed and evaluated using readily processed image data, e.g. the well-known Kodak data set, and the noise model is usually additive white Gaussian noise (AWGN). This kind of test data does not correspond to today's real-world image data taken with a digital camera. Using such unrealistic data to test, optimize and compare denoising algorithms may lead to incorrect parameter tuning or suboptimal choices in research on real-time camera denoising algorithms. In this paper we derive a precise analysis of the noise characteristics for the different steps in the color processing. Based on real camera noise measurements and simulation of the processing steps, we obtain a good approximation of the noise characteristics. We further show how this approximation can be used in standard wavelet denoising methods. We improve wavelet hard thresholding and bivariate thresholding based on our noise analysis results. Both the visual quality and objective quality metrics show the advantage of the proposed method. As the method is implemented using look-up tables that are calculated before the denoising step, it can be implemented with very low computational complexity and can process HD video sequences in real time on an FPGA.
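The look-up-table idea can be sketched as level-wise hard thresholding where each wavelet level gets its own noise estimate. The Python snippet below (using PyWavelets) applies a roughly 3-sigma hard threshold per level on a 1-D signal; the LUT values, wavelet, and threshold factor are assumptions, and real camera noise would make the LUT signal-dependent rather than constant.

    import numpy as np
    import pywt

    def denoise_hard(signal, sigma_lut, wavelet="db4", level=3):
        # sigma_lut maps decomposition level -> assumed noise sigma, standing
        # in for the precomputed look-up tables described in the paper
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        out = [coeffs[0]]                            # keep the approximation
        for lvl, detail in enumerate(coeffs[1:], start=1):
            thr = 3.0 * sigma_lut.get(lvl, 0.01)     # ~3-sigma hard threshold
            out.append(pywt.threshold(detail, thr, mode="hard"))
        return pywt.waverec(out, wavelet)

    t = np.linspace(0, 1, 1024)
    clean = np.sin(2 * np.pi * 5 * t)
    noisy = clean + 0.1 * np.random.randn(t.size)
    rec = denoise_hard(noisy, {1: 0.1, 2: 0.1, 3: 0.1})[:t.size]
    print("rmse:", np.sqrt(np.mean((rec - clean) ** 2)))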
Workload-Matched Adaptive Automation Support of Air Traffic Controller Information Processing Stages
NASA Technical Reports Server (NTRS)
Kaber, David B.; Prinzel, Lawrence J., III; Wright, Melanie C.; Clamann, Michael P.
2002-01-01
Adaptive automation (AA) has been explored as a solution to the problems associated with human-automation interaction in supervisory control environments. However, research has focused on the performance effects of dynamic control allocations of early-stage sensory and information acquisition functions. The present research compares the effects of AA applied across the entire range of information processing stages of human operators, such as air traffic controllers. The results provide evidence that the effectiveness of AA is dependent on the stage of task performance (human-machine system information processing) that is flexibly automated. The results suggest that humans are better able to adapt to AA when it is applied to lower-level sensory and psychomotor functions, such as information acquisition and action implementation, as compared to AA applied to cognitive (analysis and decision-making) tasks. The results also provide support for the use of AA as compared to completely manual control. These results are discussed in terms of implications for AA design for aviation.
Xia, Juan; Zhou, Junyu; Zhang, Ronggui; Jiang, Dechen; Jiang, Depeng
2018-06-04
In this communication, a gold-coated polydimethylsiloxane (PDMS) chip with cell-sized microwells was prepared through a stamping and spraying process and applied directly for high-throughput electrochemiluminescence (ECL) analysis of intracellular glucose at single cells. Compared with the previous multiple-step fabrication of photoresist-based microwells on the electrode, the preparation process is simple and offers a fresh electrode surface for higher luminescence intensity. Higher luminescence intensity, correlated with the content of intracellular glucose, was recorded from cell-retaining microwells than from the planar region between the microwells. The successful monitoring of intracellular glucose at single cells using this PDMS chip provides an alternative strategy for high-throughput single-cell analysis.
Analysis of gas absorption to a thin liquid film in the presence of a zero-order chemical reaction
NASA Technical Reports Server (NTRS)
Rajagopalan, S.; Rahman, M. M.
1995-01-01
The paper presents a detailed theoretical analysis of the process of gas absorption to a thin liquid film adjacent to a horizontal rotating disk. The film is formed by the impingement of a controlled liquid jet at the center of the disk and subsequent radial spreading of liquid along the disk. The chemical reaction between the gas and the liquid film can be expressed as a zero-order homogeneous reaction. The process was modeled by establishing equations for the conservation of mass, momentum, and species concentration and solving them analytically. A scaling analysis was used to determine the dominant transport processes. Appropriate boundary conditions were used to solve these equations and develop expressions for the local concentration of gas across the thickness of the film and distributions of film height, bulk concentration, and Sherwood number along the radius of the disk. The partial differential equation for species concentration was solved using the separation-of-variables technique along with Duhamel's theorem, and the final analytical solution was expressed using confluent hypergeometric functions. Tables of eigenvalues and eigenfunctions are presented for a number of reaction rate constants. A parametric study was performed using the Reynolds number, Ekman number, and dimensionless reaction rate as parameters. At all radial locations, the Sherwood number increased with Reynolds number (flow rate) as well as Ekman number (rate of rotation). The enhancement of mass transfer due to chemical reaction was found to be small when compared to the case of no reaction (pure absorption), but the enhancement factor was very significant when compared to pure absorption in a stagnant liquid film. The zero-order reaction processes considered in the present investigation included the absorption of oxygen in aqueous alkaline solutions of sodium dithionite and the rhodium-complex-catalyzed carbonylation of methanol. The present analytical results were compared to previous theoretical results for limiting conditions and showed very good agreement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Jian; Bearden, Mark D.; Fernandez, Carlos A.
Magnesium (Mg) has many useful applications, especially in various Mg alloys, which can decrease weight while increasing strength. To increase affordability and minimize environmental consequences, a novel catalyzed organo-metathetical (COMET) process was proposed to extract Mg from seawater, aiming to achieve a significant reduction in total energy and production cost compared with the molten-salt electrolysis method currently adopted by US Mg LLC. A process flowsheet for a reference COMET process was set up using Aspen Plus, which included five key steps: anhydrous MgCl2 production, transmetallation, dibutyl-Mg decomposition, n-BuLi regeneration, and LiCl electrolysis. The energy and production costs and CO2 emissions were estimated from the Aspen model using the Aspen economic analyzer. Our results showed that it is possible to produce Mg from seawater at a production cost of $2.0/kg-Mg while consuming about 35.3 kWh/kg-Mg and releasing 7.0 kg CO2/kg-Mg. A simplified US Mg manufacturing process was also generated using Aspen, and its cost and emission results were estimated for comparison. Under our simulation conditions, the reference COMET process maintains a comparable CO2 emission rate and can save about 40% in production cost and about 15% in energy compared to the simplified US Mg process.
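The headline figures above imply the baseline values by simple arithmetic. A short Python check using only the reported numbers:

    # reported COMET reference-case figures
    comet_cost = 2.0                  # $/kg Mg
    comet_energy = 35.3               # kWh/kg Mg
    comet_co2 = 7.0                   # kg CO2 per kg Mg

    # the paper reports ~40% cost and ~15% energy savings vs. the
    # simplified US Mg baseline, which implies roughly:
    baseline_cost = comet_cost / (1 - 0.40)
    baseline_energy = comet_energy / (1 - 0.15)
    print(f"implied baseline: ${baseline_cost:.2f}/kg, {baseline_energy:.1f} kWh/kg")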
Collaboration leads to enhanced curriculum.
Valerius, J; Mohan, V; Doctor, D; Hersh, W
2015-01-01
In 2007, we initiated a health information management (HIM) track of our biomedical informatics graduate program, and subsequent ongoing program assessment revealed a confluence of topics and courses within the HIM and clinical informatics (CI) tracks. We completed a thorough comparative analysis of competencies derived from AMIA, AHIMA, and CAHIIM. Coupled with the need to streamline course offerings, the process described in this paper allowed new opportunities for faculty collaboration, resulted in the creation of a model assessment for best practice in courses, and led to new avenues of growth within the program. The objective of the case study is to provide others in the informatics educational community with a model for curriculum analysis in order to improve the quality of student learning. We describe a case study in which an academic informatics program realigned its course offerings to better reflect the HIM of today and prepare for the challenges of the future. Visionary leadership and intra-departmental self-analysis, with alignment of the curriculum through a defined mapping process, reduced overlap within the CI and HIM tracks. Teaching within courses was optimized through the work of core faculty collaboration. The analysis of the curriculum resulted in a reduction of overlap within course curricula, which allowed additional and new content to be added to existing courses. Leadership fostered an environment where top-down as well as bottom-up collaborative assessment activities resulted in a model to consolidate learning and reduce unnecessary duplication within courses. A focus on curriculum integration, emphasis on course alignment and strategic consolidation of course content raised the quality of informatics education provided to students. Faculty synergy was an essential component of this redesign process. The continuous quality improvement strategy included an ongoing alignment of curriculum and competencies through a comparative analysis approach. Through these efforts, new innovation was possible.
Neural correlates of text-based emoticons: a preliminary fMRI study.
Kim, Ko Woon; Lee, Sang Won; Choi, Jeewook; Kim, Tae Min; Jeong, Bumseok
2016-08-01
Like nonverbal cues in oral interactions, text-based emoticons, which are textual portrayals of a writer's facial expressions, are commonly used in electronic device-mediated communication. Little is known, however, about how text-based emoticons are processed in the human brain. In this study, we investigated with fMRI whether text-based emoticons are processed as facial expressions. During the fMRI scan, subjects were asked to respond by pressing a button, indicating whether text-based emoticons represented positive or negative emotions. Voxel-wise analyses were performed to compare responses, contrasting emotional versus scrambled emoticons and emoticons expressing different emotions. To explore processing strategies for text-based emoticons, brain activity in the bilateral occipital and fusiform face areas was compared. In the voxel-wise analysis, both emotional and scrambled emoticons were processed mainly in the bilateral fusiform gyri, the inferior division of the lateral occipital cortex, the inferior frontal gyri, the dorsolateral prefrontal cortex (DLPFC), the dorsal anterior cingulate cortex (dACC), and the parietal cortex. In a percent signal change analysis, the right occipital and fusiform face areas showed significantly higher activation than the left ones. In comparisons among emoticons, the sad emoticon showed a significant BOLD signal decrease in the dACC, the left AIC, the bilateral thalamus, and the precuneus compared with the other conditions. The results of this study imply that people recognize text-based emoticons as pictures representing facial expressions. Even though text-based emoticons carry emotional meaning, they were not associated with amygdala activity, whereas previous studies using emotional stimuli documented amygdala activation.