ERIC Educational Resources Information Center
Godor, Brian P.
2016-01-01
Student learning approaches research has been built upon the notions of deep and surface learning. Despite its status as part of the educational research canon, the dichotomy of deep/surface has been critiqued as constraining the debate surrounding student learning. Additionally, issues of content validity have been expressed concerning…
Surface laser marking optimization using an experimental design approach
NASA Astrophysics Data System (ADS)
Brihmat-Hamadi, F.; Amara, E. H.; Lavisse, L.; Jouvard, J. M.; Cicala, E.; Kellou, H.
2017-04-01
Laser surface marking is performed on a titanium substrate using a pulsed frequency-doubled Nd:YAG laser (λ = 532 nm, τ pulse = 5 ns) to process the substrate surface under normal atmospheric conditions. The aim of the work is to investigate, following experimental and statistical approaches, the correlation between the process parameters and the response variables (output), using Design of Experiment (DOE) methods: Taguchi methodology and response surface methodology (RSM). A design is first created using the MINITAB program, and the laser marking process is then performed according to the planned design. The response variables, surface roughness and surface reflectance, were measured for each sample and incorporated into the design matrix. The results are then analyzed, and the RSM model is developed and verified for predicting the process output for a given set of process parameter values. The analysis shows that the laser beam scanning speed is the most influential operating factor, followed by the laser pumping intensity during marking, while the other factors show complex influences on the objective functions.
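To make the RSM step concrete, the following is a minimal Python sketch, not the authors' MINITAB workflow: it fits a full quadratic response surface for roughness against two coded factors (scan speed and pumping intensity) from a small hypothetical DOE table and predicts the response at an untested setting.

    # A minimal sketch; the factor levels and roughness values below are hypothetical.
    import numpy as np

    # Coded factor levels (-1, 0, +1) and measured roughness (um)
    speed     = np.array([-1, -1, 1, 1, 0, 0, -1, 1, 0])
    intensity = np.array([-1, 1, -1, 1, 0, -1, 0, 0, 1])
    roughness = np.array([0.82, 0.95, 0.55, 0.70, 0.68, 0.74, 0.88, 0.60, 0.81])

    # Design matrix for a full quadratic model: 1, x1, x2, x1*x2, x1^2, x2^2
    X = np.column_stack([np.ones_like(speed), speed, intensity,
                         speed * intensity, speed**2, intensity**2])
    coef, *_ = np.linalg.lstsq(X, roughness, rcond=None)
    print("fitted coefficients:", coef)

    # Predicted roughness at an untested setting (coded units)
    x1, x2 = 0.5, -0.3
    y_hat = coef @ np.array([1, x1, x2, x1 * x2, x1**2, x2**2])
    print("predicted roughness:", y_hat)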
Analysis of Turbulent Boundary-Layer over Rough Surfaces with Application to Projectile Aerodynamics
1988-12-01
V. APPLICATION IN COMPONENT BUILD-UP METHODOLOGIES ... 1. COMPONENT BUILD-UP IN DRAG ... dimensional roughness. II. CLASSIFICATION OF PREDICTION METHODS. Prediction methods can be classified into two main approaches: 1) correlation methodologies ... data are available. V. APPLICATION IN COMPONENT BUILD-UP METHODOLOGIES. 1. COMPONENT BUILD-UP IN DRAG. The new correlation can be used for an engineering
Metallic superhydrophobic surfaces via thermal sensitization
NASA Astrophysics Data System (ADS)
Vahabi, Hamed; Wang, Wei; Popat, Ketul C.; Kwon, Gibum; Holland, Troy B.; Kota, Arun K.
2017-06-01
Superhydrophobic surfaces (i.e., surfaces extremely repellent to water) allow water droplets to bead up and easily roll off from the surface. While a few methods have been developed to fabricate metallic superhydrophobic surfaces, these methods typically involve expensive equipment, environmental hazards, or multi-step processes. In this work, we developed a universal, scalable, solvent-free, one-step methodology based on thermal sensitization to create appropriate surface texture and fabricate metallic superhydrophobic surfaces. To demonstrate the feasibility of our methodology and elucidate the underlying mechanism, we fabricated superhydrophobic surfaces using ferritic (430) and austenitic (316) stainless steels (representative alloys) with roll off angles as low as 4° and 7°, respectively. We envision that our approach will enable the fabrication of superhydrophobic metal alloys for a wide range of civilian and military applications.
NASA Technical Reports Server (NTRS)
Townsend, J.; Meyers, C.; Ortega, R.; Peck, J.; Rheinfurth, M.; Weinstock, B.
1993-01-01
Probabilistic structural analyses and design methods are steadily gaining acceptance within the aerospace industry. The safety factor approach to design has long been the industry standard, and it is believed by many to be overly conservative and thus costly. A probabilistic approach to design may offer substantial cost savings. This report summarizes several probabilistic approaches: the probabilistic failure analysis (PFA) methodology developed by the Jet Propulsion Laboratory, fast probability integration (FPI) methods, the NESSUS finite element code, and response surface methods. Example problems are provided to help identify the advantages and disadvantages of each method.
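As a minimal illustration of the probabilistic approach summarized above (not the PFA, FPI or NESSUS implementations), the following Python sketch estimates a probability of failure by Monte Carlo sampling of a simple, hypothetical strength-minus-stress limit state.

    # A minimal sketch; the distributions below are hypothetical, not from the report.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000_000
    strength = rng.normal(400.0, 25.0, n)   # MPa, hypothetical material strength
    stress   = rng.normal(300.0, 30.0, n)   # MPa, hypothetical applied stress

    g = strength - stress                   # limit state: failure when g < 0
    pf = np.mean(g < 0.0)
    print(f"estimated probability of failure: {pf:.2e}")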
Analysis of atomic force microscopy data for surface characterization using fuzzy logic
DOE Office of Scientific and Technical Information (OSTI.GOV)
Al-Mousa, Amjed, E-mail: aalmousa@vt.edu; Niemann, Darrell L.; Niemann, Devin J.
2011-07-15
In this paper we present a methodology to characterize surface nanostructures of thin films. The methodology identifies and isolates nanostructures using Atomic Force Microscopy (AFM) data and extracts quantitative information, such as their size and shape. The fuzzy logic based methodology relies on a Fuzzy Inference Engine (FIE) to classify the data points as being top, bottom, uphill, or downhill. The resulting data sets are then further processed to extract quantitative information about the nanostructures. In the present work we introduce a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and present an omni-directional search technique to improve the structural recognition accuracy. In order to demonstrate the effectiveness of our approach we present a case study which uses our approach to quantitatively identify particle sizes of two specimens, each with a unique gold nanoparticle size distribution. Research Highlights: A fuzzy logic analysis technique capable of characterizing AFM images of thin films; the technique is applicable to different surfaces regardless of their densities; the fuzzy logic technique does not require manual adjustment of the algorithm parameters; the technique can quantitatively capture differences between surfaces; this technique yields more realistic structure boundaries compared to other methods.
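The following Python sketch illustrates, under hypothetical membership functions and rules, the kind of fuzzy classification of surface points (top, bottom, uphill, downhill) described above; it is not the authors' Fuzzy Inference Engine.

    # A minimal, hypothetical sketch of rule-based fuzzy classification of a surface point.
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                     (c - x) / (c - b + 1e-12)), 0.0)

    def classify(height, slope):
        # height is assumed normalized to [0, 1], slope to [-1, 1] (illustrative only)
        mu = {
            "top":      tri(height, 0.6, 1.0, 1.4) * tri(abs(slope), -0.4, 0.0, 0.4),
            "bottom":   tri(height, -0.4, 0.0, 0.4) * tri(abs(slope), -0.4, 0.0, 0.4),
            "uphill":   tri(slope, 0.2, 1.0, 1.8),
            "downhill": tri(-slope, 0.2, 1.0, 1.8),
        }
        return max(mu, key=mu.get)   # defuzzify by taking the strongest label

    print(classify(height=0.9, slope=0.05))   # -> "top"
    print(classify(height=0.4, slope=0.7))    # -> "uphill"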
NASA Astrophysics Data System (ADS)
Asyirah, B. N.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.
2017-09-01
In manufacturing a variety of parts, plastic injection moulding is widely used. The injection moulding process parameters play an important role in the product's quality and productivity. Many approaches to minimising warpage and shrinkage, such as artificial neural networks, genetic algorithms, glowworm swarm optimisation and hybrid approaches, have been addressed. In this paper, a systematic methodology for determining warpage and shrinkage in the injection moulding process, especially for thin-shell plastic parts, is presented. To identify the effects of the machining parameters on the warpage and shrinkage values, response surface methodology is applied. In this study, a part of an electronic night lamp is chosen as the model. Firstly, an experimental design was used to determine the injection parameters on warpage for different thickness values. The software used to analyse the warpage is Autodesk Moldflow Insight (AMI) 2012.
Entry trajectory and atmosphere reconstruction methodologies for the Mars Exploration Rover mission
NASA Astrophysics Data System (ADS)
Desai, Prasun N.; Blanchard, Robert C.; Powell, Richard W.
2004-02-01
The Mars Exploration Rover (MER) mission will land two landers on the surface of Mars, arriving in January 2004. Both landers will deliver the rovers to the surface by decelerating with the aid of an aeroshell, a supersonic parachute, retro-rockets, and air bags for safely landing on the surface. The reconstruction of the MER descent trajectory and atmosphere profile will be performed for all the phases from hypersonic flight through landing. A description of multiple methodologies for the flight reconstruction is presented from simple parameter identification methods through a statistical Kalman filter approach.
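As a minimal, generic illustration of the Kalman-filter reconstruction idea mentioned above (far simpler than an actual entry-trajectory reconstruction), the following Python sketch runs a 1-D constant-velocity filter on hypothetical noisy altitude measurements.

    # A minimal sketch; dynamics, noise levels and data are hypothetical.
    import numpy as np

    dt = 1.0
    F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition (altitude, rate)
    H = np.array([[1.0, 0.0]])                 # only altitude is measured
    Q = np.diag([0.01, 0.1])                   # process noise covariance
    R = np.array([[25.0]])                     # measurement noise covariance

    x = np.array([10_000.0, -50.0])            # initial state guess
    P = np.diag([100.0, 10.0])

    rng = np.random.default_rng(1)
    truth = 10_000.0 - 55.0 * np.arange(20)
    z_all = truth + rng.normal(0.0, 5.0, truth.size)

    for z in z_all:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update
        y = z - H @ x                          # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P

    print("final estimated altitude and rate:", x)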
Methodological Approaches to Online Scoring of Essays.
ERIC Educational Resources Information Center
Chung, Gregory K. W. K.; O'Neil, Harold F., Jr.
This report examines the feasibility of scoring essays using computer-based techniques. Essays have been incorporated into many standardized testing programs. Issues of validity and reliability must be addressed before automated scoring approaches can be fully deployed. Two approaches that have been used to classify documents, surface- and word-based…
On the merging of optical and SAR satellite imagery for surface water mapping applications
NASA Astrophysics Data System (ADS)
Markert, Kel N.; Chishtie, Farrukh; Anderson, Eric R.; Saah, David; Griffin, Robert E.
2018-06-01
Optical and Synthetic Aperture Radar (SAR) imagery from satellite platforms provide a means to discretely map surface water; however, the application of the two data sources in tandem has been inhibited by inconsistent data availability, the distinct physical properties that optical and SAR instruments sense, and dissimilar data delivery platforms. In this paper, we describe a preliminary methodology for merging optical and SAR data into a common data space. We apply our approach over a portion of the Mekong Basin, a region with highly variable surface water cover and persistent cloud cover, for surface water applications requiring dense time series analysis. The methods include the derivation of a representative index from both sensors that transforms data from disparate physical units (reflectance and backscatter) to a comparable dimensionless space applying a consistent water extraction approach to both datasets. The merging of optical and SAR data allows for increased observations in cloud prone regions that can be used to gain additional insight into surface water dynamics or flood mapping applications. This preliminary methodology shows promise for a common optical-SAR water extraction; however, data ranges and thresholding values can vary depending on data source, yielding classification errors in the resulting surface water maps. We discuss some potential future approaches to address these inconsistencies.
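A minimal Python sketch of the general idea, with hypothetical index names, arrays and threshold rather than the paper's exact transformation: z-score an optical water index and (sign-flipped) SAR backscatter into a comparable dimensionless space, then apply one common water threshold to both.

    # A minimal sketch; the synthetic arrays and the cutoff are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    mndwi = rng.normal(-0.2, 0.3, (100, 100))      # optical water index (unitless)
    sigma0_db = rng.normal(-12.0, 4.0, (100, 100)) # SAR backscatter (dB)

    def zscore(a):
        return (a - np.nanmean(a)) / np.nanstd(a)

    # Water tends to raise the optical index and lower backscatter, so flip the SAR sign
    z_opt = zscore(mndwi)
    z_sar = zscore(-sigma0_db)

    threshold = 1.0                                 # hypothetical common cutoff
    water_opt = z_opt > threshold
    water_sar = z_sar > threshold
    print("optical water fraction:", water_opt.mean())
    print("SAR water fraction:    ", water_sar.mean())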
Fabrication of multi-functional silicon surface by direct laser writing
NASA Astrophysics Data System (ADS)
Verma, Ashwani Kumar; Soni, R. K.
2018-05-01
We present a simple, quick and one-step methodology based on nanosecond laser direct writing for the fabrication of micro-nanostructures on a silicon surface. The fabricated surfaces suppress optical reflection, through multiple reflections due to the light-trapping effect, to a much lower value than a polished silicon surface. These textured surfaces offer high enhancement ability after gold nanoparticle deposition and were then explored for Surface Enhanced Raman Scattering (SERS) for specific molecular detection. The effect of the laser scanning line interval on optical reflection and SERS signal enhancement ability was also investigated. Our results indicate that low-optical-reflection substrates exhibit uniform SERS enhancement with an enhancement factor of the order of 10^6. Furthermore, this methodology provides an alternative approach for cost-effective large-area fabrication with good control over feature size.
Shear-wave velocity profiling according to three alternative approaches: A comparative case study
NASA Astrophysics Data System (ADS)
Dal Moro, G.; Keller, L.; Al-Arifi, N. S.; Moustafa, S. S. R.
2016-11-01
The paper intends to compare three different methodologies which can be used to analyze surface-wave propagation, thus eventually obtaining the vertical shear-wave velocity (VS) profile. The three presented methods (currently still quite unconventional) are characterized by different field procedures and data processing. The first methodology is a sort of evolution of the classical Multi-channel Analysis of Surface Waves (MASW) here accomplished by jointly considering Rayleigh and Love waves (analyzed according to the Full Velocity Spectrum approach) and the Horizontal-to-Vertical Spectral Ratio (HVSR). The second method is based on the joint analysis of the HVSR curve together with the Rayleigh-wave dispersion determined via Miniature Array Analysis of Microtremors (MAAM), a passive methodology that relies on a small number (4 to 6) of vertical geophones deployed along a small circle (for the common near-surface application the radius usually ranges from 0.6 to 5 m). Finally, the third considered approach is based on the active data acquired by a single 3-component geophone and relies on the joint inversion of the group-velocity spectra of the radial and vertical components of the Rayleigh waves, together with the Radial-to-Vertical Spectral Ratio (RVSR). The results of the analyses performed while considering these approaches (completely different both in terms of field procedures and data analysis) appear extremely consistent thus mutually validating their performances. Pros and cons of each approach are summarized both in terms of computational aspects as well as with respect to practical considerations regarding the specific character of the pertinent field procedures.
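The following Python sketch shows, on synthetic white noise and with a hypothetical window setup, the basic HVSR computation referred to above: the ratio of the averaged horizontal amplitude spectrum to the vertical one.

    # A minimal, generic HVSR sketch; data and parameters are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    fs, n = 200.0, 200 * 60                 # 200 Hz, one minute of noise
    z  = rng.normal(size=n)                 # vertical component
    ns = rng.normal(size=n)                 # north-south component
    ew = rng.normal(size=n)                 # east-west component

    freqs = np.fft.rfftfreq(n, d=1 / fs)
    Z = np.abs(np.fft.rfft(z))
    H = np.sqrt(0.5 * (np.abs(np.fft.rfft(ns))**2 + np.abs(np.fft.rfft(ew))**2))
    hvsr = H / Z

    band = (freqs > 0.5) & (freqs < 20.0)
    print("frequency of the HVSR peak (Hz): %.2f" % freqs[band][np.argmax(hvsr[band])])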
NASA Astrophysics Data System (ADS)
Riasi, S.; Huang, G.; Montemagno, C.; Yeghiazarian, L.
2013-12-01
Micro-scale modeling of multiphase flow in porous media is critical to characterize porous materials. Several modeling techniques have been implemented to date, but none can be used as a general strategy for all porous media applications due to challenges presented by non-smooth high-curvature solid surfaces, and by a wide range of pore sizes and porosities. Finite approaches like the finite volume method require a high quality, problem-dependent mesh, while particle-based approaches like the lattice Boltzmann require too many particles to achieve a stable meaningful solution. Both come at a large computational cost. Other methods such as pore network modeling (PNM) have been developed to accelerate the solution process by simplifying the solution domain, but so far a unique and straightforward methodology to implement PNM is lacking. We have developed a general, stable and fast methodology to model multi-phase fluid flow in porous materials, irrespective of their porosity and solid phase topology. We have applied this methodology to highly porous fibrous materials in which void spaces are not distinctly separated, and where simplifying the geometry into a network of pore bodies and throats, as in PNM, does not result in a topology-consistent network. To this end, we have reduced the complexity of the 3-D void space geometry by working with its medial surface. We have used a non-iterative fast medial surface finder algorithm to determine a voxel-wide medial surface of the void space, and then solved the quasi-static drainage and imbibition on the resulting domain. The medial surface accurately represents the topology of the porous structure including corners, irregular cross sections, etc. This methodology is capable of capturing corner menisci and the snap-off mechanism numerically. It also allows for calculation of pore size distribution, permeability and capillary pressure-saturation-specific interfacial area surface of the porous structure. To show the capability of this method to numerically estimate the capillary pressure in irregular cross sections, we compared our results with analytical solutions available for capillary tubes with non-circular cross sections. We also validated this approach by implementing it on well-known benchmark problems such as a bundle of cylinders and packed spheres.
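As a 2-D analogue of the medial-surface idea described above (the paper itself uses a non-iterative 3-D voxel algorithm), the following Python sketch extracts the medial axis of a synthetic binary void space with scikit-image and reports the inscribed radii along it.

    # A minimal 2-D sketch; the synthetic geometry is hypothetical.
    import numpy as np
    from skimage.morphology import medial_axis

    # Synthetic void space: a tilted slab plus a circular pore
    yy, xx = np.mgrid[0:128, 0:128]
    void = (np.abs(yy - 0.5 * xx - 20) < 12) | ((xx - 90) ** 2 + (yy - 90) ** 2 < 20 ** 2)

    skeleton, distance = medial_axis(void, return_distance=True)
    print("void pixels:", int(void.sum()), "medial-axis pixels:", int(skeleton.sum()))
    print("max inscribed radius along the medial axis:", distance[skeleton].max())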
Applying the Inverse Maximum Ratio- Λ to 3-Dimensional Surfaces
NASA Astrophysics Data System (ADS)
Chandran, Avinash; Brown, Derek; DiPietro, Loretta; Danoff, Jerome
2016-06-01
The question of contour uniformity on a three-dimensional surface arises in various fields of study. Although many questions related to surface uniformity exist, there is a lack of standard methodology to quantify uniformity of a three-dimensional surface. Therefore, a sound mathematical approach to this question could prove to be useful in various areas of study. The purpose of this paper is to expand the previously validated mathematical concept of the inverse maximum ratio over a three-dimensional surface and assess its robustness. We will describe the mathematical approach used to accomplish this and use several simulated examples to validate the metric.
Infrared Algorithm Development for Ocean Observations with EOS/MODIS
NASA Technical Reports Server (NTRS)
Brown, Otis B.
1997-01-01
Efforts continue under this contract to develop algorithms for the computation of sea surface temperature (SST) from MODIS infrared measurements. This effort includes radiative transfer modeling, comparison of in situ and satellite observations, development and evaluation of processing and networking methodologies for algorithm computation and data accession, evaluation of surface validation approaches for IR radiances, development of experimental instrumentation, and participation in MODIS (project) related activities. Activities in this contract period have focused on radiative transfer modeling, evaluation of atmospheric correction methodologies, field campaigns, analysis of field data, and participation in MODIS meetings.
Scratching beneath the Surface of Communities of (Mal)practice
ERIC Educational Resources Information Center
Pemberton, Jon; Mavin, Sharon; Stalker, Brenda
2007-01-01
Purpose: This paper seeks to surface less positive aspects of communities of practice (CoPs), whether emergent or organisationally managed, grounded in political-power interactions. Examples are provided from the authors' experiences of a research-based CoP within UK higher education. Design/methodology/approach: The paper is primarily…
Filatov, B N; Britanov, N G; Tochilkina, L P; Zhukov, V E; Maslennikov, A A; Ignatenko, M N; Volchek, K
2011-01-01
The threat of industrial chemical accidents and terrorist attacks requires the development of safety regulations for the cleanup of contaminated surfaces. This paper presents principles and a methodology for the development of a new toxicological parameter, "relative value unit" (RVU) as the primary decontamination standard.
Nanoengineered Plasmonic Hybrid Systems for Bio-nanotechnology
NASA Astrophysics Data System (ADS)
Leong, Kirsty
Plasmonic hybrid systems are fabricated using a combination of lithography and layer-by-layer directed self-assembly approaches to serve as highly sensitive nanosensing devices. This layer-by-layer directed self-assembly approach is utilized as a hybrid methodology to control the organization of quantum dots (QDs), nanoparticles, and biomolecules onto inorganic nanostructures with site-specific attachment and functionality. Here, surface plasmon-enhanced nanoarrays are fabricated where the photoluminescence of quantum dots and conjugated polymer nanoarrays are studied. This study was performed by tuning the localized surface plasmon resonance and the distance between the emitter and the metal surface using genetically engineered polypeptides as binding agents and biotin-streptavidin binding as linker molecules. In addition, these nanoarrays were also chemically modified to support the immobilization and label-free detection of DNA using surface enhanced Raman scattering. The surface of the nanoarrays was chemically modified using an acridine containing molecule which can act as an intercalating agent for DNA. The self-assembled monolayer (SAM) showed the ability to immobilize and intercalate DNA onto the surface. This SAM system using surface enhanced Raman scattering (SERS) serves as a highly sensitive methodology for the immobilization and label-free detection of DNA applicable into a wide range of bio-diagnostic platforms. Other micropatterned arrays were also fabricated using a combination of soft lithography and surface engineering. Selective single cell patterning and adhesion was achieved through chemical modifications and surface engineering of poly(dimethylsiloxane) surface. The surface of each microwell was functionally engineered with a SAM which contained an aldehyde terminated fused-ring aromatic thiolated molecule. Cells were found to be attracted and adherent to the chemically modified microwells. By combining soft lithography and surface engineering, a simple methodology produced single cell arrays on biocompatible substrates. Thus the design of plasmonic devices relies heavily on the nature of the plasmonic interactions between nanoparticles in the devices which can potentially be fabricated into lab-on-a-chip devices for multiplex sensing capabilities.
NASA Astrophysics Data System (ADS)
Al-Mousa, Amjed A.
Thin films are essential constituents of modern electronic devices and have a multitude of applications in such devices. The impact of the surface morphology of thin films on the characteristics of the devices in which these films are used has generated substantial attention to advanced film characterization techniques. In this work, we present a new approach to characterize surface nanostructures of thin films by focusing on isolating nanostructures and extracting quantitative information, such as the shape and size of the structures. This methodology is applicable to any Scanning Probe Microscopy (SPM) data, such as the Atomic Force Microscopy (AFM) data presented here. The methodology starts by compensating the AFM data for some specific classes of measurement artifacts. After that, the methodology employs two distinct techniques. The first, which we call the overlay technique, proceeds by systematically processing the raster data that constitute the scanning probe image in both the vertical and horizontal directions. It then proceeds by classifying points in each direction separately. Finally, the results from both the horizontal and the vertical subsets are overlaid, where a final decision on each surface point is made. The second technique, based on fuzzy logic, relies on a Fuzzy Inference Engine (FIE) to classify the surface points. Once classified, these points are clustered into surface structures. The latter technique also includes a mechanism which can consistently distinguish crowded surfaces from those with sparsely distributed structures and then tune the fuzzy system uniquely for that surface. Both techniques have been applied to characterize organic semiconductor thin films of pentacene on different substrates. We also present a case study to demonstrate the effectiveness of our methodology in quantitatively identifying particle sizes of two specimens of gold nanoparticles of different nominal dimensions dispersed on a mica surface. A comparison with other techniques, such as thresholding, watershed and edge detection, is presented next. Finally, we present a systematic study of the fuzzy logic technique by experimenting with synthetic data. These results are discussed and compared along with the challenges of the two techniques.
Local deformation for soft tissue simulation
Omar, Nadzeri; Zhong, Yongmin; Smith, Julian; Gu, Chengfan
2016-01-01
This paper presents a new methodology to localize the deformation range to improve the computational efficiency of soft tissue simulation. This methodology identifies the local deformation range from the stress distribution in soft tissues due to an external force. A stress estimation method based on elastic theory is used to estimate the stress in soft tissues according to depth from the contact surface. The proposed methodology can be used with both mass-spring and finite element modeling approaches for soft tissue deformation. Experimental results show that the proposed methodology can improve the computational efficiency while maintaining the modeling realism. PMID:27286482
Industrial inspection of specular surfaces using a new calibration procedure
NASA Astrophysics Data System (ADS)
Aswendt, Petra; Hofling, Roland; Gartner, Soren
2005-06-01
The methodology of phase-encoded reflection measurement has become a valuable tool for the industrial inspection of components with glossy surfaces. The measuring principle provides outstanding sensitivity to tiny variations of surface curvature, so that sub-micron waviness and flaws are reliably detected. Quantitative curvature measurements can be obtained from a simple approach if the object is almost flat. 3D objects with a high aspect ratio require more effort to determine both the coordinates and the normal direction of a surface point unambiguously. Stereoscopic solutions have been reported using more than one camera for a certain surface area. This paper describes the combined double camera steady surface approach (DCSS), which is well suited for implementation in industrial testing stations.
Speciation of adsorbates on surface of solids by infrared spectroscopy and chemometrics.
Vilmin, Franck; Bazin, Philippe; Thibault-Starzyk, Frédéric; Travert, Arnaud
2015-09-03
Speciation, i.e. identification and quantification, of surface species on heterogeneous surfaces by infrared spectroscopy is important in many fields but remains a challenging task when facing strongly overlapping spectra of multiple adspecies. Here, we propose a new methodology combining state-of-the-art instrumental developments for quantitative infrared spectroscopy of adspecies with chemometric tools, mainly a novel data processing algorithm called SORB-MCR (SOft modeling by Recursive Based-Multivariate Curve Resolution) and multivariate calibration. After formal transposition of the general linear mixture model to adsorption spectral data, the main issues, i.e. the validity of the Beer-Lambert law and rank deficiency problems, are theoretically discussed. Then, the methodology is demonstrated through application to two case studies, each of them characterized by a specific type of rank deficiency: (i) speciation of physisorbed water species over a hydrated silica surface, and (ii) speciation (chemisorption and physisorption) of a silane probe molecule over a dehydrated silica surface. In both cases, we demonstrate the relevance of this approach, which leads to a thorough surface speciation based on comprehensive and fully interpretable multivariate quantitative models. Limitations and drawbacks of the methodology are also underlined.
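The following Python sketch is a generic, hypothetical stand-in for the curve-resolution step (it is not SORB-MCR): it factorizes a synthetic spectral data matrix D into concentration profiles and pure-component spectra with non-negative matrix factorization.

    # A minimal sketch of multivariate curve resolution on synthetic data, D ~ C @ S.
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(0)
    wn = np.linspace(1000, 2000, 300)                     # wavenumber axis (cm-1)
    s1 = np.exp(-0.5 * ((wn - 1250) / 30) ** 2)           # pure component spectra
    s2 = np.exp(-0.5 * ((wn - 1600) / 45) ** 2)
    S_true = np.vstack([s1, s2])

    C_true = np.abs(rng.random((40, 2)))                  # 40 mixture spectra
    D = C_true @ S_true + 0.01 * rng.random((40, 300))    # data matrix with noise

    model = NMF(n_components=2, init="nndsvda", max_iter=2000)
    C_est = model.fit_transform(D)                        # estimated concentration profiles
    S_est = model.components_                             # estimated pure spectra
    print("reconstruction error:", model.reconstruction_err_)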
Learning Approaches of Undergraduate Computer Technology Students: Strategies for Improvement
ERIC Educational Resources Information Center
Malakolunthu, Suseela; Joshua, Alice
2012-01-01
Purpose: In recent times, the quality of graduates and their performance has been questioned. Students' performance is an indicator of the kind of approach (deep or surface) they take. This study investigates the kind of approach undergraduates take in their learning processes. Methodology: This quantitative survey used the Revised Two-Factor Study Process…
Jacob, Samuel; Banerjee, Rintu
2016-08-01
A novel approach to overcoming the acidification problem has been attempted in the present study by codigesting industrial potato waste (PW) with Pistia stratiotes (PS, an aquatic weed). The effectiveness of codigestion of the weed and PW was tested in an equal (1:1) proportion by weight at a substrate concentration of 5 g total solid (TS)/L (2.5 g PW + 2.5 g PS), which resulted in an enhancement of the methane yield by 76.45% compared to monodigestion of PW, with a positive synergistic effect. Optimization of process parameters was conducted using central composite design (CCD) based response surface methodology (RSM) and an artificial neural network (ANN) coupled genetic algorithm (GA) model. Upon comparison of these two optimization techniques, the ANN-GA model obtained through the feed-forward back-propagation methodology was found to be efficient and yielded 447.4 ± 21.43 L CH4/kg VS fed (0.279 g CH4/kg CODvs), which is 6% higher than that obtained with the CCD-RSM based approach.
Goal-oriented rectification of camera-based document images.
Stamatopoulos, Nikolaos; Gatos, Basilis; Pratikakis, Ioannis; Perantonis, Stavros J
2011-04-01
Document digitization with either flatbed scanners or camera-based systems results in document images which often suffer from warping and perspective distortions that deteriorate the performance of current OCR approaches. In this paper, we present a goal-oriented rectification methodology to compensate for undesirable document image distortions, aiming to improve the OCR result. Our approach relies upon a coarse-to-fine strategy. First, a coarse rectification is accomplished with the aid of a computationally low-cost transformation which addresses the projection of a curved surface to a 2-D rectangular area. The projection of the curved surface on the plane is guided only by the appearance of the textual content in the document image, while incorporating a transformation which does not depend on specific model primitives or camera setup parameters. Second, pose normalization is applied at the word level, aiming to restore all the local distortions of the document image. Experimental results on various document images with a variety of distortions demonstrate the robustness and effectiveness of the proposed rectification methodology, using a consistent evaluation methodology that takes into account OCR accuracy and a newly introduced measure based on a semi-automatic procedure.
The derivation of scenic utility functions and surfaces and their role in landscape management
John W. Hamilton; Gregory J. Buhyoff; J. Douglas Wellman
1979-01-01
This paper outlines a methodological approach for determining relevant physical landscape features which people use in formulating judgments about scenic utility. This information, coupled with either empirically derived or rationally stipulated regression techniques, may be used to produce scenic utility functions and surfaces. These functions can provide a means for...
NASA Astrophysics Data System (ADS)
Sedano, Fernando; Kempeneers, Pieter; Strobl, Peter; Kucera, Jan; Vogt, Peter; Seebach, Lucia; San-Miguel-Ayanz, Jesús
2011-09-01
This study presents a novel cloud masking approach for high-resolution remote sensing images in the context of land cover mapping. As an advantage over traditional methods, the approach does not rely on thermal bands, and it is applicable to images from most high-resolution earth observation remote sensing sensors. The methodology couples pixel-based seed identification and object-based region growing. The seed identification stage relies on pixel value comparison between high-resolution images and cloud-free composites at lower spatial resolution from almost simultaneously acquired dates. The methodology was tested taking SPOT4-HRVIR, SPOT5-HRG and IRS-LISS III as high-resolution images and cloud-free MODIS composites as reference images. The selected scenes included a wide range of cloud types and surface features. The resulting cloud masks were evaluated through visual comparison. They were also compared with ad-hoc independently generated cloud masks and with the automatic cloud cover assessment algorithm (ACCA). In general, the results showed an agreement in detected clouds higher than 95% for clouds larger than 50 ha. The approach produced consistent results, identifying and mapping clouds of different types and sizes over various land surfaces including natural vegetation, agricultural land, built-up areas, water bodies and snow.
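A minimal Python sketch of the seed-plus-region-growing idea, with hypothetical thresholds and synthetic arrays in place of real SPOT/IRS and MODIS data: seed pixels where the scene is much brighter than the cloud-free composite, then keep connected candidate regions that contain at least one seed.

    # A minimal sketch; reflectance values and thresholds are hypothetical.
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(0)
    composite = rng.normal(0.10, 0.02, (200, 200))        # cloud-free reference reflectance
    scene = composite + rng.normal(0.0, 0.01, (200, 200))
    scene[60:120, 80:150] += 0.35                         # synthetic cloud

    diff = scene - composite
    seeds = diff > 0.30                                   # strict seed test
    candidates = diff > 0.15                              # looser growth test

    # Keep only candidate regions that contain at least one seed
    labels, n = ndimage.label(candidates)
    seeded = np.unique(labels[seeds])
    cloud_mask = np.isin(labels, seeded[seeded > 0])
    print("cloud fraction:", cloud_mask.mean())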
Surface passivation of semiconducting oxides by self-assembled nanoparticles
Park, Dae-Sung; Wang, Haiyuan; Vasheghani Farahani, Sepehr K.; Walker, Marc; Bhatnagar, Akash; Seghier, Djelloul; Choi, Chel-Jong; Kang, Jie-Hun; McConville, Chris F.
2016-01-01
Physiochemical interactions which occur at the surfaces of oxide materials can significantly impair their performance in many device applications. As a result, surface passivation of oxide materials has been attempted via several deposition methods and with a number of different inert materials. Here, we demonstrate a novel approach to passivate the surface of a versatile semiconducting oxide, zinc oxide (ZnO), evoking a self-assembly methodology. This is achieved via thermodynamic phase transformation, to passivate the surface of ZnO thin films with BeO nanoparticles. Our unique approach involves the use of BexZn1-xO (BZO) alloy as a starting material that ultimately yields the required coverage of secondary phase BeO nanoparticles, and prevents thermally-induced lattice dissociation and defect-mediated chemisorption, which are undesirable features observed at the surface of undoped ZnO. This approach to surface passivation will allow the use of semiconducting oxides in a variety of different electronic applications, while maintaining the inherent properties of the materials. PMID:26757827
Integrating uniform design and response surface methodology to optimize thiacloprid suspension
Li, Bei-xing; Wang, Wei-chang; Zhang, Xian-peng; Zhang, Da-xia; Mu, Wei; Liu, Feng
2017-01-01
A model 25% suspension concentrate (SC) of thiacloprid was adopted to evaluate an integrative approach of uniform design and response surface methodology. Tersperse2700, PE1601, xanthan gum and veegum were the four experimental factors, and the aqueous separation ratio and viscosity were the two dependent variables. Linear and quadratic polynomial models of stepwise regression and partial least squares were adopted to test the fit of the experimental data. Verification tests revealed satisfactory agreement between the experimental and predicted data. The measured values for the aqueous separation ratio and viscosity were 3.45% and 278.8 mPa·s, respectively, and the relative errors of the predicted values were 9.57% and 2.65%, respectively (prepared under the proposed conditions). Comprehensive benefits could also be obtained by appropriately adjusting the amount of certain adjuvants based on practical requirements. Integrating uniform design and response surface methodology is an effective strategy for optimizing SC formulas. PMID:28383036
Olkowska, Ewa; Polkowska, Żaneta; Namieśnik, Jacek
2013-11-15
A new analytical procedure for the simultaneous determination of individual cationic surfactants (alkyl benzyl dimethyl ammonium chlorides) in surface water samples has been developed. We describe this methodology for the first time: it involves the application of solid-phase extraction (SPE, for sample preparation) coupled with ion chromatography with conductivity detection (IC-CD, for the final determination). Mean recoveries of analytes between 79% and 93%, and overall method quantification limits in the range from 0.0018 to 0.038 μg/mL, were achieved for surface water and CRM samples. The methodology was applied to the determination of individual alkyl benzyl quaternary ammonium compounds in environmental samples (reservoir water) and enables their presence in such types of waters to be confirmed. In addition, it is simpler, less time-consuming and less labour-intensive, avoids the use of toxic chloroform, and is significantly less expensive than previously described approaches (liquid-liquid extraction coupled with liquid chromatography-mass spectrometry).
NASA Astrophysics Data System (ADS)
El Houda Thabet, Rihab; Combastel, Christophe; Raïssi, Tarek; Zolghadri, Ali
2015-09-01
The paper develops a set membership detection methodology which is applied to the detection of abnormal positions of aircraft control surfaces. Robust and early detection of such abnormal positions is an important issue for early system reconfiguration and overall optimisation of aircraft design. In order to improve fault sensitivity while ensuring a high level of robustness, the method combines a data-driven characterisation of noise and a model-driven approach based on interval prediction. The efficiency of the proposed methodology is illustrated through simulation results obtained based on data recorded in several flight scenarios of a highly representative aircraft benchmark.
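The following Python sketch illustrates the set-membership idea with a hypothetical first-order actuator model rather than the paper's aircraft benchmark: propagate an interval prediction of the control-surface position and flag a fault when the measurement leaves the predicted envelope.

    # A minimal sketch; model, bounds and measurements are hypothetical.
    dt, tau = 0.02, 0.5                 # sample time, nominal actuator time constant (s)
    a1, a2 = 1 - dt / (0.9 * tau), 1 - dt / (1.1 * tau)   # dynamics with uncertain time constant
    noise_bound = 0.15                  # deg, assumed bound on measurement noise

    def predict_interval(x_lo, x_hi, u):
        b = dt / tau
        candidates = [a * x for a in (a1, a2) for x in (x_lo, x_hi)]
        return min(candidates) + b * u - noise_bound, max(candidates) + b * u + noise_bound

    x_lo = x_hi = 0.0
    u = 5.0                              # commanded deflection (deg)
    measurements = [0.2, 0.4, 0.55, 3.0] # the last sample simulates a runaway surface
    for k, y in enumerate(measurements):
        x_lo, x_hi = predict_interval(x_lo, x_hi, u)
        fault = not (x_lo <= y <= x_hi)
        print(f"step {k}: predicted [{x_lo:.2f}, {x_hi:.2f}], measured {y:.2f}, fault={fault}")
        x_lo, x_hi = y - noise_bound, y + noise_bound   # re-initialize on the measurement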
Model for the Effect of Fiber Bridging on the Fracture Resistance of Reinforced-Carbon-Carbon
NASA Technical Reports Server (NTRS)
Chan, Kwai S.; Lee, Yi-Der; Hudak, Stephen J., Jr.
2009-01-01
A micromechanical methodology has been developed for analyzing fiber bridging and resistance-curve behavior in reinforced-carbon-carbon (RCC) panels with a three-dimensional (3D) composite architecture and a silicon carbide (SiC) surface coating. The methodology involves treating the fiber bridging traction on the crack surfaces in terms of a weight function approach and a bridging law that relates the bridging stress to the crack opening displacement. A procedure has been developed to deduce the material constants in the bridging law from the linear portion of the K-resistance curve. This report contains information on the application of these procedures and their outcomes.
D'Angelo, Sara; Staquicini, Fernanda I; Ferrara, Fortunato; Staquicini, Daniela I; Sharma, Geetanjali; Tarleton, Christy A; Nguyen, Huynh; Naranjo, Leslie A; Sidman, Richard L; Arap, Wadih; Bradbury, Andrew Rm; Pasqualini, Renata
2018-05-03
We developed a potentially novel and robust antibody discovery methodology, termed selection of phage-displayed accessible recombinant targeted antibodies (SPARTA). This combines an in vitro screening step of a naive human antibody library against known tumor targets, with in vivo selections based on tumor-homing capabilities of a preenriched antibody pool. This unique approach overcomes several rate-limiting challenges to generate human antibodies amenable to rapid translation into medical applications. As a proof of concept, we evaluated SPARTA on 2 well-established tumor cell surface targets, EphA5 and GRP78. We evaluated antibodies that showed tumor-targeting selectivity as a representative panel of antibody-drug conjugates (ADCs) and were highly efficacious. Our results validate a discovery platform to identify and validate monoclonal antibodies with favorable tumor-targeting attributes. This approach may also extend to other diseases with known cell surface targets and affected tissues easily isolated for in vivo selection.
Design Optimization Method for Composite Components Based on Moment Reliability-Sensitivity Criteria
NASA Astrophysics Data System (ADS)
Sun, Zhigang; Wang, Changxi; Niu, Xuming; Song, Yingdong
2017-08-01
In this paper, a Reliability-Sensitivity Based Design Optimization (RSBDO) methodology for the design of ceramic matrix composite (CMC) components is proposed. A practical and efficient method for the reliability analysis and sensitivity analysis of complex components with arbitrary distribution parameters is investigated by using the perturbation method, the response surface method, the Edgeworth series and a sensitivity analysis approach. The RSBDO methodology is then established by incorporating the sensitivity calculation model into the RBDO methodology. Finally, the proposed RSBDO methodology is applied to the design of CMC components. By comparison with Monte Carlo simulation, the numerical results demonstrate that the proposed methodology provides an accurate, convergent and computationally efficient method for reliability-analysis based finite element modeling in engineering practice.
Discrete and continuous dynamics modeling of a mass moving on a flexible structure
NASA Technical Reports Server (NTRS)
Herman, Deborah Ann
1992-01-01
A general discrete methodology for modeling the dynamics of a mass that moves on the surface of a flexible structure is developed. This problem was motivated by the Space Station/Mobile Transporter system. A model reduction approach is developed to make the methodology applicable to large structural systems. To validate the discrete methodology, continuous formulations are also developed. Three different systems are examined: (1) simply-supported beam, (2) free-free beam, and (3) free-free beam with two points of contact between the mass and the flexible beam. In addition to validating the methodology, parametric studies were performed to examine how the system's physical properties affect its dynamics.
A stochastic approach for automatic generation of urban drainage systems.
Möderl, M; Butler, D; Rauch, W
2009-01-01
Typically, performance evaluation of newly developed methodologies is based on one or more case studies. The investigation of multiple real-world case studies is tedious and time consuming. Moreover, extrapolating conclusions from individual investigations to a general basis is arguable and sometimes even wrong. In this article, a stochastic approach is presented to evaluate newly developed methodologies on a broader basis. For this approach, the Matlab tool "Case Study Generator" is developed, which automatically generates a variety of different virtual urban drainage systems using boundary conditions (e.g. length of the urban drainage system, slope of the catchment surface, etc.) as input. The layout of the sewer system is based on an adapted Galton-Watson branching process. The sub-catchments are allocated considering a digital terrain model. Sewer system components are designed according to standard values. In total, 10,000 different virtual case studies of urban drainage systems are generated and simulated. Consequently, simulation results are evaluated using a performance indicator for surface flooding. Comparison between results of the virtual and two real-world case studies indicates the promise of the method. The novelty of the approach is that it is possible to draw more general conclusions, in contrast to traditional evaluations with few case studies.
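A minimal Python sketch of the layout-generation step, with a hypothetical offspring distribution and depth limit: grow a tree-shaped sewer network from the outfall with a Galton-Watson branching process.

    # A minimal sketch; offspring probabilities and depth limit are hypothetical.
    import random

    random.seed(42)
    OFFSPRING = [0, 1, 2]              # possible number of upstream pipes per node
    PROBS = [0.3, 0.5, 0.2]            # hypothetical offspring probabilities
    MAX_DEPTH = 6

    def grow(node_id, depth, edges):
        if depth >= MAX_DEPTH:
            return
        n_children = random.choices(OFFSPRING, weights=PROBS, k=1)[0]
        for _ in range(n_children):
            child_id = len(edges) + 1
            edges.append((child_id, node_id))   # pipe from upstream child to parent
            grow(child_id, depth + 1, edges)

    edges = []
    grow(node_id=0, depth=0, edges=edges)       # node 0 is the outfall
    print(f"generated {len(edges)} pipes draining to the outfall")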
NASA Astrophysics Data System (ADS)
Whitehead, James Joshua
The analysis documented herein provides an integrated approach for the conduct of optimization under uncertainty (OUU) using Monte Carlo Simulation (MCS) techniques coupled with response surface-based methods for characterization of mixture-dependent variables. This novel methodology provides an innovative means of conducting optimization studies under uncertainty in propulsion system design. Analytic inputs are based upon empirical regression rate information obtained from design of experiments (DOE) mixture studies utilizing a mixed oxidizer hybrid rocket concept. Hybrid fuel regression rate was selected as the target response variable for optimization under uncertainty, with maximization of regression rate chosen as the driving objective. Characteristic operational conditions and propellant mixture compositions from experimental efforts conducted during previous foundational work were combined with elemental uncertainty estimates as input variables. Response surfaces for mixture-dependent variables and their associated uncertainty levels were developed using quadratic response equations incorporating single and two-factor interactions. These analysis inputs, response surface equations and associated uncertainty contributions were applied to a probabilistic MCS to develop dispersed regression rates as a function of operational and mixture input conditions within design space. Illustrative case scenarios were developed and assessed using this analytic approach including fully and partially constrained operational condition sets over all of design mixture space. In addition, optimization sets were performed across an operationally representative region in operational space and across all investigated mixture combinations. These scenarios were selected as representative examples relevant to propulsion system optimization, particularly for hybrid and solid rocket platforms. Ternary diagrams, including contour and surface plots, were developed and utilized to aid in visualization. The concept of Expanded-Durov diagrams was also adopted and adapted to this study to aid in visualization of uncertainty bounds. Regions of maximum regression rate and associated uncertainties were determined for each set of case scenarios. Application of response surface methodology coupled with probabilistic-based MCS allowed for flexible and comprehensive interrogation of mixture and operating design space during optimization cases. Analyses were also conducted to assess sensitivity of uncertainty to variations in key elemental uncertainty estimates. The methodology developed during this research provides an innovative optimization tool for future propulsion design efforts.
Estimating Agricultural Nitrous Oxide Emissions
USDA-ARS?s Scientific Manuscript database
Nitrous oxide emissions are highly variable in space and time and different methodologies have not agreed closely, especially at small scales. However, as scale increases, so does the agreement between estimates based on soil surface measurements (bottom up approach) and estimates derived from chang...
Thiol-ene mediated neoglycosylation of collagen patches: a preliminary study.
Russo, Laura; Battocchio, Chiara; Secchi, Valeria; Magnano, Elena; Nappini, Silvia; Taraballi, Francesca; Gabrielli, Luca; Comelli, Francesca; Papagni, Antonio; Costa, Barbara; Polzonetti, Giovanni; Nicotra, Francesco; Natalello, Antonino; Doglia, Silvia M; Cipolla, Laura
2014-02-11
Despite the relevance of carbohydrates as cues in eliciting specific biological responses, the covalent surface modification of collagen-based matrices with small carbohydrate epitopes has been scarcely investigated. We report thereby the development of an efficient procedure for the chemoselective neoglycosylation of collagen matrices (patches) via a thiol-ene approach, between alkene-derived monosaccharides and the thiol-functionalized material surface. Synchrotron radiation-induced X-ray photoelectron spectroscopy (SR-XPS), Fourier transform-infrared (FT-IR), and enzyme-linked lectin assay (ELLA) confirmed the effectiveness of the collagen neoglycosylation. Preliminary biological evaluation in osteoarthritic models is reported. The proposed methodology can be extended to any thiolated surface for the development of smart biomaterials for innovative approaches in regenerative medicine.
Multi-scale landslide hazard assessment: Advances in global and regional methodologies
NASA Astrophysics Data System (ADS)
Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Hong, Yang
2010-05-01
The increasing availability of remotely sensed surface data and precipitation provides a unique opportunity to explore how smaller-scale landslide susceptibility and hazard assessment methodologies may be applicable at larger spatial scales. This research first considers an emerging satellite-based global algorithm framework, which evaluates how the landslide susceptibility and satellite derived rainfall estimates can forecast potential landslide conditions. An analysis of this algorithm using a newly developed global landslide inventory catalog suggests that forecasting errors are geographically variable due to improper weighting of surface observables, resolution of the current susceptibility map, and limitations in the availability of landslide inventory data. These methodological and data limitation issues can be more thoroughly assessed at the regional level, where available higher resolution landslide inventories can be applied to empirically derive relationships between surface variables and landslide occurrence. The regional empirical model shows improvement over the global framework in advancing near real-time landslide forecasting efforts; however, there are many uncertainties and assumptions surrounding such a methodology that decreases the functionality and utility of this system. This research seeks to improve upon this initial concept by exploring the potential opportunities and methodological structure needed to advance larger-scale landslide hazard forecasting and make it more of an operational reality. Sensitivity analysis of the surface and rainfall parameters in the preliminary algorithm indicates that surface data resolution and the interdependency of variables must be more appropriately quantified at local and regional scales. Additionally, integrating available surface parameters must be approached in a more theoretical, physically-based manner to better represent the physical processes underlying slope instability and landslide initiation. Several rainfall infiltration and hydrological flow models have been developed to model slope instability at small spatial scales. This research investigates the potential of applying a more quantitative hydrological model to larger spatial scales, utilizing satellite and surface data inputs that are obtainable over different geographic regions. Due to the significant role that data and methodological uncertainties play in the effectiveness of landslide hazard assessment outputs, the methodology and data inputs are considered within an ensemble uncertainty framework in order to better resolve the contribution and limitations of model inputs and to more effectively communicate the model skill for improved landslide hazard assessment.
Greenhouse gas emissions from reservoir water surfaces: A new global synthesis
Collectively, reservoirs created by dams are thought to be an important source of greenhouse gases (GHGs) to the atmosphere. So far, efforts to quantify, model, and manage these emissions have been limited by data availability and inconsistencies in methodological approach. Here we ...
Greenhouse Gas Emissions from Reservoir Water Surfaces: A New Global Synthesis - journal
Collectively, reservoirs are an important anthropogenic source of greenhouse gases (GHGs) to the atmosphere. Attempts to model reservoir GHG fluxes, however, have been limited by inconsistencies in methodological approaches and data availability. An increase in the number of pu...
Lunar Surface Habitat Configuration Assessment: Methodology and Observations
NASA Technical Reports Server (NTRS)
Carpenter, Amanda
2008-01-01
The Lunar Habitat Configuration Assessment evaluated the major habitat approaches that were conceptually developed during the Lunar Architecture Team II Study. The objective of the configuration assessment was to identify desired features, operational considerations, and risks to derive habitat requirements. This assessment only considered operations pertaining to the lunar surface and did not consider all habitat conceptual designs developed. To examine multiple architectures, the Habitation Focus Element Team defined several adequate concepts, which warranted the need for a method to assess the various configurations. The fundamental requirement designed into each concept included the functional and operational capability to support a crew of four on a six-month lunar surface mission; however, other conceptual aspects were diverse in comparison. The methodology utilized for this assessment consisted of defining figures of merit, providing relevant information, and establishing a scoring system. In summary, the assessment considered the geometric configuration of each concept to determine the complexity of unloading, handling, mobility, leveling, aligning, mating to other elements, and accessibility to the lunar surface. In theory, the assessment was designed to derive habitat requirements and potential technology development needs, and to identify risks associated with living and working on the lunar surface. Although the results were subjective as opposed to objective, the assessment provided insightful observations for further assessments and trade studies of lunar surface habitats. This overall methodology and the resulting observations will be described in detail, and illustrative examples will be discussed.
NASA Astrophysics Data System (ADS)
Ketkar, Supriya; Lee, Junhan; Asokamani, Sen; Cho, Winston; Mishra, Shailendra
2018-03-01
This paper discusses the approach and solution adopted by GLOBALFOUNDRIES, a high volume manufacturing (HVM) foundry, for the dry-etch related edge-signature surface particle defect issue facing the sub-nm node in the gate-etch sector. It is one of the highest die killers for the company at the 14-nm node. We have used different approaches to attack and rectify the edge-signature surface particle defect. Several process-related and hardware changes have been successively implemented to achieve a 63% reduction in defects. Each systematic process and/or hardware approach has its own unique downstream issues, and these have been dealt with using a root-cause-effect technique to address the issue.
2007-04-01
optimization methodology we introduce. State-of-the-art protein-protein docking approaches start by identifying conformations with good surface/chemical com... side-chains on the interface). The protein-protein docking literature (e.g., [8] and the references therein) is predominantly treating the docking ...mations by various measures of surface complementarity which can be efficiently computed using fast Fourier correlation techniques (FFTs). However, when
Benchmarking a soil moisture data assimilation system for agricultural drought monitoring
USDA-ARS?s Scientific Manuscript database
Despite considerable interest in the application of land surface data assimilation systems (LDAS) for agricultural drought applications, relatively little is known about the large-scale performance of such systems and, thus, the optimal methodological approach for implementing them. To address this ...
A new approach to synthesis of benzyl cinnamate: Optimization by response surface methodology.
Zhang, Dong-Hao; Zhang, Jiang-Yan; Che, Wen-Cai; Wang, Yun
2016-09-01
In this work, the new approach to the synthesis of benzyl cinnamate by enzymatic esterification of cinnamic acid with benzyl alcohol is optimized by response surface methodology. The effects of various reaction conditions, including temperature, enzyme loading, substrate molar ratio of benzyl alcohol to cinnamic acid, and reaction time, are investigated. A 5-level-4-factor central composite design is employed to search for the optimal yield of benzyl cinnamate. A quadratic polynomial regression model is used to analyze the experimental data at a 95% confidence level (P<0.05). The coefficient of determination of this model is found to be 0.9851. Three sets of optimum reaction conditions are established, and verification experiments are performed to validate the optimum points. Under the optimum conditions (40°C, 31 mg/mL enzyme loading, 2.6:1 molar ratio, 27 h), the yield reaches 97.7%, which provides an efficient process for the industrial production of benzyl cinnamate.
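As an illustration of the final optimization step (with hypothetical coefficients, not the paper's fitted model), the following Python sketch maximizes a quadratic yield model in coded variables over the design cube.

    # A minimal sketch; the quadratic coefficients below are hypothetical.
    import numpy as np
    from scipy.optimize import minimize

    # y = b0 + b.x + x'Bx in coded units x = (temperature, enzyme loading, molar ratio, time)
    b0 = 90.0
    b = np.array([1.2, 2.0, 1.5, 0.8])
    B = np.array([[-1.0, 0.2, 0.0, 0.0],
                  [ 0.2, -1.5, 0.1, 0.0],
                  [ 0.0, 0.1, -0.8, 0.0],
                  [ 0.0, 0.0, 0.0, -0.5]])

    def neg_yield(x):
        return -(b0 + b @ x + x @ B @ x)

    res = minimize(neg_yield, x0=np.zeros(4), bounds=[(-2, 2)] * 4)
    print("optimal coded settings:", np.round(res.x, 2))
    print("predicted maximum yield: %.1f%%" % -res.fun)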
Improved Surface Parameter Retrievals using AIRS/AMSU Data
NASA Technical Reports Server (NTRS)
Susskind, Joel; Blaisdell, John
2008-01-01
The AIRS Science Team Version 5.0 retrieval algorithm became operational at the Goddard DAAC in July 2007, generating near real-time products from analysis of AIRS/AMSU sounding data. This algorithm contains many significant theoretical advances over the AIRS Science Team Version 4.0 retrieval algorithm used previously. Two very significant developments of Version 5 are: 1) the development and implementation of an improved Radiative Transfer Algorithm (RTA) which allows for accurate treatment of non-Local Thermodynamic Equilibrium (non-LTE) effects on shortwave sounding channels; and 2) the development of methodology to obtain very accurate case-by-case product error estimates which are in turn used for quality control. These theoretical improvements taken together enabled a new methodology to be developed which further improves soundings in partially cloudy conditions. In this methodology, longwave CO2 channel observations in the spectral region 700 cm(exp -1) to 750 cm(exp -1) are used exclusively for cloud clearing purposes, while shortwave CO2 channels in the spectral region 2195 cm(exp -1) to 2395 cm(exp -1) are used for temperature sounding purposes. This allows for accurate temperature soundings under more difficult cloud conditions. This paper further improves on the methodology used in Version 5 to derive surface skin temperature and surface spectral emissivity from AIRS/AMSU observations. Now, following the approach used to improve tropospheric temperature profiles, surface skin temperature is also derived using only shortwave window channels. This produces improved surface parameters, both day and night, compared to what was obtained in Version 5. These in turn result in improved boundary layer temperatures and retrieved total O3 burden.
Metamaterial bricks and quantization of meta-surfaces
Memoli, Gianluca; Caleap, Mihai; Asakawa, Michihiro; Sahoo, Deepak R.; Drinkwater, Bruce W.; Subramanian, Sriram
2017-01-01
Controlling acoustic fields is crucial in diverse applications such as loudspeaker design, ultrasound imaging and therapy or acoustic particle manipulation. The current approaches use fixed lenses or expensive phased arrays. Here, using a process of analogue-to-digital conversion and wavelet decomposition, we develop the notion of quantal meta-surfaces. The quanta here are small, pre-manufactured three-dimensional units—which we call metamaterial bricks—each encoding a specific phase delay. These bricks can be assembled into meta-surfaces to generate any diffraction-limited acoustic field. We apply this methodology to show experimental examples of acoustic focusing, steering and, after stacking single meta-surfaces into layers, the more complex field of an acoustic tractor beam. We demonstrate experimentally single-sided air-borne acoustic levitation using meta-layers at various bit-rates: from a 4-bit uniform to 3-bit non-uniform quantization in phase. This powerful methodology dramatically simplifies the design of acoustic devices and provides a key-step towards realizing spatial sound modulators. PMID:28240283
Pinheiro, Rubiane C; Soares, Cleide M F; de Castro, Heizir F; Moraes, Flavio F; Zanin, Gisella M
2008-03-01
The conditions for maximizing the enzymatic activity of lipase entrapped in a sol-gel matrix were determined for different vegetable oils using an experimental design. The effects of pH, temperature, and biocatalyst loading on lipase activity were verified using a central composite experimental design, leading to a set of 13 assays analyzed by response surface analysis. For canola oil and entrapped lipase, statistical analyses showed significant effects of pH and temperature, as well as of the interactions between pH and temperature and between temperature and biocatalyst loading. For olive oil and entrapped lipase, pH was the only statistically significant variable. This study demonstrated that response surface analysis is an appropriate methodology for maximizing the percentage of hydrolysis as a function of pH, temperature, and lipase loading.
Fan, HuiYin; Dumont, Marie-Josée; Simpson, Benjamin K
2017-11-01
Gelatin from salmon (Salmo salar) skin with high molecular weight protein chains (α-chains) was extracted using a trypsin-aided process. Response surface methodology was used to optimize the extraction parameters. Yield, hydroxyproline content and the protein electrophoretic profile of the gelatin, obtained via sodium dodecyl sulfate-polyacrylamide gel electrophoresis, were used as responses in the optimization study. The optimum conditions were determined as: trypsin concentration of 1.49 U/g; extraction temperature of 45 °C; and extraction time of 6 h 16 min. The response surface optimized model was significant and produced an experimental value (202.04 ± 8.64%) in good agreement with the predicted value (204.19%). Twofold higher yields of gelatin with high molecular weight protein chains were achieved in the optimized process with trypsin treatment when compared to the process without trypsin.
Modeling of electrohydrodynamic drying process using response surface methodology
Dalvand, Mohammad Jafar; Mohtasebi, Seyed Saeid; Rafiee, Shahin
2014-01-01
The energy consumption index is one of the most important criteria for judging new and emerging drying technologies. One such novel and promising alternative is electrohydrodynamic (EHD) drying. In this work, solar energy was used to supply the energy required by the EHD drying process. Moreover, response surface methodology (RSM) was used to build a predictive model to investigate the combined effects of independent variables such as applied voltage, field strength, number of discharge electrodes (needles), and air velocity on moisture ratio, energy efficiency, and energy consumption as responses of the EHD drying process. A three-level, four-factor Box–Behnken design was employed to evaluate the effects of the independent variables on the system responses. A stepwise approach was followed to build up a model that can map the entire response surface. The interrelationships between parameters were well captured by RSM. PMID:24936289
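The Box–Behnken design mentioned above has a simple structure: every pair of factors is run through the ±1 factorial combinations while the remaining factors stay at their center level, plus replicated center points. The sketch below builds such a three-level, four-factor design in coded units; the factor names come from the abstract, while the number of center points is an assumption.

```python
# Minimal sketch: construct a three-level, four-factor Box-Behnken design.
from itertools import combinations, product
import numpy as np

def box_behnken(n_factors: int, n_center: int = 3) -> np.ndarray:
    runs = []
    for i, j in combinations(range(n_factors), 2):      # every factor pair
        for a, b in product((-1, 1), repeat=2):          # +/-1 factorial combos
            row = [0] * n_factors
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0] * n_factors] * n_center                 # center replicates
    return np.array(runs, dtype=float)

design = box_behnken(4)            # 24 edge runs + 3 center points = 27 runs
factors = ["voltage", "field_strength", "n_electrodes", "air_velocity"]
print(design.shape)                # (27, 4)
print(dict(zip(factors, design[0])))
```

The coded levels are then mapped to physical ranges before the drying experiments are run and the response models fitted.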
Kim, Jae Kyeom; Lim, Ho-Jeong; Kim, Mi-So; Choi, Soo Jung; Kim, Mi-Jeong; Kim, Cho Rong; Shin, Dong-Hoon; Shin, Eui-Cheol
2016-01-01
Background: The central nervous system is easily damaged by oxidative stress due to high oxygen consumption and poor defensive capacity. Hence, multiple studies have demonstrated that inhibiting oxidative stress-induced damage, through an antioxidant-rich diet, might be a reasonable approach to preventing neurodegenerative disease. Objective: In the present study, response surface methodology was utilized to optimize the extraction of neuro-protective constituents of Camellia japonica byproducts. Materials and Methods: Rat pheochromocytoma cells were used to evaluate the protective potential of Camellia japonica byproducts. Results: Optimum conditions were 33.84 min, 75.24%, and 75.82°C for time, ethanol concentration and temperature, respectively. Further, we demonstrated that major organic acid contents were significantly impacted by the extraction conditions, which may explain the varying magnitude of protective potential between fractions. Conclusions: Given the paucity of information regarding defatted C. japonica seed cake and its health-promoting potential, our results provide interesting preliminary data for utilization of this byproduct of oil processing in both academic and industrial applications. SUMMARY: The neuro-protective potential of C. japonica seed cake on cell viability was affected by extraction conditions. Extraction conditions effectively influenced the active constituents of C. japonica seed cake. The biological activity of C. japonica seed cake was optimized by response surface methodology. Abbreviations used: GC-MS: Gas chromatography-mass spectrometer; MTT: 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyltetrazolium bromide; PC12 cells: Pheochromocytoma; RSM: Response surface methodology. PMID:27601847
USDA-ARS?s Scientific Manuscript database
Despite considerable interest in the application of land surface data assimilation systems (LDAS) for agricultural drought applications, relatively little is known about the large-scale performance of such systems and, thus, the optimal methodological approach for implementing them. To address this ...
Leading for Sustainability: Is Surface Understanding Enough?
ERIC Educational Resources Information Center
Pepper, Coral; Wildy, Helen
2008-01-01
Purpose: This paper aims to report an investigation of how education for sustainability is conceptualised, incorporated across the curriculum and led in three Western Australian Government secondary schools. It also reports on processes to enable education for sustainability to become embedded into these schools. Design/methodology/approach: Data…
NASA Astrophysics Data System (ADS)
Pinault, J.-L.; Berthier, F.
2007-01-01
We propose a methodological approach to characterize the resilience of aquatic ecosystems with respect to the evolution of environmental parameters, as well as their aptitude to adapt to forcings. This method, applied to Lake Annecy, France, proceeds in three stages. First, according to depth, variations of physicochemical parameters versus time are separated into three components related to (1) energy transfer through the surface of the lake, (2) the flow of rivers and springs that feed the lake, and (3) the long-term evolution of the benthic zone as a consequence of mineral and organic matter loads. Second, the dynamics of the lake are deduced by analyzing the physicochemical parameter components related to the three boundary conditions. Third, a stochastic process associated with the transfer models is used to characterize the resilience of the lake according to forcings. For Lake Annecy, whose dynamics are representative of oligotrophic stratified lakes controlled by decarbonation processes, with turnover and mixing occurring once a year in winter, the major consequence is the impoverishment of dissolved oxygen in deep water in autumn due to a temperature increase of the surface water in summer. The simulation raises relevant questions about whether a connection exists between physicochemical parameters and global warming, and whether it could induce harmful consequences for water quality and biodiversity in deep water. This methodological approach is general since it does not use any physical conceptual model to predict the hydrosystem behavior but works directly with observed data.
A new mathematical modeling approach for the energy of threonine molecule
NASA Astrophysics Data System (ADS)
Sahiner, Ahmet; Kapusuz, Gulden; Yilmaz, Nurullah
2017-07-01
In this paper, we propose an improved methodology for energy conformation problems, aimed at finding optimum energy values. First, we construct Bezier surfaces near local minimizers based on data obtained from Density Functional Theory (DFT) calculations. Second, we blend the constructed surfaces in order to obtain a single smooth model. Finally, we apply a global optimization algorithm to find the two torsion angles that minimize the energy of the molecule.
Surface chemistry at Swiss Universities of Applied Sciences.
Brodard, Pierre; Pfeifer, Marc E; Adlhart, Christian D; Pieles, Uwe; Shahgaldian, Patrick
2014-01-01
In the Swiss Universities of Applied Sciences, a number of research groups are involved in surface science, with different methodological approaches and a broad range of sophisticated characterization techniques. A snapshot of the current research going on in different groups from the University of Applied Sciences and Arts Western Switzerland (HES-SO), the Zurich University of Applied Sciences (ZHAW) and the University of Applied Sciences and Arts Northwestern Switzerland (FHNW) is given.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamada, Yuki; Grippo, Mark A.
2015-01-01
A monitoring plan that incorporates regional datasets and integrates cost-effective data collection methods is necessary to sustain the long-term environmental monitoring of utility-scale solar energy development in expansive, environmentally sensitive desert environments. Using very high spatial resolution (VHSR; 15 cm) multispectral imagery collected in November 2012 and January 2014, an image processing routine was developed to characterize ephemeral streams, vegetation, and land surface in the southwestern United States where increased utility-scale solar development is anticipated. In addition to knowledge about desert landscapes, the methodology integrates existing spectral indices and transformations (e.g., visible atmospherically resistant index and principal components); a newly developed index, the erosion resistance index (ERI); and digital terrain and surface models, all of which were derived from a common VHSR image. The methodology identified fine-scale ephemeral streams with greater detail than the National Hydrography Dataset and accurately estimated vegetation distribution and fractional cover of various surface types. The ERI classified surface types that have a range of erosive potentials. The remote-sensing methodology could ultimately reduce uncertainty and monitoring costs for all stakeholders by providing a cost-effective monitoring approach that accurately characterizes the land resources at potential development sites.
Space exploration initiative (SEI) logistics support lessons from the DoD
NASA Astrophysics Data System (ADS)
Cox, John R.; McCoy, Walbert G.; Jenkins, Terence
Proven and innovative logistics management approaches and techniques used for developing and supporting DoD and Strategic Defense Initiative Office (SDIO) systems are described on the basis of input from DoD to the SEI Synthesis Group; SDIO-developed logistics initiatives, innovative tools, and methodologies; and logistics planning support provided to the NASA/Johnson Planet Surface System Office. The approach is tailored for lunar/Martian surface operations, and provides guidelines for the development and management of a crucial element of the SEI logistics support program. A case study is presented which shows how incorporation of DoD's proven and innovative logistics management approach, tools, and techniques can substantially benefit early logistics planning for SEI, while also implementing many of DoD's recommendations for SEI.
NASA Technical Reports Server (NTRS)
Turc, Catalin; Anand, Akash; Bruno, Oscar; Chaubell, Julian
2011-01-01
We present a computational methodology (a novel Nystrom approach based on the use of a non-overlapping patch technique and Chebyshev discretizations) for efficient solution of problems of acoustic and electromagnetic scattering by open surfaces. Our integral equation formulations (1) incorporate, as ansatz, the singular nature of open-surface integral-equation solutions, and (2) for the Electric Field Integral Equation (EFIE), use analytical regularizers that effectively reduce the number of iterations required by iterative linear-algebra solution based on Krylov-subspace solvers.
Response surface methodology, often supported by factorial designs, is the classical experimental approach that is widely accepted for detecting and characterizing interactions among chemicals in a mixture. In an effort to reduce the experimental effort as the number of compound...
The Gender Subtext of Organizational Learning
ERIC Educational Resources Information Center
Raaijmakers, Stephan; Bleijenbergh, Inge; Fokkinga, Brigit; Visser, Max
2018-01-01
Purpose: This paper aims to challenge the alleged gender-neutral character of Argyris and Schön's theory of organizational learning (1978). While theories in organizational science seem gender neutral at the surface, a closer analysis reveals they are often based on men's experiences. Design/methodology/approach: This paper uses the method of…
USDA-ARS?s Scientific Manuscript database
Surface energy fluxes, especially the latent heat flux from evapotranspiration (ET), determine exchanges of energy and mass between the hydrosphere, atmosphere, and biosphere. There are numerous remote sensing-based energy balance approaches such as METRIC and SEBAL that use hot and cold pixels from...
What to Buy? The Role of Director of Defense Research and Engineering (DDR&E) Lessons from the 1970s
2011-01-01
Contents (excerpt): 1. Background, Methodology, and Approach; 2. Origins and …; 5. The Case of the 2000–3000 Ton Surface Effect Ship (SES) Prototype Program; 1. Strategic Background: The Antisubmarine Warfare Experience; 2. The Problem
Terrestrial Ecosystems - Land Surface Forms of the Conterminous United States
Cress, Jill J.; Sayre, Roger G.; Comer, Patrick; Warner, Harumi
2009-01-01
As part of an effort to map terrestrial ecosystems, the U.S. Geological Survey has generated land surface form classes to be used in creating maps depicting standardized, terrestrial ecosystem models for the conterminous United States, using an ecosystems classification developed by NatureServe. A biophysical stratification approach, developed for South America and now being implemented globally, was used to model the ecosystem distributions. Since land surface forms strongly influence the differentiation and distribution of terrestrial ecosystems, they are one of the key input layers in this biophysical stratification. After extensive investigation into various land surface form mapping methodologies, the decision was made to use the methodology developed by the Missouri Resource Assessment Partnership (MoRAP). MoRAP made modifications to Hammond's land surface form classification, which allowed the use of 30-meter source data and a 1-km² window for analyzing the data cell and its surrounding cells (neighborhood analysis). While Hammond's methodology was based on three topographic variables, slope, local relief, and profile type, MoRAP's methodology uses only slope and local relief. Using the MoRAP method, slope is classified as gently sloping when more than 50 percent of the area in a 1-km² neighborhood has slope less than 8 percent; otherwise the area is considered moderately sloping. Local relief, which is the difference between the maximum and minimum elevation in a neighborhood, is classified into five groups: 0-15 m, 16-30 m, 31-90 m, 91-150 m, and >150 m. The land surface form classes are derived by combining slope and local relief to create eight landform classes, ranging from flat plains (gently sloping with low local relief) to low mountains (not gently sloping with high local relief). However, in the USGS application of the MoRAP methodology, an additional local relief group was used (>400 m) to capture additional local topographic variation. As a result, low mountains were redefined as not gently sloping with local relief of 151-400 m. The final application of the MoRAP methodology was implemented using the USGS 30-meter National Elevation Dataset and an existing USGS slope dataset that had been derived by calculating the slope from the NED in Universal Transverse Mercator (UTM) coordinates in each UTM zone, and then combining all of the zones into a national dataset. This map shows a smoothed image of the nine land surface form classes based on MoRAP's methodology. Additional information about this map and any data developed for the ecosystems modeling of the conterminous United States is available online at http://rmgsc.cr.usgs.gov/ecosystems/.
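The neighborhood classification described above lends itself to a simple raster sketch: summarize slope and elevation over a moving window, flag gently sloping cells, bin local relief, and combine the two. The example below follows that recipe with synthetic grids standing in for the 30-m rasters; the final class-name mapping is a simplified stand-in, not the USGS legend.

```python
# Illustrative sketch of a MoRAP-style slope/relief neighborhood classification.
import numpy as np
from scipy.ndimage import uniform_filter, maximum_filter, minimum_filter

def land_surface_form(slope_pct, elevation, window=33):
    # Fraction of the ~1-km2 neighborhood (33 x 30 m cells) with slope < 8%.
    gentle_frac = uniform_filter((slope_pct < 8).astype(float), size=window)
    gently_sloping = gentle_frac > 0.5

    # Local relief: max minus min elevation within the same neighborhood.
    relief = maximum_filter(elevation, size=window) - minimum_filter(elevation, size=window)
    relief_class = np.digitize(relief, bins=[15, 30, 90, 150, 400])  # 0..5

    # Hypothetical combination rule: pair relief bin with the slope flag.
    return 2 * relief_class + (~gently_sloping).astype(int)

# Random grids standing in for 30 m slope (%) and elevation (m) rasters.
rng = np.random.default_rng(1)
slope = rng.gamma(2.0, 4.0, size=(200, 200))
elev = np.cumsum(rng.normal(0, 1, size=(200, 200)), axis=0)
print(np.unique(land_surface_form(slope, elev)))
```

A production workflow would instead read the NED and slope rasters, handle projection per UTM zone, and map the combined codes to the published landform legend.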
Versatile multi-functionalization of protein nanofibrils for biosensor applications
NASA Astrophysics Data System (ADS)
Sasso, L.; Suei, S.; Domigan, L.; Healy, J.; Nock, V.; Williams, M. A. K.; Gerrard, J. A.
2014-01-01
Protein nanofibrils offer advantages over other nanostructures due to the ease in their self-assembly and the versatility of surface chemistry available. Yet, an efficient and general methodology for their post-assembly functionalization remains a significant challenge. We introduce a generic approach, based on biotinylation and thiolation, for the multi-functionalization of protein nanofibrils self-assembled from whey proteins. Biochemical characterization shows the effects of the functionalization onto the nanofibrils' surface, giving insights into the changes in surface chemistry of the nanostructures. We show how these methods can be used to decorate whey protein nanofibrils with several components such as fluorescent quantum dots, enzymes, and metal nanoparticles. A multi-functionalization approach is used, as a proof of principle, for the development of a glucose biosensor platform, where the protein nanofibrils act as nanoscaffolds for glucose oxidase. Biotinylation is used for enzyme attachment and thiolation for nanoscaffold anchoring onto a gold electrode surface. Characterization via cyclic voltammetry shows an increase in glucose-oxidase mediated current response due to thiol-metal interactions with the gold electrode. The presented approach for protein nanofibril multi-functionalization is novel and has the potential of being applied to other protein nanostructures with similar surface chemistry. Electronic supplementary information (ESI) available: Cyclic voltammetry characterization of biosensor platforms including bare Au electrodes (Fig. S1), biosensor response to various glucose concentrations (Fig. S2), and AFM roughness measurements due to WPNF modifications (Fig. S3). See DOI: 10.1039/c3nr05752f
The evolution equation for the flame surface density in turbulent premixed combustion
NASA Technical Reports Server (NTRS)
Trouve, A.; Poinsot, T.
1992-01-01
One central ingredient in flamelet models for turbulent premixed combustion is the flame surface density. This quantity conveys most of the effects of the turbulence on the rate of energy release and is obtained via a modeled transport equation, called the Sigma-equation. Past theoretical work has produced a rigorous approach that leads to an exact, but unclosed, formulation for the turbulent Sigma-equation. In this exact Sigma-equation, it appears that the dynamical properties of the flame surface density are determined by a single parameter, namely the turbulent flame stretch. Unfortunately, the flame surface density and the turbulent flame stretch are not available from experiments and, in the absence of experimental data, little is known on the validity of the closure assumptions used in current flamelet models. Direct Numerical Simulation (DNS) is the obvious, complementary approach to get basic information on these fundamental quantities. Three-dimensional DNS of premixed flames in isotropic turbulent flow is used to estimate the different terms appearing in the Sigma-equation. A new methodology is proposed to provide the source and sink terms for the flame surface density, resolved both temporally and spatially throughout the turbulent flame brush. Using this methodology, the effects of the Lewis number on the rate of production of flame surface area are described in great detail and meaningful comparisons with flamelet models can be performed. The analysis reveals in particular the tendency of the models to overpredict flame surface dissipation as well as their inability to reproduce variations due to thermo-diffusive phenomena. Thanks to the detailed information produced by a DNS-based analysis, this type of comparison not only underscores the shortcomings of current models but also suggests ways to improve them.
USDA-ARS?s Scientific Manuscript database
It is desirable to be able to predict above ground biomass production indirectly, without extensive sampling or destructive harvesting. Leaf area index (LAI) is the amount of leaf surface area per ground area and is an important parameter in ecophysiology. As LAI increases, the photosynthetically ...
Shet, Vinayaka B; Palan, Anusha M; Rao, Shama U; Varun, C; Aishwarya, Uday; Raja, Selvaraj; Goveas, Louella Concepta; Vaman Rao, C; Ujwal, P
2018-02-01
In the current investigation, statistical approaches were adopted to hydrolyse non-edible seed cake (NESC) of Pongamia and optimize the hydrolysis process by response surface methodology (RSM). Through the RSM approach, the optimized conditions were found to be 1.17% v/v HCl concentration and 54.12 min of hydrolysis. Under optimized conditions, the release of reducing sugars was found to be 53.03 g/L. The RSM data were used to train an artificial neural network (ANN), and the predictive ability of both models was compared by calculating various statistical parameters. A three-layered ANN model with a 2:12:1 topology was developed; the response of the ANN model indicates that it is more precise than the RSM model. The fit of the models was expressed with the regression coefficient R², which was found to be 0.975 and 0.888 for the ANN and RSM models, respectively. This further demonstrated that the performance of the ANN was better than that of the RSM.
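The comparison above (a 2-input, 12-hidden, 1-output network against a quadratic RSM-style fit, scored by R²) can be sketched generically as below. The synthetic hydrolysis data and factor ranges are placeholders, not the study's measurements, and the network is a plain scikit-learn MLP rather than the authors' implementation.

```python
# Hedged sketch: 2:12:1 neural network vs quadratic polynomial fit on the same data.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler, PolynomialFeatures
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
acid = rng.uniform(0.5, 2.0, 60)        # HCl concentration, % v/v (assumed range)
time_min = rng.uniform(20, 90, 60)      # hydrolysis time, min (assumed range)
X = np.column_stack([acid, time_min])
y = 50 - 8*(acid - 1.2)**2 - 0.004*(time_min - 55)**2 + rng.normal(0, 1.5, 60)

ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(12,), max_iter=5000,
                                 random_state=0)).fit(X, y)
rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

print("ANN R^2:", round(r2_score(y, ann.predict(X)), 3))
print("RSM R^2:", round(r2_score(y, rsm.predict(X)), 3))
```

A fair comparison would also use held-out data, since a flexible network can overfit the small designed data sets typical of RSM studies.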
Yellapu, Sravan Kumar; Bezawada, Jyothi; Kaur, Rajwinder; Kuttiraja, Mathiazhakan; Tyagi, Rajeshwar D
2016-10-01
Lipid extraction from microbial biomass is a tedious and high-cost process. In the present study, detergent-assisted lipid extraction from the culture of the yeast Yarrowia lipolytica SKY-7 was carried out. Response surface methodology (RSM) was used to investigate the effect of three principal parameters (N-LS concentration, time and temperature) on microbial lipid extraction efficiency % (w/w). The results obtained by statistical analysis showed that the quadratic model fits in all cases. A maximum lipid recovery of 95.3±0.3% w/w was obtained at the optimum level of the process variables [N-LS concentration 24.42 mg (equal to 48 mg N-LS/g dry biomass), treatment time 8.8 min and reaction temperature 30.2°C]. In contrast, conventional chloroform-methanol extraction required 12 h at 60°C to achieve total lipid recovery. The study confirmed that treating oleaginous yeast biomass with N-lauroyl sarcosine would be a promising approach for industrial-scale microbial lipid recovery. Copyright © 2016 Elsevier Ltd. All rights reserved.
Methodologies for Removing/Desorbing and Transporting Particles from Surfaces to Instrumentation
NASA Astrophysics Data System (ADS)
Miller, Carla J.; Cespedes, Ernesto R.
2012-12-01
Explosive trace detection (ETD) continues to be a key technology supporting the fight against terrorist bombing threats. Very selective and sensitive ETD instruments have been developed to detect explosive threats concealed on personnel, in vehicles, in luggage, and in cargo containers, as well as for forensic analysis (e.g. post blast inspection, bomb-maker identification, etc.) in a broad range of homeland security, law enforcement, and military applications. A number of recent studies have highlighted the fact that significant improvements in ETD systems' capabilities will be achieved, not by increasing the selectivity/sensitivity of the sensors, but by improved techniques for particle/vapor sampling, pre-concentration, and transport to the sensors. This review article represents a compilation of studies focused on characterizing the adhesive properties of explosive particles, the methodologies for removing/desorbing these particles from a range of surfaces, and approaches for transporting them to the instrument. The objectives of this review are to summarize fundamental work in explosive particle characterization, to describe experimental work performed in harvesting and transport of these particles, and to highlight those approaches that indicate high potential for improving ETD capabilities.
Leroy, Frédéric; Müller-Plathe, Florian
2015-08-04
We introduce a methodology, referred to as the dry-surface method, to calculate the work of adhesion of heterogeneous solid-liquid interfaces by molecular simulation. This method employs a straightforward thermodynamic integration approach to calculate the work of adhesion as the reversible work to turn off the attractive part of the actual solid-liquid interaction potential. It is formulated in such a way that it may be used either to evaluate the ability of force fields to reproduce reference values of the work of adhesion or to optimize force-field parameters with reference values of the work of adhesion as target quantities. The methodology is tested in the case of water on a generic model of nonpolar substrates with the structure of gold. It is validated through a quantitative comparison to phantom-wall calculations and against a previous characterization of the thermodynamics of the gold-water interface. It is found that the work of adhesion of water on nonpolar substrates is a nonlinear function of the microscopic solid-liquid interaction energy parameter. We also comment on the ability of mean-field approaches to predict the work of adhesion of water on nonpolar substrates. In addition, we discuss in detail the information on the solid-liquid interfacial thermodynamics delivered by the phantom-wall approach. We show that phantom-wall calculations yield the solid-liquid interfacial tension relative to the solid surface tension rather than the absolute solid-liquid interfacial tension as previously believed.
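The thermodynamic-integration bookkeeping behind a dry-surface-type calculation reduces to integrating ensemble averages of dU/dλ over the coupling parameter that scales the attractive solid-liquid interaction. The sketch below shows only that post-processing step; the window averages, contact area and sign convention are illustrative placeholders, not values from the paper, and the averages themselves would come from separate MD runs.

```python
# Schematic sketch: integrate <dU/dlambda> from MD windows to a work of adhesion.
import numpy as np

area_nm2 = 25.0                               # solid-liquid contact area (assumed)
lam = np.linspace(0.0, 1.0, 11)               # coupling-parameter windows
# <dU/dlambda> in kJ/mol from each window -- illustrative values only.
dU_dlam = np.array([-310, -305, -298, -288, -276, -262,
                    -246, -228, -209, -188, -166], dtype=float)

work_turn_off = np.trapz(dU_dlam, lam)        # reversible work of switching off attraction
work_of_adhesion = -work_turn_off / area_nm2  # per unit area (sign convention assumed)
print(f"W_adh ~ {work_of_adhesion:.1f} kJ mol^-1 nm^-2")
```

The key practical choices, as in any thermodynamic integration, are the spacing of the λ windows and the convergence of each window average.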
Tactile surface classification for limbed robots using a pressure sensitive robot skin.
Shill, Jacob J; Collins, Emmanuel G; Coyle, Eric; Clark, Jonathan
2015-02-02
This paper describes an approach to terrain identification based on pressure images generated through direct surface contact using a robot skin constructed around a high-resolution pressure sensing array. Terrain signatures for classification are formulated from the magnitude frequency responses of the pressure images. The initial experimental results for statically obtained images show that the approach yields classification accuracies [Formula: see text]. The methodology is extended to accommodate the dynamic pressure images anticipated when a robot is walking or running. Experiments with a one-legged hopping robot yield similar identification accuracies [Formula: see text]. In addition, the accuracies are independent of changing robot dynamics (i.e., when using different leg gaits). The paper further shows that the high-resolution capabilities of the sensor enable similarly textured surfaces to be distinguished. A correcting filter is developed to accommodate the failures or faults that inevitably occur within the sensing array with continued use. Experimental results show that using the correcting filter can extend the effective operational lifespan of a high-resolution sensing array by over 6x in the presence of sensor damage. The results presented suggest this methodology can be extended to autonomous field robots, providing a robot with crucial information about the environment that can be used to aid stable and efficient mobility over rough and varying terrains.
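A generic version of the feature pipeline described above reduces each pressure image to a magnitude-spectrum signature and feeds those signatures to a classifier. The sketch below uses a radially averaged 2-D FFT magnitude and an SVM on synthetic stand-in images; image size, classes and the classifier choice are assumptions, not the paper's exact formulation.

```python
# Illustrative sketch: FFT-magnitude signatures of pressure images + SVM classifier.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def magnitude_signature(img, n_bins=16):
    """Radially averaged magnitude of the 2-D FFT of a pressure image."""
    mag = np.abs(np.fft.fftshift(np.fft.fft2(img)))
    cy, cx = np.array(img.shape) // 2
    yy, xx = np.indices(img.shape)
    r = np.hypot(yy - cy, xx - cx)
    bins = np.linspace(0, r.max(), n_bins + 1)
    return np.array([mag[(r >= lo) & (r < hi)].mean()
                     for lo, hi in zip(bins[:-1], bins[1:])])

# Synthetic stand-ins for pressure images of two terrain types.
rng = np.random.default_rng(0)
smooth = rng.normal(0, 1, (40, 32, 32))
rough = rng.normal(0, 1, (40, 32, 32)) + 2 * np.sin(np.arange(32))[None, None, :]

X = np.array([magnitude_signature(im) for im in np.concatenate([smooth, rough])])
y = np.array([0] * 40 + [1] * 40)
print("cross-validated accuracy:", cross_val_score(SVC(), X, y, cv=5).mean())
```

For dynamic images, the same signature would be computed per contact event and the classifier trained across gaits, mirroring the robustness test reported in the abstract.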
Bifunctional redox tagging of carbon nanoparticles
NASA Astrophysics Data System (ADS)
Poon, Jeffrey; Batchelor-McAuley, Christopher; Tschulik, Kristina; Palgrave, Robert G.; Compton, Richard G.
2015-01-01
Despite extensive work on the controlled surface modification of carbon with redox moieties, to date almost all available methodologies involve complex chemistry and are prone to the formation of polymerized multi-layer surface structures. Herein, the facile bifunctional redox tagging of carbon nanoparticles (diameter 27 nm) is undertaken and characterized using the industrial dye Reactive Blue 2. The modification route is demonstrated to proceed via exceptionally strong physisorption. The modified carbon is found to exhibit both well-defined oxidative and reductive voltammetric redox features, which are quantitatively interpreted. The method provides a generic approach to monolayer modification of carbon and carbon nanoparticle surfaces.
NASA Astrophysics Data System (ADS)
Adamu, Musa; Mohammed, Bashar S.; Shafiq, Nasir
2018-04-01
Roller compacted concrete (RCC) used for pavement is subjected to skidding/rubbing by the wheels of moving vehicles, which causes the pavement surface to wear and abrade. Therefore, abrasion resistance is one of the most important properties of concern for RCC pavement. In this study, response surface methodology was used to design, evaluate and analyze the effect of partial replacement of fine aggregate with crumb rubber, and the addition of nano silica, on the abrasion resistance of roller compacted rubbercrete (RCR). RCR is the terminology used for RCC pavement in which crumb rubber partially replaces the fine aggregate. The Box-Behnken design method was used to develop the mixture combinations using 10%, 20%, and 30% crumb rubber with 0%, 1%, and 2% nano silica. The Cantabro loss method was used to measure abrasion resistance. The results showed that the abrasion resistance of RCR decreases with increasing crumb rubber content, and increases with increasing nano silica addition. The analysis of variance shows that the model developed using response surface methodology (RSM) has a very good degree of correlation and can be used to predict the abrasion resistance of RCR with a percentage error of 5.44%. The combination of 10.76% crumb rubber and 1.59% nano silica yielded the best RCR in terms of abrasion resistance.
A boundary element method for Stokes flows with interfaces
NASA Astrophysics Data System (ADS)
Alinovi, Edoardo; Bottaro, Alessandro
2018-03-01
The boundary element method is a widely used and powerful technique to numerically describe multiphase flows with interfaces, satisfying Stokes' approximation. However, low viscosity ratios between immiscible fluids in contact at an interface and large surface tensions may lead to consistency issues as far as mass conservation is concerned. A simple and effective approach is described to ensure mass conservation at all viscosity ratios and capillary numbers within a standard boundary element framework. Benchmark cases are initially considered demonstrating the efficacy of the proposed technique in satisfying mass conservation, comparing with approaches and other solutions present in the literature. The methodology developed is finally applied to the problem of slippage over superhydrophobic surfaces.
NASA Astrophysics Data System (ADS)
Zhang, Hongjuan; Kurtz, Wolfgang; Kollet, Stefan; Vereecken, Harry; Franssen, Harrie-Jan Hendricks
2018-01-01
The linkage between root zone soil moisture and groundwater is either neglected or simplified in most land surface models. The fully-coupled subsurface-land surface model TerrSysMP including variably saturated groundwater dynamics is used in this work. We test and compare five data assimilation methodologies for assimilating groundwater level data via the ensemble Kalman filter (EnKF) to improve root zone soil moisture estimation with TerrSysMP. Groundwater level data are assimilated in the form of pressure head or soil moisture (set equal to porosity in the saturated zone) to update state vectors. In the five assimilation methodologies, the state vector contains either (i) pressure head, or (ii) log-transformed pressure head, or (iii) soil moisture, or (iv) pressure head for the saturated zone only, or (v) a combination of pressure head and soil moisture, pressure head for the saturated zone and soil moisture for the unsaturated zone. These methodologies are evaluated in synthetic experiments which are performed for different climate conditions, soil types and plant functional types to simulate various root zone soil moisture distributions and groundwater levels. The results demonstrate that EnKF cannot properly handle strongly skewed pressure distributions which are caused by extreme negative pressure heads in the unsaturated zone during dry periods. This problem can only be alleviated by methodology (iii), (iv) and (v). The last approach gives the best results and avoids unphysical updates related to strongly skewed pressure heads in the unsaturated zone. If groundwater level data are assimilated by methodology (iii), EnKF fails to update the state vector containing the soil moisture values if for (almost) all the realizations the observation does not bring significant new information. Synthetic experiments for the joint assimilation of groundwater levels and surface soil moisture support methodology (v) and show great potential for improving the representation of root zone soil moisture.
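The core operation shared by the five assimilation methodologies above is the ensemble Kalman filter update of a subsurface state vector with a groundwater-level observation. The sketch below shows a minimal stochastic EnKF update; the state layout, error statistics and observation operator are placeholders rather than the TerrSysMP configuration.

```python
# Minimal sketch: stochastic EnKF update of a layered subsurface state vector.
import numpy as np

def enkf_update(ens, obs, obs_err, H):
    """ens: (n_state, n_members); obs: scalar; H: (1, n_state) observation operator."""
    n = ens.shape[1]
    anomalies = ens - ens.mean(axis=1, keepdims=True)
    Hx = H @ ens                                        # simulated observations (1, n)
    Hanom = Hx - Hx.mean(axis=1, keepdims=True)
    P_hh = (Hanom @ Hanom.T) / (n - 1) + obs_err**2     # innovation covariance (1, 1)
    P_xh = (anomalies @ Hanom.T) / (n - 1)              # state-obs covariance (n_state, 1)
    K = P_xh / P_hh                                     # Kalman gain
    perturbed = obs + np.random.default_rng(0).normal(0, obs_err, (1, n))
    return ens + K @ (perturbed - Hx)                   # updated ensemble

rng = np.random.default_rng(1)
ensemble = rng.normal(-2.0, 0.5, size=(10, 64))         # 10 layers x 64 members (pressure head)
H = np.zeros((1, 10)); H[0, -1] = 1.0                   # observe the deepest (saturated) layer
updated = enkf_update(ensemble, obs=-1.2, obs_err=0.05, H=H)
print(updated[-1].mean())
```

The five methodologies compared in the study differ mainly in what the state vector contains (pressure head, its log transform, soil moisture, or a mixture), which changes how Gaussian the update assumptions are; the update algebra itself stays the same.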
2016-01-01
The standard analytical approach for studying steady gravity free-surface waves generated by a moving body often relies upon a linearization of the physical geometry, where the body is considered asymptotically small in one or several of its dimensions. In this paper, a methodology that avoids any such geometrical simplification is presented for the case of steady-state flows at low speeds. The approach is made possible through a reduction of the water-wave equations to a complex-valued integral equation that can be studied using the method of steepest descents. The main result is a theory that establishes a correspondence between different bluff-bodied free-surface flow configurations, with the topology of the Riemann surface formed by the steepest descent paths. Then, when a geometrical feature of the body is modified, a corresponding change to the Riemann surface is observed, and the resultant effects to the water waves can be derived. This visual procedure is demonstrated for the case of two-dimensional free-surface flow past a surface-piercing ship and over an angled step in a channel. PMID:27493559
Ceglie, Francesco Giovanni; Bustamante, Maria Angeles; Ben Amara, Mouna; Tittarelli, Fabio
2015-01-01
Peat replacement is an increasing demand in containerized and transplant production, due to the environmental constraints associated with peat use. However, despite the extensive information available on the use of alternative materials as substrates, it is very complex to establish the best materials and mixtures. This work evaluates the use of mixture design and response surface methodology in a peat substitution experiment using two alternative materials (green compost and palm fibre trunk waste) for transplant production of tomato (Lycopersicon esculentum Mill.), melon (Cucumis melo L.) and lettuce (Lactuca sativa L.) under organic farming conditions. In general, the substrates showed suitable properties for their use in seedling production, with the best plant response obtained for the mixture of 20% green compost, 39% palm fibre and 31% peat. The mixture design and the applied response surface methodology have been shown to be a useful approach to optimize substrate formulations in peat substitution experiments and to standardize plant responses. PMID:26070163
Surface Connectivity and Interocean Exchanges From Drifter-Based Transition Matrices
NASA Astrophysics Data System (ADS)
McAdam, Ronan; van Sebille, Erik
2018-01-01
Global surface transport in the ocean can be represented by using the observed trajectories of drifters to calculate probability distribution functions. The oceanographic applications of the Markov Chain approach to modeling include tracking of floating debris and water masses, globally and on yearly-to-centennial time scales. Here we analyze the error inherent with mapping trajectories onto a grid and the consequences for ocean transport modeling and detection of accumulation structures. A sensitivity analysis of Markov Chain parameters is performed in an idealized Stommel gyre and western boundary current as well as with observed ocean drifters, complementing previous studies on widespread floating debris accumulation. Focusing on two key areas of interocean exchange—the Agulhas system and the North Atlantic intergyre transport barrier—we assess the capacity of the Markov Chain methodology to detect surface connectivity and dynamic transport barriers. Finally, we extend the methodology's functionality to separate the geostrophic and nongeostrophic contributions to interocean exchange in these key regions.
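The transition-matrix idea summarized above is straightforward to prototype: map each drifter trajectory onto grid cells, count transitions over a fixed time lag, normalize rows to probabilities, and evolve a tracer distribution by repeated matrix multiplication. The sketch below uses toy random-walk trajectories and a tiny grid as placeholders for the drifter data set.

```python
# Conceptual sketch: drifter-style transition matrix and tracer evolution.
import numpy as np

n_cells = 25                                   # 5 x 5 toy grid, flattened
def to_cell(lon, lat):
    return int(lat) * 5 + int(lon)             # trivial binning for the sketch

# Toy trajectories: (lon, lat) positions at a fixed sampling interval.
rng = np.random.default_rng(0)
trajs = [np.clip(np.cumsum(rng.normal(0.3, 0.5, (30, 2)), axis=0) % 5, 0, 4.99)
         for _ in range(200)]

counts = np.zeros((n_cells, n_cells))
lag = 2                                        # transition time = 2 sampling intervals
for tr in trajs:
    cells = [to_cell(x, y) for x, y in tr]
    for a, b in zip(cells[:-lag], cells[lag:]):
        counts[a, b] += 1

row_sums = counts.sum(axis=1, keepdims=True)
P = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

tracer = np.full(n_cells, 1 / n_cells)         # uniform initial distribution
for _ in range(50):                            # evolve ~50 lags forward
    tracer = tracer @ P
print(tracer.reshape(5, 5).round(3))
```

Grid resolution and the choice of lag are exactly the sensitivity parameters the study examines, since coarse binning can blur dynamic transport barriers.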
Reversible on-surface wiring of resistive circuits.
Inkpen, Michael S; Leroux, Yann R; Hapiot, Philippe; Campos, Luis M; Venkataraman, Latha
2017-06-01
Whilst most studies in single-molecule electronics involve components first synthesized ex situ, there is also great potential in exploiting chemical transformations to prepare devices in situ. Here, as a first step towards this goal, we conduct reversible reactions on monolayers to make and break covalent bonds between alkanes of different lengths, then measure the conductance of these molecules connected between electrodes using the scanning tunneling microscopy-based break junction (STM-BJ) method. In doing so, we develop the critical methodology required for assembling and disassembling surface-bound single-molecule circuits. We identify effective reaction conditions for surface-bound reagents, and importantly demonstrate that the electronic characteristics of wires created in situ agree with those created ex situ. Finally, we show that the STM-BJ technique is unique in its ability to definitively probe surface reaction yields both on a local (∼50 nm²) and pseudo-global (≥10 mm²) level. This investigation thus highlights a route to the construction and integration of more complex, and ultimately functional, surface-based single-molecule circuitry, as well as advancing a methodology that facilitates studies beyond the reach of traditional ex situ synthetic approaches.
Development of a New Methodology for Computing Surface Sensible Heat Fluxes using Thermal Imagery
NASA Astrophysics Data System (ADS)
Morrison, T. J.; Calaf, M.; Fernando, H. J.; Price, T. A.; Pardyjak, E.
2017-12-01
Current numerical weather prediction models utilize similarity theory to characterize momentum, moisture, and heat fluxes. Such formulations are only valid under the ideal assumptions of spatial homogeneity, statistical stationarity, and zero subsidence. However, recent surface temperature measurements from the Mountain Terrain Atmospheric Modeling and Observations (MATERHORN) Program on the Salt Flats of Utah's West Desert show that, even under the most a priori ideal conditions, heterogeneity of the aforementioned variables exists. We present a new method to extract spatially distributed measurements of surface sensible heat flux from thermal imagery. The approach consists of using a surface energy budget, in which the ground heat flux is computed from limited measurements using a force-restore-type methodology, the latent heat fluxes are neglected, and the energy storage is computed using a lumped capacitance model. Preliminary validation of the method is presented using experimental data acquired from a nearby sonic anemometer during the MATERHORN campaign. Additional evaluation is required to confirm the method's validity. Further decomposition analysis of on-site instrumentation data (thermal camera, cold-hotwire probes, and sonic anemometers) using Proper Orthogonal Decomposition (POD) and wavelet analysis reveals time-scale similarity between the flow and surface fluctuations.
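Under the stated assumptions (latent heat neglected, ground heat flux available from a force-restore-type estimate, storage from a lumped capacitance model), the per-pixel sensible heat flux reduces to an energy-budget residual driven by the surface temperature tendency seen by the thermal camera. The sketch below is a hedged, generic version of that bookkeeping; all coefficients and input fields are placeholders, not the MATERHORN formulation or data.

```python
# Hedged sketch: per-pixel sensible heat flux as a surface energy-budget residual.
import numpy as np

def sensible_heat_flux(T_surf_now, T_surf_prev, dt, R_net, G, rho_c_d):
    """H (W m^-2) with latent heat assumed ~ 0; rho_c_d is a lumped heat capacity per area."""
    storage = rho_c_d * (T_surf_now - T_surf_prev) / dt   # lumped-capacitance storage term
    return R_net - G - storage

rng = np.random.default_rng(0)
T_prev = 300.0 + rng.normal(0, 0.5, (240, 320))   # surface temperature field (K), placeholder
T_now = T_prev + rng.normal(0.05, 0.02, T_prev.shape)
H = sensible_heat_flux(T_now, T_prev, dt=60.0,     # 1-min frame interval (assumed)
                       R_net=450.0, G=80.0, rho_c_d=2.0e4)
print(float(H.mean()), "W m^-2 (illustrative)")
```

Validation against an eddy-covariance estimate from a sonic anemometer, as in the abstract, is the natural check on both the neglected latent heat term and the lumped-capacitance coefficient.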
NASA Astrophysics Data System (ADS)
Bala, N.; Napiah, M.; Kamaruddin, I.; Danlami, N.
2018-04-01
In this study, modelling and optimization of polyethylene, polypropylene and nanosilica contents for nanocomposite modified asphalt mixtures was carried out to obtain the optimum quantities for higher fatigue life. Response surface methodology (RSM) based on a Box-Behnken design (BBD) was applied for the optimization. The interaction effects of the independent variables, polymers and nanosilica, on fatigue life were evaluated. The results indicate that the individual effects of polymer and nanosilica content are both important; however, the nanosilica content has a more significant effect on fatigue life resistance. Also, the mean error obtained from the optimization results is less than 5% for all responses, which indicates that the predicted values are in agreement with the experimental results. Furthermore, it was concluded that RSM optimization is a very effective approach for designing asphalt mixtures with high performance properties.
Yoo, Sung Jin; Park, Bong Seok
2017-09-06
This paper addresses a distributed connectivity-preserving synchronized tracking problem of multiple uncertain nonholonomic mobile robots with limited communication ranges. The information of the time-varying leader robot is assumed to be accessible to only a small fraction of follower robots. The main contribution of this paper is to introduce a new distributed nonlinear error surface for dealing with both the synchronized tracking and the preservation of the initial connectivity patterns among nonholonomic robots. Based on this nonlinear error surface, the recursive design methodology is presented to construct the approximation-based local adaptive tracking scheme at the robot dynamic level. Furthermore, a technical lemma is established to analyze the stability and the connectivity preservation of the total closed-loop control system in the Lyapunov sense. An example is provided to illustrate the effectiveness of the proposed methodology.
NASA Astrophysics Data System (ADS)
Hodge, R.; Brasington, J.; Richards, K.
2009-04-01
The ability to collect 3D elevation data at mm-resolution from in-situ natural surfaces, such as fluvial and coastal sediments, rock surfaces, soils and dunes, is beneficial for a range of geomorphological and geological research. From these data the properties of the surface can be measured, and Digital Terrain Models (DTM) can be constructed. Terrestrial Laser Scanning (TLS) can collect quickly such 3D data with mm-precision and mm-spacing. This paper presents a methodology for the collection and processing of such TLS data, and considers how the errors in this TLS data can be quantified. TLS has been used to collect elevation data from fluvial gravel surfaces. Data were collected from areas of approximately 1 m2, with median grain sizes ranging from 18 to 63 mm. Errors are inherent in such data as a result of the precision of the TLS, and the interaction of factors including laser footprint, surface topography, surface reflectivity and scanning geometry. The methodology for the collection and processing of TLS data from complex surfaces like these fluvial sediments aims to minimise the occurrence of, and remove, such errors. The methodology incorporates taking scans from multiple scanner locations, averaging repeat scans, and applying a series of filters to remove erroneous points. Analysis of 2.5D DTMs interpolated from the processed data has identified geomorphic properties of the gravel surfaces, including the distribution of surface elevations, preferential grain orientation and grain imbrication. However, validation of the data and interpolated DTMs is limited by the availability of techniques capable of collecting independent elevation data of comparable quality. Instead, two alternative approaches to data validation are presented. The first consists of careful internal validation to optimise filter parameter values during data processing combined with a series of laboratory experiments. In the experiments, TLS data were collected from a sphere and planes with different reflectivities to measure the accuracy and precision of TLS data of these geometrically simple objects. Whilst this first approach allows the maximum precision of TLS data from complex surfaces to be estimated, it cannot quantify the distribution of errors within the TLS data and across the interpolated DTMs. The second approach enables this by simulating the collection of TLS data from complex surfaces of a known geometry. This simulated scanning has been verified through systematic comparison with laboratory TLS data. Two types of surface geometry have been investigated: simulated regular arrays of uniform spheres used to analyse the effect of sphere size; and irregular beds of spheres with the same grain size distribution as the fluvial gravels, which provide a comparable complex geometry to the field sediment surfaces. A series of simulated scans of these surfaces has enabled the magnitude and spatial distribution of errors in the interpolated DTMs to be quantified, as well as demonstrating the utility of the different processing stages in removing errors from TLS data. As well as demonstrating the application of simulated scanning as a technique to quantify errors, these results can be used to estimate errors in comparable TLS data.
Toumi, Héla; Boumaiza, Moncef; Millet, Maurice; Radetski, Claudemir Marcos; Camara, Baba Issa; Felten, Vincent; Masfaraud, Jean-François; Férard, Jean-François
2018-04-19
We studied the combined acute effect (i.e., after 48 h) of deltamethrin (a pyrethroid insecticide) and malathion (an organophosphate insecticide) on Daphnia magna. Two approaches were used to examine the potential interaction effects of eight mixtures of deltamethrin and malathion: (i) calculation of the mixture toxicity index (MTI) and safety factor index (SFI) and (ii) response surface methodology coupled with an isobole-based statistical model (using a generalized linear model). According to the MTI and SFI calculations, one tested mixture was found to be additive while the two other tested mixtures were found to be non-additive (MTI) or antagonistic (SFI), but these differences between index responses are only due to differences in terminology related to the two indexes. Through the response surface approach and isobologram analysis, we concluded that there is a significant antagonistic effect of binary mixtures of deltamethrin and malathion on D. magna immobilization after 48 h of exposure. The index approaches and the response surface approach with isobologram analysis are complementary. Calculation of the mixture toxicity index and safety factor index identifies the type of interaction for each tested mixture individually, while the response surface approach with isobologram analysis integrates all the data, providing a global outcome about the type of interactive effect. Only the response surface approach with isobologram analysis allowed a statistical assessment of the ecotoxicological interaction. Nevertheless, we recommend the use of both approaches (i) to identify the combined effects of contaminants and (ii) to improve risk assessment and environmental management.
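A generic version of the isobole-type GLM test mentioned above models immobilization counts with a binomial generalized linear model in the two concentrations plus their interaction term; a significant interaction coefficient points away from simple additivity. The data below are invented placeholders (including units), not the Daphnia magna results, and the model is a plain statsmodels GLM rather than the authors' exact specification.

```python
# Sketch: binomial GLM with an interaction term as an isobole-type additivity test.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
delta = np.repeat([0.0, 0.1, 0.2, 0.4], 8)          # deltamethrin (assumed units)
mala = np.tile([0.0, 0.5, 1.0, 2.0], 8)             # malathion (assumed units)
n = 20                                               # daphnids per beaker (assumed)
logit = -2 + 6*delta + 1.2*mala - 3*delta*mala       # toy antagonistic surface
p = 1 / (1 + np.exp(-logit))
immobile = rng.binomial(n, p)

endog = np.column_stack([immobile, n - immobile])    # (successes, failures)
exog = sm.add_constant(np.column_stack([delta, mala, delta * mala]))
fit = sm.GLM(endog, exog, family=sm.families.Binomial()).fit()
print(fit.params)        # last coefficient = interaction term
print(fit.pvalues[-1])   # interaction p-value (negative & significant -> antagonism here)
```

The MTI and SFI indices, by contrast, are computed mixture by mixture from observed and predicted EC50-type values, which is why the two approaches answer slightly different questions.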
Marsic, Damien; Govindasamy, Lakshmanan; Currlin, Seth; Markusic, David M; Tseng, Yu-Shan; Herzog, Roland W; Agbandje-McKenna, Mavis; Zolotukhin, Sergei
2014-01-01
Methodologies to improve existing adeno-associated virus (AAV) vectors for gene therapy include either rational approaches or directed evolution to derive capsid variants characterized by superior transduction efficiencies in targeted tissues. Here, we integrated both approaches in one unified design strategy of “virtual family shuffling” to derive a combinatorial capsid library whereby only variable regions on the surface of the capsid are modified. Individual sublibraries were first assembled in order to preselect compatible amino acid residues within restricted surface-exposed regions to minimize the generation of dead-end variants. Subsequently, the successful families were interbred to derive a combined library of ~8 × 10^5 complexity. Next-generation sequencing of the packaged viral DNA revealed capsid surface areas susceptible to directed evolution, thus providing guidance for future designs. We demonstrated the utility of the library by deriving an AAV2-based vector characterized by a 20-fold higher transduction efficiency in murine liver, now equivalent to that of AAV8. PMID:25048217
System-morphological approach: Another look at morphology research and geomorphological mapping
NASA Astrophysics Data System (ADS)
Lastochkin, Alexander N.; Zhirov, Andrey I.; Boltramovich, Sergei F.
2018-02-01
A large number of studies require a clear and unambiguous morphological basis. For over thirty years, Russian scientists have been applying a system-morphological approach to Arctic and Antarctic research, ocean floor investigation, various infrastructure construction projects (oil and gas, sports, etc.), and landscape and environmental studies. This article is a review aimed at introducing this methodological approach to the international scientific community. The details of the methods and techniques can be found in a series of earlier papers published in Russian between 1987 and 2016. The proposed system-morphological approach includes: 1) partitioning of the Earth surface, i.e. precise identification of linear, point, and areal elements of topography considered as a two-dimensional surface without any geological substance; 2) further identification of larger formations: geomorphological systems and regions; 3) analysis of structural relations and symmetry of topography; and 4) various dynamic (litho- and glaciodynamic, tectonic, etc.) interpretations of the observed morphology. This method can be used to study the morphology of the surface topography as well as less accessible interfaces such as submarine and subglacial ones.
ERIC Educational Resources Information Center
Hanyak, Michael E., Jr.
2015-01-01
In an introductory chemical engineering course, the conceptual framework of a holistic problem-solving methodology in conjunction with a problem-based learning approach has been shown to create a learning environment that nurtures deep learning rather than surface learning. Based on exam scores, student grades are either the same or better than…
NASA Astrophysics Data System (ADS)
Francisco, Arthur; Blondel, Cécile; Brunetière, Noël; Ramdarshan, Anusha; Merceron, Gildas
2018-03-01
Tooth wear and, more specifically, dental microwear texture is a dietary proxy that has been used for years in vertebrate paleoecology and ecology. DMTA, dental microwear texture analysis, relies on a few parameters related to the surface complexity, anisotropy and heterogeneity of the enamel facets at the micrometric scale. Working with few but physically meaningful parameters helps in comparing published results and in defining levels for classification purposes. Other dental microwear approaches are based on ISO parameters coupled with statistical tests to find the most relevant ones. The present study draws on most of the aforementioned parameters in more or less modified form. More than the parameters themselves, however, we propose a new approach: instead of a single parameter characterizing the whole surface, we sample the surface and thus generate 9 derived parameters in order to broaden the parameter set. The identification of the most discriminative parameters is performed with an automated procedure, which is an extended and refined version of the workflows encountered in some studies. The procedure in its initial form includes the most common tools, such as ANOVA and correlation analysis, along with the required mathematical tests. The discrimination results show that a simplified form of the procedure is able to identify the desired number of discriminative parameters more efficiently. Also highlighted are some trends, such as the relevance of working with both height and spatial parameters, as well as the potential benefits of dimensionless surfaces. On a set of 45 surfaces from 45 specimens of three modern ruminants with differences in feeding preferences (grazing, leaf-browsing and fruit-eating), it is clearly shown that the level of wear discrimination is improved with the new methodology compared to the other ones.
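The screening loop described above (rank candidate parameters by how well they separate dietary groups, then prune redundant, highly correlated ones) can be sketched generically as below. The synthetic data, the number of retained parameters and the correlation threshold are assumptions for illustration, not the study's actual workflow settings.

```python
# Simplified sketch: ANOVA ranking plus correlation pruning of texture parameters.
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(0)
groups = np.repeat([0, 1, 2], 15)                       # grazer / browser / frugivore
params = rng.normal(0, 1, (45, 20))                     # 20 candidate parameters
params[:, 0] += groups * 1.5                            # make a few parameters informative
params[:, 1] -= groups * 1.0
params[:, 2] = params[:, 0] * 0.95 + rng.normal(0, 0.1, 45)  # redundant with parameter 0

def select_parameters(X, g, n_keep=3, max_corr=0.8):
    pvals = np.array([f_oneway(*(X[g == k, j] for k in np.unique(g))).pvalue
                      for j in range(X.shape[1])])
    order = np.argsort(pvals)                           # most discriminative first
    kept = []
    for j in order:
        if all(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) < max_corr for k in kept):
            kept.append(j)
        if len(kept) == n_keep:
            break
    return kept, pvals

selected, pvals = select_parameters(params, groups)
print("selected parameters:", selected, "p-values:", pvals[selected].round(4))
```

With only 45 specimens, the distributional checks the authors mention (normality, variance homogeneity) matter before trusting the ANOVA ranking.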
NASA Astrophysics Data System (ADS)
Kirkham, R.; Olsen, K.; Hayes, J. C.; Emer, D. F.
2013-12-01
Underground nuclear tests may first be detected by seismic sensors or air samplers operated by the CTBTO (Comprehensive Nuclear-Test-Ban Treaty Organization). After initial detection of a suspicious event, member nations may call for an On-Site Inspection (OSI) that, in part, will sample for localized releases of radioactive noble gases and particles. Although much of the commercially available equipment and methods used for surface and subsurface environmental sampling of gases can be used in an OSI scenario, on-site sampling conditions, required sampling volumes and the establishment of background concentrations of noble gases require the development of specialized methodologies. To facilitate development of sampling equipment and methodologies that address OSI sampling volume and detection objectives, and to collect information required for model development, a field test site was created at a former underground nuclear explosion site located in welded volcanic tuff. A mixture of SF6, Xe-127 and Ar-37 was metered into 4400 m³ of air as it was injected into the top region of the UNE cavity. These tracers were expected to move towards the surface primarily in response to barometric pumping or through delayed cavity pressurization (accelerated transport to minimize source decay time). Sampling approaches compared during the field exercise included sampling at the soil surface, inside surface fractures, and at soil vapor extraction points at depths down to 2 m. The effectiveness of the various sampling approaches and the results of the tracer gas measurements will be presented.
Analytical Round Robin for Elastic-Plastic Analysis of Surface Cracked Plates: Phase I Results
NASA Technical Reports Server (NTRS)
Wells, D. N.; Allen, P. A.
2012-01-01
An analytical round robin for the elastic-plastic analysis of surface cracks in flat plates was conducted with 15 participants. Experimental results from a surface crack tension test in 2219-T8 aluminum plate provided the basis for the inter-laboratory study (ILS). The study proceeded in a blind fashion given that the analysis methodology was not specified to the participants, and key experimental results were withheld. This approach allowed the ILS to serve as a current measure of the state of the art for elastic-plastic fracture mechanics analysis. The analytical results and the associated methodologies were collected for comparison, and sources of variability were studied and isolated. The results of the study revealed that the J-integral analysis methodology using the domain integral method is robust, providing reliable J-integral values without being overly sensitive to modeling details. General modeling choices such as analysis code, model size (mesh density), crack tip meshing, or boundary conditions, were not found to be sources of significant variability. For analyses controlled only by far-field boundary conditions, the greatest source of variability in the J-integral assessment is introduced through the constitutive model. This variability can be substantially reduced by using crack mouth opening displacements to anchor the assessment. Conclusions provide recommendations for analysis standardization.
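For background, the contour form of the J-integral and the equivalent domain integral evaluated by such analyses are, in standard notation (textbook relations, not results specific to this round robin):

```latex
J \;=\; \int_{\Gamma}\!\Big( W\,\mathrm{d}y \;-\; T_i\,\frac{\partial u_i}{\partial x}\,\mathrm{d}s \Big),
\qquad
J \;=\; \int_{A}\Big( \sigma_{ij}\,\frac{\partial u_j}{\partial x_1} \;-\; W\,\delta_{1i} \Big)\frac{\partial q}{\partial x_i}\,\mathrm{d}A ,
```

where W is the strain energy density, T_i the traction vector, u_i the displacements, x (or x_1) the direction of crack advance, and q a smooth weight function equal to 1 at the crack tip and 0 on the outer boundary of the integration domain.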
Jana, Arijit; Maity, Chiranjit; Halder, Suman Kumar; Mondal, Keshab Chandra; Pati, Bikash Ranjan; Mohapatra, Pradeep Kumar Das
2012-07-01
Tannase production by newly isolated Penicillium purpurogenum PAF6 was investigated by the 'one variable at a time' (OVAT) approach followed by response surface methodology (RSM). Tannin-rich plant residues were used as supporting solid substrate and sole carbon source and, among them, tamarind seed was found to be a more favorable substrate than haritaki, pomegranate, tea leaf waste and arjun fruit. Physicochemical parameters were initially optimized using the OVAT methodology, and important factors such as incubation time, incubation temperature and substrate:moisture ratio, as well as carbon, nitrogen and phosphate concentrations, were verified with a Box-Behnken design of response surface methodology. Phosphate source, nitrogen source and temperature were found to be the most influential variables in the maximization of production. Tannase production on tamarind seed was enhanced from 1.536 U/g to 5.784 U/g by OVAT optimization and further to 6.15 U/g following RSM. Overall, 3.76- and 4.0-fold increases in tannase production were achieved with OVAT and RSM, respectively.
Visualization of DNA and Protein-DNA Complexes with Atomic Force Microscopy
Lyubchenko, Yuri L.; Gall, Alexander A.; Shlyakhtenko, Luda S.
2014-01-01
This article describes sample preparation techniques for AFM imaging of DNA and protein–DNA complexes. The approach is based on chemical functionalization of the mica surface with aminopropyl silatrane (APS) to yield an APS-mica surface. This surface binds nucleic acids and nucleoprotein complexes in a wide range of ionic strengths, in the absence of divalent cations, and in a broad range of pH. The chapter describes the methodologies for the preparation of APS-mica surfaces and the preparation of samples for AFM imaging. The protocol for synthesis and purification of APS is also provided. The AFM applications are illustrated with examples of images of DNA and protein–DNA complexes. PMID:24357372
Enhancement in photoluminescence performance of carbon-decorated T-ZnO
NASA Astrophysics Data System (ADS)
Jian, Xian; Chen, Guozhang; Wang, Chao; Yin, Liangjun; Li, Gang; Yang, Ping; Chen, Lei; Xu, Bao; Gao, Yang; Feng, Yanyu; Tang, Hui; Luan, Chunhong; Liang, Yinglin; Jiang, Jing; Cao, Yu; Wang, Siyuan; Gao, Xin
2015-03-01
The facile preparation of ZnO possessing high visible luminescence intensity remains challenging due to an unclear luminescence mechanism. Here, two basic approaches are proposed to enhance the luminescence intensity based on a theoretical analysis of surface defects. Based on this deduction, we introduce a methodology for obtaining hybrid tetrapod-like zinc oxide (T-ZnO) decorated with carbon nanomaterials on the T-ZnO surfaces through a catalytic chemical vapor deposition approach. The intensity of the T-ZnO green emission can be modulated by the topography and the proportion of carbon. Under proper experimental conditions, carbon decoration leads to dramatically enhanced luminescence intensity of T-ZnO from 400 to 700 nm compared with the undecorated material, which makes this approach a simple and effective route to improving fluorescent materials for practical applications.
Milton, James A.; Patole, Samson; Yin, Huabing; Xiao, Qiang; Brown, Tom; Melvin, Tracy
2013-01-01
Although strategies for the immobilization of DNA oligonucleotides onto surfaces for bioanalytical and top-down bio-inspired nanobiofabrication approaches are well developed, the effect of introducing spacer molecules between the surface and the DNA oligonucleotide for the hybridization of nanoparticle–DNA conjugates has not been previously assessed in a quantitative manner. The hybridization efficiency of DNA oligonucleotides end-labelled with gold nanoparticles (1.4 or 10 nm diameter) with DNA sequences conjugated to silicon surfaces via hexaethylene glycol phosphate diester oligomer spacers (0, 1, 2, 6 oligomers) was found to be independent of spacer length. To quantify both the density of DNA strands attached to the surfaces and hybridization with the surface-attached DNA, new methodologies have been developed. Firstly, a simple approach based on fluorescence has been developed for determination of the immobilization density of DNA oligonucleotides. Secondly, an approach using mass spectrometry has been created to establish (i) the mean number of DNA oligonucleotides attached to the gold nanoparticles and (ii) the hybridization density of nanoparticle–oligonucleotide conjugates with the silicon surface–attached complementary sequence. These methods and results will be useful for application with nanosensors, the self-assembly of nanoelectronic devices and the attachment of nanoparticles to biomolecules for single-molecule biophysical studies. PMID:23361467
System Synthesis in Preliminary Aircraft Design using Statistical Methods
NASA Technical Reports Server (NTRS)
DeLaurentis, Daniel; Mavris, Dimitri N.; Schrage, Daniel P.
1996-01-01
This paper documents an approach to conceptual and preliminary aircraft design in which system synthesis is achieved using statistical methods, specifically design of experiments (DOE) and response surface methodology (RSM). These methods are employed in order to more efficiently search the design space for optimum configurations. In particular, a methodology incorporating three uses of these techniques is presented. First, response surface equations are formed which represent aerodynamic analyses, in the form of regression polynomials, which are more sophisticated than those generally available in early design stages. Next, a regression equation for an overall evaluation criterion is constructed for the purpose of constrained optimization at the system level. This optimization, though achieved in an innovative way, is still traditional in that it is a point design solution. The methodology put forward here remedies this by introducing uncertainty into the problem, resulting in solutions that are probabilistic in nature. DOE/RSM is used for the third time in this setting. The process is demonstrated through a detailed aero-propulsion optimization of a high speed civil transport. Fundamental goals of the methodology, then, are to introduce higher fidelity disciplinary analyses to the conceptual aircraft synthesis and provide a roadmap for transitioning from point solutions to probabilistic designs (and eventually robust ones).
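As an illustration of the regression-polynomial step only (not the authors' aero-propulsion surrogates), the sketch below fits a full quadratic response surface to a small set of DOE runs in two coded design variables and evaluates it at a candidate point; the design and response values are hypothetical.

```python
# Minimal sketch: fit a quadratic response surface
# y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2 to DOE data (toy example).
import numpy as np

def quad_features(x1, x2):
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

# Hypothetical 3^2 factorial design in coded units and a toy response.
x1, x2 = np.meshgrid([-1.0, 0.0, 1.0], [-1.0, 0.0, 1.0])
x1, x2 = x1.ravel(), x2.ravel()
y = 5.0 + 1.2 * x1 - 0.8 * x2 + 0.5 * x1**2 + 0.3 * x1 * x2  # stand-in for analysis output

beta, *_ = np.linalg.lstsq(quad_features(x1, x2), y, rcond=None)

def predict(x1, x2):
    return quad_features(np.atleast_1d(x1), np.atleast_1d(x2)) @ beta

print("coefficients:", np.round(beta, 3))
print("prediction at (0.5, -0.5):", predict(0.5, -0.5))
```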
An Elliptic PDE Approach for Shape Characterization
Haidar, Haissam; Bouix, Sylvain; Levitt, James; McCarley, Robert W.; Shenton, Martha E.; Soul, Janet S.
2009-01-01
This paper presents a novel approach to analyze the shape of anatomical structures. Our methodology is rooted in classical physics and in particular Poisson's equation, a fundamental partial differential equation [1]. The solution to this equation and more specifically its equipotential surfaces display properties that are useful for shape analysis. We present a numerical algorithm to calculate the length of streamlines formed by the gradient field of the solution to this equation for 2D and 3D objects. The length of the streamlines along the equipotential surfaces was used to build a new function which can characterize the shape of objects. We illustrate our method on 2D synthetic and natural shapes as well as 3D medical data. PMID:17271986
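A minimal numerical sketch of the underlying idea, on a synthetic 2D binary shape, is shown below: Poisson's equation with a unit source is solved inside the object with zero boundary values by Jacobi iteration, and streamline lengths are estimated by following the normalized gradient from interior points to the boundary. Grid size, iteration count, and the test shape are arbitrary choices, and this is not the authors' algorithm.

```python
# Sketch: Poisson-based shape descriptor on a binary mask (toy example).
import numpy as np

n = 64
yy, xx = np.mgrid[0:n, 0:n]
mask = (xx - n / 2) ** 2 / 28**2 + (yy - n / 2) ** 2 / 14**2 < 1.0  # elliptical "shape"

# Solve laplacian(u) = -1 inside the mask, with u = 0 outside (Jacobi iteration).
u = np.zeros((n, n))
for _ in range(5000):
    u_new = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                    + np.roll(u, 1, 1) + np.roll(u, -1, 1) + 1.0)
    u = np.where(mask, u_new, 0.0)

gy, gx = np.gradient(u)

def streamline_length(y, x, step=0.25, max_steps=2000):
    """Follow -grad(u) from an interior point until leaving the shape; return path length."""
    length = 0.0
    for _ in range(max_steps):
        iy, ix = int(round(y)), int(round(x))
        if not (0 <= iy < n and 0 <= ix < n) or not mask[iy, ix]:
            break
        g = np.array([gy[iy, ix], gx[iy, ix]])
        norm = np.linalg.norm(g)
        if norm < 1e-12:
            break
        y, x = y - step * g[0] / norm, x - step * g[1] / norm
        length += step
    return length

print("streamline length from an interior point:", streamline_length(n / 2 + 4, n / 2 + 4))
```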
Sparse reconstruction of liver cirrhosis from monocular mini-laparoscopic sequences
NASA Astrophysics Data System (ADS)
Marcinczak, Jan Marek; Painer, Sven; Grigat, Rolf-Rainer
2015-03-01
Mini-laparoscopy is a technique which is used by clinicians to inspect the liver surface with ultra-thin laparoscopes. However, so far no quantitative measures based on mini-laparoscopic sequences are possible. This paper presents a Structure from Motion (SfM) based methodology to do 3D reconstruction of liver cirrhosis from mini-laparoscopic videos. The approach combines state-of-the-art tracking, pose estimation, outlier rejection and global optimization to obtain a sparse reconstruction of the cirrhotic liver surface. Specular reflection segmentation is included into the reconstruction framework to increase the robustness of the reconstruction. The presented approach is evaluated on 15 endoscopic sequences using three cirrhotic liver phantoms. The median reconstruction accuracy ranges from 0.3 mm to 1 mm.
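The abstract does not detail the implementation, but the core of such an SfM pipeline (feature matching, RANSAC-based essential-matrix estimation, pose recovery, and triangulation) can be sketched for two views with OpenCV as below; the intrinsic matrix and image files are placeholders, and the tracking and global optimization stages of the paper are not reproduced.

```python
# Minimal two-view structure-from-motion sketch with OpenCV (illustrative only).
import cv2
import numpy as np

def sparse_two_view_reconstruction(img1, img2, K):
    """Return a sparse 3D point cloud (arbitrary scale) from two grayscale frames."""
    # 1. Feature detection and matching (ORB + brute-force Hamming matcher).
    orb = cv2.ORB_create(2000)
    k1, d1 = orb.detectAndCompute(img1, None)
    k2, d2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(d1, d2)
    pts1 = np.float32([k1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([k2[m.trainIdx].pt for m in matches])

    # 2. Pose estimation with RANSAC outlier rejection.
    E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                      prob=0.999, threshold=1.0)
    _, R, t, pose_mask = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)

    # 3. Triangulate the surviving correspondences.
    good = pose_mask.ravel() > 0
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    X_h = cv2.triangulatePoints(P1, P2, pts1[good].T, pts2[good].T)
    return (X_h[:3] / X_h[3]).T  # N x 3 sparse point cloud

# Example usage with placeholder intrinsics and image files (not from the paper):
# K = np.array([[700.0, 0, 320], [0, 700.0, 240], [0, 0, 1]])
# pts3d = sparse_two_view_reconstruction(cv2.imread("frame1.png", 0),
#                                        cv2.imread("frame2.png", 0), K)
```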
Jesse, Stephen; Kalinin, Sergei V; Nikiforov, Maxim P
2013-07-09
An approach for the thermomechanical characterization of phase transitions in polymeric materials (polyethyleneterephthalate) by band excitation acoustic force microscopy is developed. This methodology allows the independent measurement of resonance frequency, Q factor, and oscillation amplitude of a tip-surface contact area as a function of tip temperature, from which the thermal evolution of tip-surface spring constant and mechanical dissipation can be extracted. A heating protocol maintained a constant tip-surface contact area and constant contact force, thereby allowing for reproducible measurements and quantitative extraction of material properties including temperature dependence of indentation-based elastic and loss moduli.
NASA Astrophysics Data System (ADS)
Pelosi, Claudia; Capobianco, Giuseppe; Agresti, Giorgia; Bonifazi, Giuseppe; Morresi, Fabio; Rossi, Sara; Santamaria, Ulderico; Serranti, Silvia
2018-06-01
The aim of this work is to investigate the stability of some painting samples to simulated solar radiation through a new methodological approach adopting non-invasive spectroscopic techniques. In particular, commercial watercolours and iron oxide based pigments were used, the latter prepared with gum Arabic for the experiments in order to propose a possible substitute for traditional reintegration materials. Reflectance spectrophotometry in the visible range and hyperspectral imaging in the short-wave infrared were chosen as non-invasive techniques for evaluating the stability of the chosen pigments to irradiation. These were studied before and after an artificial ageing procedure performed in a Solar Box chamber under controlled conditions. Data were processed in order to evaluate the sensitivity of the chosen techniques in identifying the variations in paint layers induced by photo-degradation before they could be observed by eye. Furthermore, a supervised classification method adopting a multivariate approach was successfully applied for monitoring the changes of the painted surfaces.
NASA Astrophysics Data System (ADS)
Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.
1991-03-01
To meet the ever-increasing demand of the rapidly-growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring and adaptive process control. To this end we have accomplished an integrated knowledge-based approach combining the latest expert system technology, machine learning methods and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the task of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level and qualitative descriptions of processes and thus make the process behavior easy to monitor, predict and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and incorporated with two commercially available packages, G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).
Human-centred approaches in slipperiness measurement
Grönqvist, Raoul; Abeysekera, John; Gard, Gunvor; Hsiang, Simon M.; Leamon, Tom B.; Newman, Dava J.; Gielo-Perczak, Krystyna; Lockhart, Thurmon E.; Pai, Clive Y.-C.
2010-01-01
A number of human-centred methodologies—subjective, objective, and combined—are used for slipperiness measurement. They comprise a variety of approaches from biomechanically-oriented experiments to psychophysical tests and subjective evaluations. The objective of this paper is to review some of the research done in the field, including such topics as awareness and perception of slipperiness, postural and balance control, rating scales for balance, adaptation to slippery conditions, measurement of unexpected movements, kinematics of slipping, and protective movements during falling. The role of human factors in slips and falls will be discussed. Strengths and weaknesses of human-centred approaches in relation to mechanical slip test methodologies are considered. Current friction-based criteria and thresholds for walking without slipping are reviewed for a number of work tasks. These include activities such as walking on a level or an inclined surface, running, stopping and jumping, as well as stair ascent and descent, manual exertion (pushing and pulling, load carrying, lifting) and particular concerns of the elderly and mobility disabled persons. Some future directions for slipperiness measurement and research in the field of slips and falls are outlined. Human-centred approaches for slipperiness measurement do have many applications. First, they are utilized to develop research hypotheses and models to predict workplace risks caused by slipping. Second, they are important alternatives to apparatus-based friction measurements and are used to validate such methodologies. Third, they are used as practical tools for evaluating and monitoring slip resistance properties of foot wear, anti-skid devices and floor surfaces. PMID:11794763
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chu, M.S.Y.
1990-12-01
The PAGAN code system is a part of the performance assessment methodology developed for use by the U.S. Nuclear Regulatory Commission in evaluating license applications for low-level waste disposal facilities. In this methodology, PAGAN is used as one candidate approach for analysis of the ground-water pathway. PAGAN, Version 1.1, has the capability to model the source term, vadose-zone transport, and aquifer transport of radionuclides from a waste disposal unit. It combines the two codes SURFACE and DISPERSE, which are used as semi-analytical solutions to the convective-dispersion equation. This system uses menu-driven input/output for implementing a simple ground-water transport analysis and incorporates statistical uncertainty functions for handling data uncertainties. The output from PAGAN includes a time- and location-dependent radionuclide concentration at a well in the aquifer, or a time- and location-dependent radionuclide flux into a surface-water body.
Shen, Naikun; Qin, Yan; Wang, Qingyan; Xie, Nengzhong; Mi, Huizhi; Zhu, Qixia; Liao, Siming; Huang, Ribo
2013-10-01
Succinic acid is an important C4 platform chemical in the synthesis of many commodity and specialty chemicals. In the present work, different compounds were evaluated for succinic acid production by Actinobacillus succinogenes GXAS 137. Important parameters were screened by single-factor experiments and a Plackett-Burman design. Subsequently, the region of highest succinic acid production was approached by the path of steepest ascent, and the optimum values of the parameters were then obtained by a Box-Behnken design. The results show that the important parameters were the glucose, yeast extract and MgCO3 concentrations. The optimum condition was as follows (g/L): glucose 70.00, yeast extract 9.20 and MgCO3 58.10. Succinic acid yield reached 47.64 g/L under the optimal conditions, an increase of 29.14% over that before optimization (36.89 g/L). Response surface methodology was proven to be a powerful tool to optimize succinic acid production.
Lee, Seung-Mok; Kim, Young-Gyu; Cho, Il-Hyoung
2005-01-01
Optimal operating conditions for treating dyeing wastewater were investigated by using factorial design and response surface methodology (RSM). The experiment was statistically designed and carried out according to a 2² full factorial design with four factorial points, three center points, and four axial points. Then, linear and nonlinear regressions were applied to the data by using the SAS software package. The independent variables were TiO2 dosage and H2O2 concentration, and the total organic carbon (TOC) removal efficiency of the dyeing wastewater was the dependent variable. From the factorial design and response surface methodology (RSM), a maximum removal efficiency of 85% for the dyeing wastewater was obtained at a TiO2 dosage of 1.82 gL(-1) and an H2O2 concentration of 980 mgL(-1) with a 20-min oxidation reaction.
Erva, Rajeswara Reddy; Goswami, Ajgebi Nath; Suman, Priyanka; Vedanabhatla, Ravali; Rajulapati, Satish Babu
2017-03-16
The culture conditions and nutritional factors influencing the production of the extracellular antileukemic enzyme by the novel Enterobacter aerogenes KCTC2190/MTCC111 were optimized in shake-flask culture. Process variables such as pH, temperature, incubation time, carbon and nitrogen sources, inducer concentration, and inoculum size were taken into account. In the present study, the highest enzyme activity achieved by the traditional one-variable-at-a-time method was 7.6 IU/mL, a 2.6-fold increase compared with the initial value. Further, L-asparaginase production was optimized using response surface methodology, and the validated experimental result at the optimized process variables gave 18.35 IU/mL of L-asparaginase activity, 2.4 times higher than with the traditional optimization approach. The study establishes E. aerogenes MTCC111 as a potent bacterial source for a high yield of this antileukemic drug.
Wavelet maxima curves of surface latent heat flux associated with two recent Greek earthquakes
NASA Astrophysics Data System (ADS)
Cervone, G.; Kafatos, M.; Napoletani, D.; Singh, R. P.
2004-05-01
Multi-sensor data available through remote sensing satellites provide information about changes in the state of the oceans, land and atmosphere. Recent studies have shown anomalous changes in ocean, land, atmospheric and ionospheric parameters prior to earthquake events. This paper introduces an innovative data mining technique to identify precursory signals associated with earthquakes. The proposed methodology is a multi-strategy approach which employs one-dimensional wavelet transformations to identify singularities in the data, and an analysis of the continuity of the wavelet maxima in time and space to identify the singularities associated with earthquakes. The proposed methodology has been employed using Surface Latent Heat Flux (SLHF) data to study the earthquakes which occurred on 14 August 2003 and on 1 March 2004 in Greece. A single prominent SLHF anomaly has been found about two weeks prior to each of the earthquakes.
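A minimal sketch of the wavelet step (continuous wavelet transform of a single SLHF time series and extraction of its modulus maxima) is shown below using PyWavelets; the synthetic series, wavelet choice, and thresholds are illustrative, and the space-time continuity analysis of the paper is not reproduced.

```python
# Sketch: one-dimensional wavelet transform of a daily SLHF series and detection
# of modulus maxima (toy data; not the study's dataset or thresholds).
import numpy as np
import pywt
from scipy.signal import argrelextrema

rng = np.random.default_rng(1)
t = np.arange(365)                       # one year of daily values
slhf = 120 + 20 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 5, t.size)
slhf[200] += 60                          # synthetic "anomaly" spike

scales = np.arange(1, 32)
coef, _ = pywt.cwt(slhf, scales, "mexh")  # Mexican-hat CWT, shape (scales, time)

# Wavelet maxima: strong local maxima of |W(s, t)| along time, per scale.
maxima = {}
for i, s in enumerate(scales):
    idx = argrelextrema(np.abs(coef[i]), np.greater, order=5)[0]
    maxima[s] = idx[np.abs(coef[i, idx]) > 3 * np.abs(coef[i]).std()]

# Days whose maxima persist across many scales are candidate singularities.
counts = np.zeros(t.size)
for idx in maxima.values():
    counts[idx] += 1
print("candidate anomaly days:", np.where(counts > len(scales) // 2)[0])
```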
Norris, Scott A; Brenner, Michael P; Aziz, Michael J
2009-06-03
We develop a methodology for deriving continuum partial differential equations for the evolution of large-scale surface morphology directly from molecular dynamics simulations of the craters formed from individual ion impacts. Our formalism relies on the separation between the length scale of ion impact and the characteristic scale of pattern formation, and expresses the surface evolution in terms of the moments of the crater function. We demonstrate that the formalism reproduces the classical Bradley-Harper results, as well as ballistic atomic drift, under the appropriate simplifying assumptions. Given an actual set of converged molecular dynamics moments and their derivatives with respect to the incidence angle, our approach can be applied directly to predict the presence and absence of surface morphological instabilities. This analysis represents the first work systematically connecting molecular dynamics simulations of ion bombardment to partial differential equations that govern topographic pattern-forming instabilities.
Determining radiated sound power of building structures by means of laser Doppler vibrometry
NASA Astrophysics Data System (ADS)
Roozen, N. B.; Labelle, L.; Rychtáriková, M.; Glorieux, C.
2015-06-01
This paper introduces a methodology that makes use of laser Doppler vibrometry to assess the acoustic insulation performance of a building element. The sound power radiated by the surface of the element is numerically determined from the vibrational pattern, offering an alternative for classical microphone measurements. Compared to the latter the proposed analysis is not sensitive to room acoustical effects. This allows the proposed methodology to be used at low frequencies, where the standardized microphone based approach suffers from a high uncertainty due to a low acoustic modal density. Standardized measurements as well as laser Doppler vibrometry measurements and computations have been performed on two test panels, a light-weight wall and a gypsum block wall and are compared and discussed in this paper. The proposed methodology offers an adequate solution for the assessment of the acoustic insulation of building elements at low frequencies. This is crucial in the framework of recent proposals of acoustic standards for measurement approaches and single number sound insulation performance ratings to take into account frequencies down to 50 Hz.
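One common way to obtain radiated power from a measured velocity field of an approximately plane, baffled radiator is a discretized Rayleigh integral in radiation-resistance form, as sketched below; the panel geometry, frequency, and velocity distribution are placeholders, and this is not necessarily the numerical scheme used by the authors.

```python
# Sketch: radiated sound power of a baffled, plane vibrating panel from its normal
# velocity field, via a discretized Rayleigh integral (radiation-resistance form).
# Panel size, frequency and velocity data are placeholders.
import numpy as np

rho0, c = 1.21, 343.0          # air density [kg/m^3] and speed of sound [m/s]
f = 100.0                      # frequency [Hz]
omega, k = 2 * np.pi * f, 2 * np.pi * f / c

# Element grid over a 1.25 m x 1.0 m panel (e.g., scanned by the vibrometer).
nx, ny = 25, 20
dS = (1.25 / nx) * (1.0 / ny)
x = (np.arange(nx) + 0.5) * 1.25 / nx
y = (np.arange(ny) + 0.5) * 1.0 / ny
X, Y = np.meshgrid(x, y, indexing="ij")

# Placeholder complex normal-velocity amplitudes (peak): first bending-type mode shape.
v = (1e-3 * np.sin(np.pi * X / 1.25) * np.sin(np.pi * Y / 1.0)).ravel().astype(complex)

# Pairwise distances between element centres and the sin(kR)/(kR) kernel.
pts = np.column_stack([X.ravel(), Y.ravel()])
R = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
kernel = np.where(R > 0, np.sin(k * R) / np.maximum(k * R, 1e-12), 1.0)

# Time-averaged radiated power: W = (omega^2 rho0 dS^2 / (4 pi c)) * Re(v^H K v).
W = (omega**2 * rho0 * dS**2 / (4 * np.pi * c)) * np.real(np.conj(v) @ kernel @ v)
print(f"radiated sound power: {W:.3e} W")
print(f"sound power level: {10 * np.log10(W / 1e-12):.1f} dB re 1 pW")
```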
Chauhan, Awadesh K; Survase, Shrikant A; Kishenkumar, Jyoti; Annapure, Uday S
2009-06-01
This paper deals with the optimization of culture conditions for the production of cholesterol oxidase (COD) by Streptomyces lavendulae NCIM 2499 using the one-factor-at-a-time method, orthogonal array method and response surface methodology (RSM) approaches. The one-factor-at-a-time method was adopted to investigate the effects of medium components (i.e. carbon and nitrogen) and environmental factors (i.e. initial pH) on biomass growth and COD production. Subsequently, an L12 orthogonal matrix was used to evaluate the significance of glycerol, soyabean meal, malt extract, K2HPO4, MgSO4 and NaCl. The effects of media components were ranked according to their effects on the production of COD as malt extract > soyabean meal > K2HPO4 > NaCl > MgSO4 > glycerol. The subsequent optimization of the four most significant factors viz. malt extract, soyabean meal, K2HPO4 and NaCl, was carried out by employing a central composite rotatable design (CCRD) of RSM. There was a 2.48-fold increase in productivity of COD as compared to the unoptimized media by using these statistical approaches.
NASA Astrophysics Data System (ADS)
McJannet, D. L.; Cook, F. J.; McGloin, R. P.; McGowan, H. A.; Burn, S.
2011-05-01
The use of scintillometers to determine sensible and latent heat flux is becoming increasingly common because of their ability to quantify convective fluxes over distances of hundreds of meters to several kilometers. The majority of investigations using scintillometry have focused on processes above land surfaces, but here we propose a new methodology for obtaining sensible and latent heat fluxes from a scintillometer deployed over open water. This methodology has been tested by comparison with eddy covariance measurements and through comparison with alternative scintillometer calculation approaches that are commonly used in the literature. The methodology is based on linearization of the Bowen ratio, which is a common assumption in models such as Penman's model and its derivatives. Comparison of latent heat flux estimates from the eddy covariance system and the scintillometer showed excellent agreement across a range of weather conditions and flux rates, giving a high level of confidence in scintillometry-derived latent heat fluxes. The proposed approach produced better estimates than other scintillometry calculation methods because of the reliance of alternative methods on measurements of water temperature or water body heat storage, which are both notoriously hard to quantify. The proposed methodology requires less instrumentation than alternative scintillometer calculation approaches, and the spatial scales of required measurements are arguably more compatible. In addition to scintillometer measurements of the structure parameter of the refractive index of air, the only measurements required are atmospheric pressure, air temperature, humidity, and wind speed at one height over the water body.
NASA Astrophysics Data System (ADS)
Garrett, S. J.; Cooper, A. J.; Harris, J. H.; Özkan, M.; Segalini, A.; Thomas, P. J.
2016-01-01
We summarise results of a theoretical study investigating the distinct convective instability properties of steady boundary-layer flow over rough rotating disks. A generic roughness pattern of concentric circles with sinusoidal surface undulations in the radial direction is considered. The goal is to compare predictions obtained by means of two alternative, and fundamentally different, modelling approaches for surface roughness for the first time. The motivating rationale is to identify commonalities and isolate results that might potentially represent artefacts associated with the particular methodologies underlying one of the two modelling approaches. The most significant result of practical relevance obtained is that both approaches predict overall stabilising effects on type I instability mode of rotating disk flow. This mode leads to transition of the rotating-disk boundary layer and, more generally, the transition of boundary-layers with a cross-flow profile. Stabilisation of the type 1 mode means that it may be possible to exploit surface roughness for laminar-flow control in boundary layers with a cross-flow component. However, we also find differences between the two sets of model predictions, some subtle and some substantial. These will represent criteria for establishing which of the two alternative approaches is more suitable to correctly describe experimental data when these become available.
NASA Astrophysics Data System (ADS)
Doss, Derek J.; Heiselman, Jon S.; Collins, Jarrod A.; Weis, Jared A.; Clements, Logan W.; Geevarghese, Sunil K.; Miga, Michael I.
2017-03-01
Sparse surface digitization with an optically tracked stylus for use in an organ surface-based image-to-physical registration is an established approach for image-guided open liver surgery procedures. However, variability in sparse data collections during open hepatic procedures can produce disparity in registration alignments. In part, this variability arises from inconsistencies with the patterns and fidelity of collected intraoperative data. The liver lacks distinct landmarks and experiences considerable soft tissue deformation. Furthermore, data coverage of the organ is often incomplete or unevenly distributed. While more robust feature-based registration methodologies have been developed for image-guided liver surgery, it is still unclear how variation in sparse intraoperative data affects registration. In this work, we have developed an application to allow surgeons to study the performance of surface digitization patterns on registration. Given the intrinsic nature of soft-tissue, we incorporate realistic organ deformation when assessing fidelity of a rigid registration methodology. We report the construction of our application and preliminary registration results using four participants. Our preliminary results indicate that registration quality improves as users acquire more experience selecting patterns of sparse intraoperative surface data.
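For context, once point correspondences are available, the rigid image-to-physical alignment underlying such registrations is commonly computed with the SVD-based (Kabsch) least-squares solution sketched below on synthetic data; the paper's surface-based registration and its handling of deformation are not reproduced here.

```python
# Sketch: least-squares rigid registration (rotation R, translation t) between
# corresponding 3D point sets, via the SVD-based Kabsch solution (synthetic data).
import numpy as np

def rigid_register(src, dst):
    """Return R (3x3), t (3,) minimizing ||(src @ R.T + t) - dst|| over corresponding points."""
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflection
    R = Vt.T @ D @ U.T
    t = dst.mean(0) - R @ src.mean(0)
    return R, t

# Synthetic test: random "image-space" points mapped by a known pose plus noise.
rng = np.random.default_rng(2)
src = rng.uniform(-50, 50, size=(200, 3))                 # e.g., mm
angle = np.deg2rad(20)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0],
                   [np.sin(angle),  np.cos(angle), 0],
                   [0, 0, 1]])
dst = src @ R_true.T + np.array([5.0, -3.0, 12.0]) + rng.normal(0, 0.2, src.shape)

R, t = rigid_register(src, dst)
fre = np.linalg.norm(src @ R.T + t - dst, axis=1)
print("mean registration error (mm): %.3f" % fre.mean())
```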
Li, Jingrui; Kondov, Ivan; Wang, Haobin; Thoss, Michael
2015-04-10
A recently developed methodology to simulate photoinduced electron transfer processes at dye-semiconductor interfaces is outlined. The methodology employs a first-principles-based model Hamiltonian and accurate quantum dynamics simulations using the multilayer multiconfiguration time-dependent Hartree approach. This method is applied to study electron injection in the dye-semiconductor system coumarin 343-TiO2. Specifically, the influence of electronic-vibrational coupling is analyzed. Extending previous work, we consider the influence of Duschinsky rotation of the normal modes as well as anharmonicities of the potential energy surfaces on the electron transfer dynamics.
Selective electron spin resonance measurements of micrometer-scale thin samples on a substrate
NASA Astrophysics Data System (ADS)
Dikarov, Ekaterina; Fehr, Matthias; Schnegg, Alexander; Lips, Klaus; Blank, Aharon
2013-11-01
An approach to the selective observation of paramagnetic centers in thin samples or surfaces with electron spin resonance (ESR) is presented. The methodology is based on the use of a surface microresonator that enables the selective acquisition of ESR data from thin layers with minimal background signals from the supporting substrate. An experimental example is provided in which the ESR signal is measured from a 1.2 µm polycrystalline silicon layer on a glass substrate used in modern solar-cell technology. The ESR results obtained with the surface microresonator show the effective elimination of background signals, especially at low cryogenic temperatures, compared to the use of a conventional resonator. The surface microresonator also facilitates much higher absolute spin sensitivity, requiring much smaller surfaces for the measurement.
Agathokleous, Evgenios
2017-08-01
Ethylenediurea (EDU) has been widely studied for its effectiveness to protect plants against injuries caused by surface ozone (O3), however its mode of action remains unclear. So far, there is not a unified methodological approach and thus the methodology is quite arbitrary, thereby making it more difficult to generalize findings and understand the EDU mode of action. This review examines the question of whether potential N addition to plants by EDU is a fundamental underlying mechanism in protecting against O3 phytotoxicity. Yet, this review proposes an evidence-based hypothesis that EDU may protect plants against O3 deleterious effects upon generation of EDU-induced hormesis, i.e. by activating plant defense at low doses. This hypothesis challenges the future research directions. Revealing a hormesis-based EDU mode of action in protecting plants against O3 toxicity would have further implications to ecotoxicology and environmental safety. Furthermore, this review discusses the need for further studies on plant metabolism under EDU treatment through relevant experimental approach, and attempts to set the bases for approaching a unified methodology that will contribute in revealing the EDU mode of action. In this framework, focus is given to the main EDU application methods. Copyright © 2017 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Raimonet, M.; Oudin, L.; Rabouille, C.; Garnier, J.; Silvestre, M.; Vautard, R.; Thieu, V.
2016-12-01
Water quality management of fresh and marine aquatic systems requires modelling tools along the land-ocean continuum in order to evaluate the effect of climate change on nutrient transfer and on potential ecosystem dysfunctioning (e.g. eutrophication, anoxia). In addition to direct effects of climate change on water temperature, it is essential to consider indirect effects of precipitation and temperature changes on hydrology since nutrient transfers are particularly sensitive to the partition of streamflow between surface flow and baseflow. Yet, the determination of surface flow and baseflow, their spatial repartition on drainage basins, and their relative potential evolution under climate change remains challenging. In this study, we developed a generic approach to determine 10-day surface flow and baseflow using a regionalized hydrological model applied at a high spatial resolution (unitary catchments of area circa 10 km²). Streamflow data at gauged basins were used to calibrate hydrological model parameters that were then applied to neighboring ungauged basins to estimate streamflow at the scale of the French territory. The proposed methodology made it possible to represent spatialized surface flow and baseflow that are consistent with climatic and geomorphological settings. The methodology was then used to determine the effect of climate change on the spatial repartition of surface flow and baseflow in the Seine drainage basin. Results showed large discrepancies in both the amount and the spatial repartition of changes in surface flow and baseflow according to the several GCMs and RCMs used to derive projected climatic forcing. Consequently, it is expected that the impact of climate change on nutrient transfer might also be quite heterogeneous for the Seine River. This methodology could be applied in any drainage basin where at least several gauged hydrometric stations are available. The estimated surface flow and baseflow can then be used in hydro-ecological models in order to evaluate direct and indirect impacts of climate change on nutrient transfers and potential ecosystem dysfunctioning along the land-ocean continuum.
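The study partitions streamflow with a regionalized hydrological model; as a simple point of comparison only, a widely used one-parameter recursive digital filter (Lyne-Hollick type) for separating a streamflow series into surface flow and baseflow is sketched below, with an illustrative filter parameter and a synthetic hydrograph.

```python
# Sketch: one-parameter Lyne-Hollick recursive digital filter separating streamflow
# into quickflow (surface flow) and baseflow. A common baseline technique, not the
# regionalized-model partition used in the study.
import numpy as np

def lyne_hollick(q_obs, alpha=0.925, passes=3):
    """Return (baseflow, quickflow) for a streamflow series (any consistent units)."""
    q_obs = np.asarray(q_obs, dtype=float)
    b = q_obs.copy()
    for p in range(passes):
        series = b if p % 2 == 0 else b[::-1]   # alternate forward/backward passes
        qf = np.zeros_like(series)
        for t in range(1, len(series)):
            qf[t] = alpha * qf[t - 1] + 0.5 * (1 + alpha) * (series[t] - series[t - 1])
            qf[t] = min(max(qf[t], 0.0), series[t])   # constrain quickflow to [0, flow]
        filtered = series - qf
        b = filtered if p % 2 == 0 else filtered[::-1]
    return b, q_obs - b

# Toy usage: a synthetic hydrograph with two storm peaks over a slow recession.
t = np.arange(200)
q = (5 * np.exp(-t / 120) + 8 * np.exp(-((t - 40) / 6.0) ** 2)
     + 6 * np.exp(-((t - 120) / 5.0) ** 2) + 1.0)
baseflow, surface_flow = lyne_hollick(q)
print("baseflow index:", baseflow.sum() / q.sum())
```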
Surface Warfare Officers Initial Training For Future Success
2018-03-01
updating and creating learning modules and Surface Warfare Officer School (SWOS) staffing as well as weaknesses in the methodologies used for...and Surface Warfare Officer School (SWOS) staffing as well as weaknesses in the methodologies used for training. We conclude that the Basic Division... METHODOLOGY ....................................................................................9 1. Staff Interviews
NASA Astrophysics Data System (ADS)
Pathak, Ashish; Raessi, Mehdi
2016-04-01
We present a three-dimensional (3D) and fully Eulerian approach to capturing the interaction between two fluids and moving rigid structures by using the fictitious domain and volume-of-fluid (VOF) methods. The solid bodies can have arbitrarily complex geometry and can pierce the fluid-fluid interface, forming contact lines. The three-phase interfaces are resolved and reconstructed by using a VOF-based methodology. Then, a consistent scheme is employed for transporting mass and momentum, allowing for simulations of three-phase flows of large density ratios. The Eulerian approach significantly simplifies numerical resolution of the kinematics of rigid bodies of complex geometry and with six degrees of freedom. The fluid-structure interaction (FSI) is computed using the fictitious domain method. The methodology was developed in a message passing interface (MPI) parallel framework accelerated with graphics processing units (GPUs). The computationally intensive solution of the pressure Poisson equation is ported to GPUs, while the remaining calculations are performed on CPUs. The performance and accuracy of the methodology are assessed using an array of test cases, focusing individually on the flow solver and the FSI in surface-piercing configurations. Finally, an application of the proposed methodology in simulations of the ocean wave energy converters is presented.
First-principles study of metallic iron interfaces
NASA Astrophysics Data System (ADS)
Hung, A.; Yarovsky, I.; Muscat, J.; Russo, S.; Snook, I.; Watts, R. O.
2002-04-01
Adhesion between clean, bulk-terminated bcc Fe(1 0 0) and Fe(1 1 0) matched and mismatched surfaces was simulated within the theoretical framework of the density functional theory. The generalized-gradient spin approximation exchange-correlation functional was used in conjunction with a plane wave-ultrasoft pseudopotential representation. The structure and properties of bulk bcc Fe were calculated in order to establish the reliability of the methodology employed, as well as to determine suitably converged values of computational parameters to be used in subsequent surface calculations. Interfaces were modelled using a single supercell approach, with the interfacial separation distance manipulated by the size of vacuum separation between vertically adjacent surface cells. The adhesive energies at discrete interfacial separations were calculated for each interface and the resulting data fitted to the universal binding energy relation (UBER) of Rose et al. [Phys. Rev. Lett. 47 (1981) 675]. An interpretation of the values of the fitted UBER parameters for the four Fe interfaces studied is given. In addition, a discussion on the validity of the employed computational methodology is presented.
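For illustration, fitting computed adhesion energies to the UBER takes the form sketched below; the separations, energies, and fitted parameter values are synthetic and are not the results of the paper.

```python
# Sketch: fitting adhesion energy vs. interfacial separation to the universal
# binding energy relation (UBER), E(d) = -E0 * (1 + d*) * exp(-d*), d* = (d - d0)/l.
# Data points and fitted values here are synthetic, not the paper's results.
import numpy as np
from scipy.optimize import curve_fit

def uber(d, E0, d0, l):
    ds = (d - d0) / l
    return -E0 * (1.0 + ds) * np.exp(-ds)

# Synthetic "calculated" adhesion energies (J/m^2) at discrete separations (angstrom).
d = np.array([1.6, 1.8, 2.0, 2.2, 2.6, 3.0, 3.5, 4.0, 5.0])
E = uber(d, 3.5, 2.0, 0.55) + np.random.default_rng(3).normal(0, 0.02, d.size)

popt, _ = curve_fit(uber, d, E, p0=[3.0, 2.0, 0.5])
E0, d0, l = popt
print(f"fitted UBER parameters: E0 = {E0:.2f} J/m^2, d0 = {d0:.2f} A, l = {l:.2f} A")
```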
Muhammad Auwal, Shehu; Zarei, Mohammad; Abdul-Hamid, Azizah; Saari, Nazamid
2017-03-31
The stone fish is an under-utilized sea cucumber with many nutritional and ethno-medicinal values. This study aimed to establish the conditions for its optimum hydrolysis with bromelain to generate angiotensin I-converting enzyme (ACE)-inhibitory hydrolysates. Response surface methodology (RSM) based on a central composite design was used to model and optimize the degree of hydrolysis (DH) and ACE-inhibitory activity. Process conditions including pH (4-7), temperature (40-70 °C), enzyme/substrate (E/S) ratio (0.5%-2%) and time (30-360 min) were used. A pH of 7.0, temperature of 40 °C, E/S ratio of 2% and time of 240 min were determined using a response surface model as the optimum levels to obtain the maximum ACE-inhibitory activity of 84.26% at 44.59% degree of hydrolysis. Hence, RSM can serve as an effective approach in the design of experiments to improve the antihypertensive effect of stone fish hydrolysates, which can thus be used as a value-added ingredient for various applications in the functional foods industries.
Jain, Monika; Garg, V K; Kadirvelu, K
2011-01-01
In the present study, chemically treated Helianthus annuus flowers (SHC) were used to optimize the removal efficiency for Cr(VI) by applying Response Surface Methodological approach. The surface structure of SHC was analyzed by Scanning Electron Microscopy (SEM) coupled with Energy Dispersive X-ray Analysis (EDX). Batch mode experiments were also carried out to assess the adsorption equilibrium in aqueous solution. The adsorption capacity (qe) was found to be 7.2 mg/g. The effect of three parameters, that is pH of the solution (2.0-7.0), initial concentration (10-70 mg/L) and adsorbent dose (0.05-0.5 g/100 mL) was studied for the removal of Cr(VI) by SHC. Box-Behnken model was used as an experimental design. The optimum pH, adsorbent dose and initial Cr(VI) concentration were found to be 2.0, 5.0 g/L and 40 mg/L, respectively. Under these conditions, removal efficiency of Cr(VI) was found to be 90.8%. Copyright © 2010 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Chen, Wei; Tsui, Kwok-Leung; Allen, Janet K.; Mistree, Farrokh
1994-01-01
In this paper we introduce a comprehensive and rigorous robust design procedure to overcome some limitations of the current approaches. A comprehensive approach is general enough to model the two major types of robust design applications, namely, robust design associated with the minimization of the deviation of performance caused by the deviation of noise factors (uncontrollable parameters), and robust design associated with the minimization of the deviation of performance caused by the deviation of control factors (design variables). We achieve mathematical rigor by using, as a foundation, principles from the design of experiments and optimization. Specifically, we integrate the Response Surface Method (RSM) with the compromise Decision Support Problem (DSP). Our approach is especially useful for design problems where there are no closed-form solutions and system performance is computationally expensive to evaluate. The design of a solar powered irrigation system is used as an example. Our focus in this paper is on illustrating our approach rather than on the results per se.
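A minimal sketch of this type of robust formulation (fit a response surface, propagate noise-factor scatter through it to first order, and trade off mean performance against variability) is given below; the model, weights, and bounds are hypothetical and this is not the paper's compromise DSP implementation.

```python
# Sketch: robust optimization on a fitted response surface. The mean and variance
# of the response under noise-factor scatter are estimated by first-order
# propagation and combined into a single objective (illustrative weights only).
import numpy as np
from scipy.optimize import minimize

# Suppose a quadratic response surface y(x, z) has been fitted from a DOE,
# with control factor x (design variable) and noise factor z ~ N(0, sigma_z^2).
def y(x, z):
    return 10.0 - 2.0 * x + 0.8 * x**2 + 1.5 * z - 0.6 * x * z

sigma_z = 0.5

def robust_objective(xv, w=0.5):
    x = xv[0]
    mean_y = y(x, 0.0)                       # noise factor at its nominal value
    dydz = 1.5 - 0.6 * x                     # first-order sensitivity to z
    std_y = abs(dydz) * sigma_z              # propagated standard deviation
    return (1 - w) * mean_y + w * std_y      # compromise between mean and spread

res = minimize(robust_objective, x0=[0.0], bounds=[(-2.0, 4.0)])
print("robust design variable:", res.x[0], "objective value:", res.fun)
```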
NASA Astrophysics Data System (ADS)
Robin, C.; Gérard, M.; Quinaud, M.; d'Arbigny, J.; Bultel, Y.
2016-09-01
The prediction of Proton Exchange Membrane Fuel Cell (PEMFC) lifetime is one of the major challenges in optimizing both material properties and dynamic control of the fuel cell system. In this study, by a multiscale modeling approach, a mechanistic catalyst dissolution model is coupled to a dynamic PEMFC cell model to predict the performance loss of the PEMFC. Results are compared to two 2000-h experimental aging tests. More precisely, an original approach is introduced to estimate the loss of an equivalent active surface area during an aging test. Indeed, when the computed Electrochemical Catalyst Surface Area profile is fitted to the experimental measurements from cyclic voltammetry, the computed performance loss of the PEMFC is underestimated. To be able to predict the performance loss measured by polarization curves during the aging test, an equivalent active surface area is obtained by a model inversion. This methodology makes it possible to reproduce the experimental cell voltage decay over time. The model parameters are fitted from the polarization curves so that they include the global degradation. Moreover, the model captures the aging heterogeneities along the surface of the cell observed experimentally. Finally, a second 2000-h durability test in dynamic operating conditions validates the approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lara-Castells, María Pilar de, E-mail: Pilar.deLara.Castells@csic.es; Mitrushchenkov, Alexander O.; Stoll, Hermann
2015-09-14
A combined density functional (DFT) and incremental post-Hartree-Fock (post-HF) approach, proven earlier to calculate He-surface potential energy surfaces [de Lara-Castells et al., J. Chem. Phys. 141, 151102 (2014)], is applied to describe the van der Waals dominated Ag2/graphene interaction. It extends the dispersionless density functional theory developed by Pernal et al. [Phys. Rev. Lett. 103, 263201 (2009)] by including periodic boundary conditions while the dispersion is parametrized via the method of increments [H. Stoll, J. Chem. Phys. 97, 8449 (1992)]. Starting with the elementary cluster unit of the target surface (benzene), continuing through the realistic cluster model (coronene), and ending with the periodic model of the extended system, modern ab initio methodologies for intermolecular interactions as well as state-of-the-art van der Waals-corrected density functional-based approaches are put together both to assess the accuracy of the composite scheme and to better characterize the Ag2/graphene interaction. The present work illustrates how the combination of DFT and post-HF perspectives may be efficient to design simple and reliable ab initio-based schemes in extended systems for surface science applications.
NASA Astrophysics Data System (ADS)
Castaldini, D.; Genevois, R.; Panizza, M.; Puccinelli, A.; Berti, M.; Simoni, A.
This paper illustrates research addressing the subject of earthquake-induced surface effects by means of a multidisciplinary approach: tectonics, neotectonics, seismology, geology, hydrogeology, geomorphology and soil/rock mechanics have been considered. The research aims to verify, in areas affected by earthquake-triggered landslides, a methodology for the identification of potentially unstable areas. The research was organized into regional- and local-scale studies. In order to better emphasise the complexity of the relationships between all the parameters affecting the stability conditions of rock slopes in static and dynamic conditions, a new integrated approach, Rock Engineering Systems (RES), was applied in the Northern Apennines. In the paper, the different phases of the research are described in detail and an example of the application of the RES method in a sample area is reported. A significant aspect of the study can be seen in its attempt to overcome the exclusively qualitative aspects of research into the relationship between earthquakes and induced surface effects, and to advance the idea of beginning a process by which this interaction can be quantified.
Nicholls, David P
2018-04-01
The faithful modelling of the propagation of linear waves in a layered, periodic structure is of paramount importance in many branches of the applied sciences. In this paper, we present a novel numerical algorithm for the simulation of such problems which is free of the artificial singularities present in related approaches. We advocate for a surface integral formulation which is phrased in terms of impedance-impedance operators that are immune to the Dirichlet eigenvalues which plague the Dirichlet-Neumann operators that appear in classical formulations. We demonstrate a high-order spectral algorithm to simulate these latter operators based upon a high-order perturbation of surfaces methodology which is rapid, robust and highly accurate. We demonstrate the validity and utility of our approach with a sequence of numerical simulations.
NASA Astrophysics Data System (ADS)
Lazzari, M.; Loperte, A.; Perrone, A.
2010-03-01
This work, carried out with an integrated methodological approach, focuses on the use of near-surface geophysical techniques, such as ground penetrating radar and electrical resistivity tomography (ERT), together with geomorphological analysis, in order to reconstruct the cave distribution and geometry in an urban context and, in particular, in historical centres. The interaction during recent centuries between human activity (cave excavation, the birth and growth of an urban area) and the characteristics of the natural environment has been the reason for a progressive increase in the hazard and vulnerability levels of several sites. The reconstruction of a detailed map of cave distribution is the first step in defining the anthropic and geomorphological hazard in urban areas, and a fundamental basis for planning and risk assessment.
Computerized Design of Low-noise Face-milled Spiral Bevel Gears
NASA Technical Reports Server (NTRS)
Litvin, Faydor L.; Zhang, YI; Handschuh, Robert F.
1994-01-01
An advanced design methodology is proposed for the face-milled spiral bevel gears with modified tooth surface geometry that provides a reduced level of noise and has a stabilized bearing contact. The approach is based on the local synthesis of the gear drive that provides the 'best' machine-tool settings. The theoretical aspects of the local synthesis approach are based on the application of a predesigned parabolic function for absorption of undesirable transmission errors caused by misalignment and the direct relations between principal curvatures and directions for mating surfaces. The meshing and contact of the gear drive is synthesized and analyzed by a computer program. The generation of gears with the proposed geometry design can be accomplished by application of existing equipment. A numerical example that illustrates the proposed theory is presented.
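For background, the predesigned parabolic function of transmission errors referred to above is conventionally written, in a form reproduced here as an illustration rather than taken from this report, as

```latex
\Delta\phi_2(\phi_1) \;=\; -\,b\,\phi_1^{2},
```

where φ1 is the pinion rotation angle measured from the mean contact position and b > 0 is the chosen parabola coefficient; a parabolic function of this kind absorbs the nearly linear transmission errors caused by misalignment, keeping the resulting transmission error function continuous and of reduced magnitude.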
ERIC Educational Resources Information Center
Lee, Raymond T.; Brotheridge, Celeste M.
2011-01-01
Purpose: The purpose of this paper is to understand, from the child care worker's perspective, how work experience, display rules, and affectivity are related to emotional labor. It also examines the utility of separating surface acting into its two components: the hiding and faking of emotions. Design/methodology/approach: This study is based on…
2015-04-01
Computational Engineering unstructured RANS/LES/DES solver , Tenasi, was used to predict drag and simulate the free surface flow around the ACV over a...using a second-order accurate Roe approximate Riemann scheme, while viscous fluxes are evaluated using a second-order directional derivative approach...Predictions of rigid body ship motions for the SI75 container ship in incident waves and methodology for a one-way coupling of the Tenasi flow solver
Novakovic, Dunja; Saarinen, Jukka; Rojalin, Tatu; Antikainen, Osmo; Fraser-Miller, Sara J; Laaksonen, Timo; Peltonen, Leena; Isomäki, Antti; Strachan, Clare J
2017-11-07
Two nonlinear imaging modalities, coherent anti-Stokes Raman scattering (CARS) and sum-frequency generation (SFG), were successfully combined for sensitive multimodal imaging of multiple solid-state forms and their changes on drug tablet surfaces. Two imaging approaches were used and compared: (i) hyperspectral CARS combined with principal component analysis (PCA) and SFG imaging and (ii) simultaneous narrowband CARS and SFG imaging. Three different solid-state forms of indomethacin (the crystalline gamma and alpha forms, as well as the amorphous form) were clearly distinguished using both approaches. Simultaneous narrowband CARS and SFG imaging was faster, but hyperspectral CARS and SFG imaging has the potential to be applied to a wider variety of more complex samples. These methodologies were further used to follow crystallization of indomethacin on tablet surfaces under two storage conditions: 30 °C/23% RH and 30 °C/75% RH. Imaging with (sub)micron resolution showed that the approach allowed detection of very early stage surface crystallization. The surfaces progressively crystallized to predominantly (but not exclusively) the gamma form at lower humidity and the alpha form at higher humidity. Overall, this study suggests that multimodal nonlinear imaging is a highly sensitive, solid-state (and chemically) specific, rapid, and versatile imaging technique for understanding and hence controlling (surface) solid-state forms and their complex changes in pharmaceuticals.
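As an illustration of the hyperspectral-CARS-plus-PCA step only, the sketch below applies principal component analysis to a synthetic (x, y, wavenumber) image cube and extracts the leading score map; real hyperspectral CARS data and the authors' preprocessing are not reproduced.

```python
# Sketch: principal component analysis of a hyperspectral image cube
# (nx, ny, n_wavenumbers) and visualization of the leading score map.
# The cube below is synthetic; real hyperspectral CARS data would replace it.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(4)
nx, ny, nw = 64, 64, 200
wn = np.linspace(1500, 1800, nw)            # wavenumber axis (cm^-1), illustrative

def band(center, width):
    return np.exp(-((wn - center) / width) ** 2)

# Two "solid-state forms" with different band positions, mixed across the image.
frac = (np.arange(nx)[:, None] / nx) * np.ones((nx, ny))   # composition gradient
cube = (frac[..., None] * band(1600, 12) +
        (1 - frac[..., None]) * band(1680, 12) +
        0.05 * rng.normal(size=(nx, ny, nw)))

spectra = cube.reshape(-1, nw)
scores = PCA(n_components=3).fit_transform(spectra)
score_map = scores[:, 0].reshape(nx, ny)    # PC1 map separates the two forms
print("PC1 score range:", score_map.min(), score_map.max())
```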
NASA Astrophysics Data System (ADS)
Klaas, D. K. S. Y.; Imteaz, M. A.; Sudiayem, I.; Klaas, E. M. E.; Klaas, E. C. M.
2017-10-01
In groundwater modelling, robust parameterisation of sub-surface parameters is crucial for obtaining acceptable model performance. Pilot points are an alternative in the parameterisation step to correctly configure the distribution of parameters in a model. However, the methodologies given in current studies are considered less practical for application to real catchment conditions. In this study, a practical approach using the geometric features of pilot points and the distribution of hydraulic gradient over the catchment area is proposed to efficiently configure the pilot point distribution in the calibration step of a groundwater model. A new pilot point distribution technique, the Head Zonation-based (HZB) technique, which is based on the hydraulic gradient distribution of groundwater flow, is presented. Seven models with seven zone ratios (1, 5, 10, 15, 20, 25 and 30) using the HZB technique were constructed for an eogenetic karst catchment on Rote Island, Indonesia, and their performances were assessed. This study also offers insights into the trade-off between restricting and maximising the number of pilot points and proposes a new methodology for selecting pilot point properties and distribution methods in the development of a physically based groundwater model.
de Faria, Janaína T; Rocha, Pollyana F; Converti, Attilio; Passos, Flávia M L; Minim, Luis A; Sampaio, Fábio C
2013-12-01
The aim of our study was to select the optimal operating conditions to permeabilize Kluyveromyces lactis cells using ethanol as a solvent, as an alternative to cell disruption and extraction. Cell permeabilization was carried out by a non-mechanical method consisting of chemical treatment with ethanol, and the results were expressed as β-galactosidase activity. Experiments were conducted under different conditions of ethanol concentration, treatment time and temperature according to a central composite rotatable design (CCRD), and the collected results were then analysed by response surface methodology (RSM). Cell permeabilization was improved by an increase in ethanol concentration and simultaneous decreases in the incubation temperature and treatment time. Such an approach allowed us to identify an optimal range of the independent variables within which the β-galactosidase activity was optimized. A maximum permeabilization of 2,816 mmol L(-1) oNP min(-1) g(-1) was obtained by treating cells with 75.0% v/v of ethanol at 20.0 °C for 15.0 min. The proposed methodology proved to be effective and well suited for K. lactis cell permeabilization at lab scale and promises to be of interest for future applications, mainly in the food industry.
Krauter, Paula; Edwards, Donna; Yang, Lynn; Tucker, Mark
2011-09-01
Decontamination and recovery of a facility or outdoor area after a wide-area biological incident involving a highly persistent agent (eg, Bacillus anthracis spores) is a complex process that requires extensive information and significant resources, which are likely to be limited, particularly if multiple facilities or areas are affected. This article proposes a systematic methodology for evaluating information to select the decontamination or alternative treatments that optimize use of resources if decontamination is required for the facility or area. The methodology covers a wide range of approaches, including volumetric and surface decontamination, monitored natural attenuation, and seal and abandon strategies. A proposed trade-off analysis can help decision makers understand the relative appropriateness, efficacy, and labor, skill, and cost requirements of the various decontamination methods for the particular facility or area needing treatment--whether alone or as part of a larger decontamination effort. Because the state of decontamination knowledge and technology continues to evolve rapidly, the methodology presented here is designed to accommodate new strategies and materials and changing information.
Dujardin, J; Batelaan, O; Canters, F; Boel, S; Anibas, C; Bronders, J
2011-01-15
The estimation of surface-subsurface water interactions is complex and highly variable in space and time. It is even more complex in urban areas because of the complex land-cover patterns there. In this research, a modeling approach with integrated remote sensing analysis has been developed for estimating water fluxes in urban environments. The methodology was developed with the aim of simulating fluxes of contaminants from polluted sites. Groundwater pollution in urban environments is linked to patterns of land use, and hence it is essential to characterize the land cover in detail. An object-oriented classification approach applied to high-resolution satellite data has been adopted. To assign the image objects to one of the land-cover classes, a multilayer perceptron approach was adopted (Kappa of 0.86). Groundwater recharge has been simulated using the spatially distributed WetSpass model and the subsurface water flow using MODFLOW, in order to identify and budget the water fluxes. The developed methodology is applied to a brownfield case site in Vilvoorde, Brussels (Belgium). The obtained land use map has a strong impact on the groundwater recharge, resulting in high spatial variability. Simulated groundwater fluxes from the brownfield to the receiving River Zenne were independently verified by measurements and by simulation of groundwater-surface water interaction based on thermal gradients in the river bed. It is concluded that, in order to better quantify total fluxes of contaminants from brownfields in the groundwater, remote sensing imagery can be operationally integrated into a modeling procedure. Copyright © 2010 Elsevier B.V. All rights reserved.
Surface loading of a viscoelastic earth-I. General theory
NASA Astrophysics Data System (ADS)
Tromp, Jeroen; Mitrovica, Jerry X.
1999-06-01
We present a new normal-mode formalism for computing the response of an aspherical, self-gravitating, linear viscoelastic earth model to an arbitrary surface load. The formalism makes use of recent advances in the theory of the Earth's free oscillations, and is based upon an eigenfunction expansion methodology, rather than the traditional Love-number approach to surface-loading problems. We introduce a surface-load representation theorem analogous to Betti's reciprocity relation in seismology. Taking advantage of this theorem and the biorthogonality of the viscoelastic modes, we determine the complete response to a surface load in the form of a Green's function. We also demonstrate that each viscoelastic mode has its own unique energy partitioning, which can be used to characterize it. In subsequent papers, we apply the theory to spherically symmetric and aspherical earth models.
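As a rough guide to what an eigenfunction-expansion Green's function looks like, a schematic mode-sum form is written below; the notation (mode fields u_k, dual fields, decay constants s_k and normalisations N_k) is assumed for illustration and is not taken from the paper itself.

\[
\mathbf{u}(\mathbf{r},t) = \int_{-\infty}^{t}\!\!\int_{\partial\Omega} \mathbf{G}(\mathbf{r},\mathbf{r}';t-t')\,\sigma(\mathbf{r}',t')\,\mathrm{d}\Sigma'\,\mathrm{d}t',
\qquad
\mathbf{G}(\mathbf{r},\mathbf{r}';t) \approx \sum_{k} \frac{\mathbf{u}_k(\mathbf{r})\,\bar{\mathbf{u}}_k(\mathbf{r}')}{N_k}\, e^{s_k t}\,H(t),
\]

where σ is the surface load density, H is the Heaviside step function, and the sum runs over the biorthogonal viscoelastic modes.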
NASA Astrophysics Data System (ADS)
Alifanov, O. M.; Paleshkin, A. V.; Terent'ev, V. V.; Firsyuk, S. O.
2016-01-01
A methodological approach to determination of the thermal state at a point on the surface of an isothermal element of a small spacecraft has been developed. A mathematical model of heat transfer between surfaces of intricate geometric configuration has been described. In this model, account was taken of the external field of radiant fluxes and of the differentiated mutual influence of the surfaces. An algorithm for calculation of the distribution of the density of the radiation absorbed by surface elements of the object under study has been proposed. The temperature field on the lateral surface of the spacecraft exposed to sunlight and on its shady side has been calculated. By determining the thermal state of magnetic controls of the orientation system as an example, the authors have assessed the contribution of the radiation coming from the solar-cell panels and from the spacecraft surface.
From HADES to PARADISE—atomistic simulation of defects in minerals
NASA Astrophysics Data System (ADS)
Parker, Stephen C.; Cooke, David J.; Kerisit, Sebastien; Marmier, Arnaud S.; Taylor, Sarah L.; Taylor, Stuart N.
2004-07-01
The development of the HADES code by Michael Norgett in the 1970s enabled, for the first time, the routine simulation of point defects in inorganic solids at the atomic scale. Using examples from current research we illustrate how the scope and applications of atomistic simulations have widened with time and yet still follow an approach readily identifiable with this early work. Firstly we discuss the use of the Mott-Littleton methodology to study the segregation of various isovalent cations to the (00.1) and (01.2) surfaces of haematite (α-Fe2O3). The results show that the size of the impurities has a considerable effect on the magnitude of the segregation energy. We then extend these simulations to investigate the effect of the concentration of the impurities at the surface on the segregation process using a supercell approach. We consider next the effect of segregation to stepped surfaces illustrating this with recent work on segregation of La3+ to CaF2 surfaces, which show enhanced segregation to step edges. We discuss next the application of lattice dynamics to modelling point defects in complex oxide materials by applying this to the study of hydrogen incorporation into β-Mg2SiO4. Finally our attention is turned to a method for considering the surface energy of physically defective surfaces and we illustrate its approach by considering the low index surfaces of α-Al2O3.
Complex Chemical Reaction Networks from Heuristics-Aided Quantum Chemistry.
Rappoport, Dmitrij; Galvin, Cooper J; Zubarev, Dmitry Yu; Aspuru-Guzik, Alán
2014-03-11
While structures and reactivities of many small molecules can be computed efficiently and accurately using quantum chemical methods, heuristic approaches remain essential for modeling complex structures and large-scale chemical systems. Here, we present a heuristics-aided quantum chemical methodology applicable to complex chemical reaction networks such as those arising in cell metabolism and prebiotic chemistry. Chemical heuristics offer an expedient way of traversing high-dimensional reactive potential energy surfaces and are combined here with quantum chemical structure optimizations, which yield the structures and energies of the reaction intermediates and products. Application of heuristics-aided quantum chemical methodology to the formose reaction reproduces the experimentally observed reaction products, major reaction pathways, and autocatalytic cycles.
Automated quantification of neurite outgrowth orientation distributions on patterned surfaces
NASA Astrophysics Data System (ADS)
Payne, Matthew; Wang, Dadong; Sinclair, Catriona M.; Kapsa, Robert M. I.; Quigley, Anita F.; Wallace, Gordon G.; Razal, Joselito M.; Baughman, Ray H.; Münch, Gerald; Vallotton, Pascal
2014-08-01
Objective. We have developed an image analysis methodology for quantifying the anisotropy of neuronal projections on patterned substrates. Approach. Our method is based on the fitting of smoothing splines to the digital traces produced using a non-maximum suppression technique. This enables precise estimates of the local tangents uniformly along the neurite length, and leads to unbiased orientation distributions suitable for objectively assessing the anisotropy induced by tailored surfaces. Main results. In our application, we demonstrate that carbon nanotubes arrayed in parallel bundles over gold surfaces induce a considerable neurite anisotropy; a result which is relevant for regenerative medicine. Significance. Our pipeline is generally applicable to the study of fibrous materials on 2D surfaces and should also find applications in the study of DNA, microtubules, and other polymeric materials.
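The tangent-extraction step described above can be sketched in a few lines. The code below is an illustrative reimplementation, not the published pipeline: it fits a smoothing spline to one hypothetical neurite trace with SciPy, samples tangent angles approximately uniformly along the trace, and summarises anisotropy with a nematic-style order parameter. The trace coordinates and smoothing factor are made up.

    import numpy as np
    from scipy.interpolate import splprep, splev

    # Hypothetical digital trace of one neurite (pixel coordinates).
    t = np.linspace(0, 1, 80)
    x = 100 * t + 3 * np.sin(6 * np.pi * t)
    y = 40 * t + 2 * np.random.default_rng(1).normal(size=t.size)

    # Smoothing spline through the trace; s controls the smoothing (assumed value).
    tck, u = splprep([x, y], s=5.0)

    # Sample tangents along the spline parameter and convert to angles.
    uu = np.linspace(0, 1, 200)
    dx, dy = splev(uu, tck, der=1)
    theta = np.arctan2(dy, dx)                     # local orientation, radians

    # Orientation histogram (axial data: theta and theta + pi are equivalent).
    hist, edges = np.histogram(np.mod(theta, np.pi), bins=18, range=(0, np.pi))

    # Nematic order parameter: 1 = perfectly aligned, 0 = isotropic.
    S = np.hypot(np.mean(np.cos(2 * theta)), np.mean(np.sin(2 * theta)))
    print("anisotropy order parameter S =", round(float(S), 3))

In a full pipeline the same angle statistics would be pooled over all traced neurites before comparing patterned and control surfaces.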
Koundouri, P; Ker Rault, P; Pergamalis, V; Skianis, V; Souliotis, I
2016-01-01
The development of the Water Framework Directive aimed to establish an integrated framework for water management at the European level. This framework covers inland surface waters, transitional waters, coastal waters and groundwaters. In the process of achieving the environmental and ecological objectives set by the Directive, economics is placed at the core of water management. An important feature of the Directive is the recovery of the total economic cost of water services from all users. The total cost of water services can be disaggregated into environmental, financial and resource costs. Another important aspect of the Directive is the identification of the major drivers and pressures in each River Basin District. We describe a methodology that aims to achieve sustainable environmental and socioeconomic management of freshwater ecosystem services. The Ecosystem Services Approach lies at the core of the suggested methodology for implementing more sustainable and efficient water management. This approach consists of the following three steps: (i) socio-economic characterization of the River Basin area, (ii) assessment of the current recovery of water use cost, and (iii) identification and suggestion of appropriate programs of measures for sustainable water management over space and time. The methodology is consistent with a) the economic principles adopted explicitly by the Water Framework Directive (WFD), b) the three-step WFD implementation approach adopted in the WATECO document, and c) the Ecosystem Services Approach to valuing freshwater goods and services to humans. Furthermore, we analyze how the effects of multiple stressors and socio-economic development can be quantified in the context of freshwater resources management. We also attempt to estimate the value of four ecosystem services for the Anglian River Basin using the benefit transfer approach; the results show the significance of such services. Copyright © 2015. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Bernet, Daniel; Prasuhn, Volker; Weingartner, Rolf
2015-04-01
Several case studies in Switzerland highlight that many buildings damaged by floods are not located within the inundation zones of rivers, but outside the river network. In urban areas, such flooding can be caused by drainage system surcharge, the low infiltration capacity of the urbanized landscape, etc. In rural and peri-urban areas, however, inundations are more likely caused by surface runoff formed on natural and arable land. Such flash floods have very short response times, occur rather diffusely and are therefore very difficult to observe directly. In our approach, we use data records from private, but mostly from public, insurance companies. The latter, present in 19 of the 26 Cantons of Switzerland, insure (almost) every building within their respective administrative zones and, in addition, hold a monopoly position. Damage claims, including flood damages, are usually recorded, and thus data records from such public insurance companies are a very valuable source for better understanding surface runoff that leads to damage. Although practitioners agree that this process is relevant, there seems to be a knowledge gap concerning the spatial and temporal distributions as well as the triggers and influencing factors of such damage events. Within the framework of a research project, we address this research gap and aim to improve the understanding of the process chain from surface runoff formation up to possible damage to buildings. This poster introduces the methodology, which will be applied to a dataset comprising data from the majority of the 19 public insurance companies for buildings in Switzerland, with over 50'000 damage claims, in order to better understand surface runoff. The goal is to infer spatial and temporal patterns as well as drivers and influencing factors of surface runoff that may cause damage. In particular, the workflow of data acquisition, harmonization and treatment is outlined. Furthermore, associated problems and challenges are discussed. Ultimately, the improved process understanding will be used to develop a new modeling approach.
The evolution equation for the flame surface density in turbulent premixed combustion
NASA Technical Reports Server (NTRS)
Trouve, Arnaud
1993-01-01
The mean reaction rate in flamelet models for turbulent premixed combustion depends on two basic quantities: a mean chemical rate, called the flamelet speed, and the flame surface density. Our previous work had been primarily focused on the problem of the structure and topology of turbulent premixed flames, and it was then determined that the flamelet speed, when space-averaged, is only weakly sensitive to the turbulent flow field. Consequently, the flame surface density is the key quantity that conveys most of the effects of the turbulence on the rate of energy release. In flamelet models, this quantity is obtained via a modeled transport equation called the Sigma-equation. Past theoretical work has produced a rigorous approach that leads to an exact but unclosed formulation for the turbulent Sigma-equation. In the exact Sigma-equation, it appears that the dynamical properties of the flame surface density are determined by a single parameter, namely the turbulent flame stretch. Unfortunately, the turbulent flame stretch as well as the flame surface density is not available from experiments, and, in the absence of experimental data, little is known on the validity of the closure assumptions used in current flamelet models. Direct Numerical Simulation (DNS) is the alternative approach to get basic information on these fundamental quantities. In the present work, three-dimensional DNS of premixed flames in isotropic turbulent flow is used to estimate the different terms appearing in the Sigma-equation. A new methodology is proposed to provide the source and sink terms for the flame surface density, resolved both temporally and spatially throughout the turbulent flame brush. Using this methodology, our objective is to extract the turbulent flame stretch from the DNS data base and then perform extensive comparisons with flamelet models. Thanks to the detailed information produced by the DNS-based analysis, it is expected that this type of comparison will not only underscore the shortcomings of current models, but also suggest ways to improve them.
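For orientation, the balance equation referred to above is often quoted schematically in the flamelet literature in a form like the one below; the notation varies between authors, and the specific closed form used in the DNS comparison is not reproduced here.

\[
\frac{\partial \Sigma}{\partial t}
+ \frac{\partial}{\partial x_i}\big(\langle u_i \rangle_s\, \Sigma\big)
= \langle \kappa \rangle_s\, \Sigma ,
\]

where Σ is the flame surface density, ⟨·⟩_s denotes an average over the flame surface, ⟨u_i⟩_s is the surface-averaged velocity of the flame elements, and ⟨κ⟩_s is the surface-averaged flame stretch, i.e. the net source/sink term that the DNS-based analysis aims to quantify.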
Kang, Dae Y; Kim, Yun-Soung; Ornelas, Gladys; Sinha, Mridu; Naidu, Keerthiga; Coleman, Todd P
2015-09-16
New classes of ultrathin flexible and stretchable devices have changed the way modern electronics are designed to interact with their target systems. Though more and more novel technologies surface and steer the way we think about future electronics, there exists an unmet need with regard to optimizing the fabrication procedures for these devices so that large-scale industrial translation is realistic. This article presents an unconventional approach for facile microfabrication and processing of adhesive-peeled (AP) flexible sensors. By assembling AP sensors on a weakly adhering substrate in an inverted fashion, we demonstrate a procedure with 50% reduced end-to-end processing time that achieves greater fabrication yield. The methodology is used to demonstrate the fabrication of electrical and mechanical flexible and stretchable AP sensors that are peeled off their carrier substrates by consumer adhesives. Using this approach, we outline the manner in which adhesion is maintained and buckling is reduced for gold film processing on polydimethylsiloxane substrates. In addition, we demonstrate the compatibility of our methodology with large-scale post-processing using a roll-to-roll approach.
Pelosi, Claudia; Capobianco, Giuseppe; Agresti, Giorgia; Bonifazi, Giuseppe; Morresi, Fabio; Rossi, Sara; Santamaria, Ulderico; Serranti, Silvia
2018-06-05
The aim of this work is to investigate the stability to simulated solar radiation of several painting samples through a new methodological approach based on non-invasive spectroscopic techniques. In particular, commercial watercolours and iron oxide based pigments were used, the latter prepared in gum Arabic for the experiments in order to propose a possible substitute for traditional reintegration materials. Reflectance spectrophotometry in the visible range and hyperspectral imaging in the short-wave infrared were chosen as non-invasive techniques for evaluating the stability of the chosen pigments to irradiation. These were studied before and after an artificial ageing procedure performed in a Solar Box chamber under controlled conditions. The data were processed to evaluate the sensitivity of the chosen techniques in identifying the photo-degradation-induced variations in the paint layers before they could be observed by eye. Furthermore, a supervised classification method with a multivariate approach was successfully applied for monitoring changes in the painted surfaces. Copyright © 2018 Elsevier B.V. All rights reserved.
2000-04-01
Experiments examined the boundary-layer spectrum through an infrared window and the dependence of the surface temperature of glass-silicide coated test samples on the anode power of an HF generator, measured in a stagnation-point test configuration; a temperature peak was observed at constant power.
Latha, Selvanathan; Sivaranjani, Govindhan; Dhanasekaran, Dharumadurai
2017-09-01
Among the diverse actinobacteria, Streptomyces is a renowned and ongoing source of a large number of secondary metabolites furnishing immeasurable pharmacological and biological activities. Hence, to meet the demand for new lead compounds for human and animal use, research is constantly targeting the bioprospecting of Streptomyces. Optimization of media components and physicochemical parameters is a plausible approach for the exploration of intensified production of novel as well as existing bioactive metabolites from various microbes, and is usually pursued with a range of classical techniques, including one-factor-at-a-time (OFAT) experiments. However, the major drawbacks of conventional optimization methods have directed attention to statistical optimization approaches in fermentation process development. Response surface methodology (RSM) is one of the empirical techniques extensively used for modeling, optimization and analysis of fermentation processes. To date, several researchers have implemented RSM in different bioprocess optimizations for the production of assorted natural substances from Streptomyces, with very promising results. This review summarizes some of the recent RSM-based studies on the enhanced production of antibiotics, enzymes and probiotics using Streptomyces, with the intention of highlighting the significance of both Streptomyces and RSM to the research community and industries.
Empirical retrieval of sea spray aerosol production using satellite microwave radiometry
NASA Astrophysics Data System (ADS)
Savelyev, I. B.; Yelland, M. J.; Norris, S. J.; Salisbury, D.; Pascal, R. W.; Bettenhausen, M. H.; Prytherch, J.; Anguelova, M. D.; Brooks, I. M.
2017-12-01
This study presents a novel approach to obtaining the global sea spray aerosol (SSA) production source term by relying on direct satellite observations of the ocean surface, instead of the more traditional approaches driven by surface meteorology. The primary challenge in developing this empirical algorithm is to compile a calibrated, consistent dataset of SSA surface flux collected offshore over a variety of conditions (i.e., regions and seasons), and thus representative of the global SSA production variability. This dataset includes observations from the SEASAW, HiWASE, and WAGES field campaigns, during which the SSA flux was measured from the bow of a research vessel using a consistent, state-of-the-art eddy covariance methodology. These in situ data are matched to observations of the state of the ocean surface from the WindSat polarimetric microwave satellite radiometer. Previous studies demonstrated the ability of WindSat to detect variations in surface wave slopes, roughness and foam, which led to the development of retrieval algorithms for the surface wind vector and, more recently, whitecap fraction. Similarly, in this study, microwave emissions from the ocean surface are matched to and calibrated against in situ observations of the SSA production flux. The resulting calibrated empirical algorithm is applicable for retrieval of the SSA source term throughout the duration of the WindSat mission, from 2003 to the present.
Liu, Yanyan; Fan, Liangdong; Cai, Yixiao; Zhang, Wei; Wang, Baoyuan; Zhu, Bin
2017-07-19
Sufficiently high oxygen ion conductivity of the electrolyte is critical for good performance of low-temperature solid oxide fuel cells (LT-SOFCs). Notably, material conductivity, reliability, and manufacturing cost are the major barriers hindering LT-SOFC commercialization. Generally, surface properties control the physical and chemical functionalities of materials. Hereby, we report a Sm3+, Pr3+, and Nd3+ triple-doped ceria, exhibiting the highest ionic conductivity among reported doped-ceria oxides, 0.125 S cm-1 at 600 °C. It was designed using a two-step wet-chemical coprecipitation method to realize the desired doping, with Sm3+ in the bulk and Pr3+/Nd3+ in surface domains (abbreviated as PNSDC). The redox couple Pr3+/Pr4+ contributes to the extraordinary ionic conductivity. Moreover, the mechanism for the ionic conductivity enhancement is demonstrated. These findings reveal that a joint bulk and surface doping methodology for ceria is a feasible approach to developing new oxide-ion conductors with high impact on advanced LT-SOFCs.
NASA Astrophysics Data System (ADS)
Belgasam, Tarek M.; Zbib, Hussein M.
2017-12-01
Dual-phase (DP) steels have received widespread attention for their low density and high strength. This low density is of value to the automotive industry for the weight reduction it offers and the attendant fuel savings and emission reductions. Recent studies on developing DP steels have shown that the strength/ductility combination can be significantly improved by changing the volume fraction and grain size of the phases in the microstructure. Consequently, DP steel manufacturers are interested in predicting microstructure properties and in optimizing microstructure design. In this work, a microstructure-based approach using representative volume elements (RVEs) was developed. The approach examined the flow behavior of DP steels using virtual tension tests on an RVE to identify specific mechanical properties. Microstructures with varied martensite and ferrite grain sizes, martensite volume fractions, carbon contents, and morphologies were studied in 3D RVE approaches. The effect of these microstructure parameters on the strength/ductility combination of DP steels was examined numerically using the finite element method with a dislocation density-based elastic-plastic constitutive model, and a response surface methodology was used to determine the optimum conditions for a required strength/ductility combination. The results from the numerical simulations are compared with experimental results found in the literature. The developed methodology proves to be a powerful tool for studying the effect and interaction of key microstructural parameters on strength and ductility and thus can be used to identify optimum microstructural conditions.
Kim, Changjae; Habib, Ayman; Pyeon, Muwook; Kwon, Goo-rak; Jung, Jaehoon; Heo, Joon
2016-01-22
Diverse approaches to laser point segmentation have been proposed since the emergence of the laser scanning system. Most of these segmentation techniques, however, suffer from limitations such as sensitivity to the choice of seed points, lack of consideration of the spatial relationships among points, and inefficient performance. In an effort to overcome these drawbacks, this paper proposes a segmentation methodology that: (1) reduces the dimensions of the attribute space; (2) considers the attribute similarity and the proximity of the laser point simultaneously; and (3) works well with both airborne and terrestrial laser scanning data. A neighborhood definition based on the shape of the surface increases the homogeneity of the laser point attributes. The magnitude of the normal position vector is used as an attribute for reducing the dimension of the accumulator array. The experimental results demonstrate, through both qualitative and quantitative evaluations, the outcomes' high level of reliability. The proposed segmentation algorithm provided 96.89% overall correctness, 95.84% completeness, a 0.25 m overall mean value of centroid difference, and less than 1° of angle difference. The performance of the proposed approach was also verified with a large dataset and compared with other approaches. Additionally, the evaluation of the sensitivity of the thresholds was carried out. In summary, this paper proposes a robust and efficient segmentation methodology for abstraction of an enormous number of laser points into plane information.
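The key attribute mentioned above can be illustrated with a small sketch. The code below is a simplified stand-in for the published algorithm: for each laser point it estimates a local plane normal by PCA of its k nearest neighbours, computes the magnitude of the normal position vector (interpreted here as the perpendicular distance from the origin to that local plane), and then groups points by clustering on that attribute together with spatial proximity. The neighbourhood size, the attribute weight and the clustering choice (DBSCAN) are assumptions, not the authors' exact procedure.

    import numpy as np
    from scipy.spatial import cKDTree
    from sklearn.cluster import DBSCAN

    def normal_position_magnitude(points, k=12):
        """For each point, PCA-estimate the local plane normal and return
        |n . p0|, the distance from the origin to that plane along the normal."""
        tree = cKDTree(points)
        _, idx = tree.query(points, k=k)
        mags = np.empty(len(points))
        for i, nbrs in enumerate(idx):
            nbh = points[nbrs]
            centroid = nbh.mean(axis=0)
            # Direction of least variance of the neighbourhood = plane normal.
            _, _, vt = np.linalg.svd(nbh - centroid)
            normal = vt[-1]
            mags[i] = abs(normal @ centroid)
        return mags

    # Hypothetical cloud: two horizontal planes at different heights plus noise.
    rng = np.random.default_rng(2)
    plane1 = np.c_[rng.uniform(0, 10, 500), rng.uniform(0, 10, 500), np.full(500, 1.0)]
    plane2 = np.c_[rng.uniform(0, 10, 500), rng.uniform(0, 10, 500), np.full(500, 4.0)]
    cloud = np.vstack([plane1, plane2]) + rng.normal(0, 0.01, (1000, 3))

    attr = normal_position_magnitude(cloud)
    # Cluster on [x, y, z, w * attribute] so proximity and similarity act together.
    w = 5.0                                    # assumed attribute weight
    labels = DBSCAN(eps=0.8, min_samples=10).fit_predict(np.c_[cloud, w * attr])
    print("segments found:", len(set(labels)) - (1 if -1 in labels else 0))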
Shazman, Shula; Elber, Gershon; Mandel-Gutfreund, Yael
2011-09-01
Protein nucleic acid interactions play a critical role in all steps of the gene expression pathway. Nucleic acid (NA) binding proteins interact with their partners, DNA or RNA, via distinct regions on their surface that are characterized by an ensemble of chemical, physical and geometrical properties. In this study, we introduce a novel methodology based on differential geometry, commonly used in face recognition, to characterize and predict NA binding surfaces on proteins. Applying the method to experimentally solved three-dimensional structures of proteins, we successfully classify double-stranded DNA (dsDNA) from single-stranded RNA (ssRNA) binding proteins with 83% accuracy. We show that the method is insensitive to conformational changes that occur upon binding and can be applicable for de novo protein-function prediction. Remarkably, when concentrating on the zinc finger motif, we distinguish successfully between RNA and DNA binding interfaces possessing the same binding motif even within the same protein, as demonstrated for the RNA polymerase transcription factor TFIIIA. In conclusion, we present a novel methodology to characterize protein surfaces that can accurately tell apart dsDNA binding interfaces from ssRNA binding interfaces. The strength of our method in recognizing fine-tuned differences on NA binding interfaces makes it applicable to many other molecular recognition problems, with potential implications for drug design.
Temporal Data Fusion Approaches to Remote Sensing-Based Wetland Classification
NASA Astrophysics Data System (ADS)
Montgomery, Joshua S. M.
This thesis investigates the ecology and classification of wetlands in prairie and boreal environments of Alberta, Canada, using remote sensing technology to enhance wetland classification in the province. The objectives are divided into two case studies: 1) examining how satellite-borne Synthetic Aperture Radar (SAR) and optical (RapidEye & SPOT) data can be used to evaluate surface water trends in a prairie pothole environment (Shepard Slough); and 2) investigating a data fusion methodology combining SAR, optical and Lidar data to characterize wetland vegetation and surface water attributes in a boreal environment (Utikuma Regional Study Area, URSA). Surface water extent and hydroperiod products were derived from SAR data and validated against optical imagery with high accuracies (76-97% overall) for both case studies. High-resolution Lidar Digital Elevation Model (DEM), Digital Surface Model (DSM) and Canopy Height Model (CHM) products provided the basis for data fusion to extract riparian vegetation communities and surface water, producing model accuracies of R2 = 0.90 for URSA and RMSEs of 0.2 m to 0.7 m at Shepard Slough when compared with field and optical validation data. Integrating the Alberta and Canadian wetland classification systems, used to classify wetlands and determine their economic value, into the methodology produced thematic maps relevant to policy and decision makers for potential wetland monitoring and policy development.
NASA Astrophysics Data System (ADS)
Nejad, Davood Ghoddocy; Khanchi, Ali Reza; Taghizadeh, Majid
2018-06-01
Recovery of vanadium from magnetite ore by direct acid leaching is discussed. The proposed process, which employs a mixture of nitric and sulfuric acids, avoids pyrometallurgical treatments, since such treatments consume a large amount of energy. To determine the optimum conditions for vanadium recovery, the leaching process is optimized through a Plackett-Burman (P-B) design and response surface methodology (RSM). In this respect, temperature (80-95°C), liquid-to-solid ratio (L/S) (3-10 mL g-1), sulfuric acid concentration (3-6 M), nitric acid concentration (5-10 vol.%) and time (4-8 h) are considered as the independent variables. According to the P-B approach, temperature and the acid concentrations are the most effective parameters in the leaching process. These parameters are optimized using RSM to maximize the recovery of vanadium by direct acid leaching. In this way, 86.7% of the vanadium can be extracted from the magnetite ore.
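The Plackett-Burman screening step can be illustrated in a few lines. The snippet below is a generic sketch rather than the authors' design: it constructs the standard 12-run Plackett-Burman matrix by cyclic shifts of the usual generator row, assigns the five leaching factors from the abstract to the first five columns, and ranks the factors by the magnitude of their main effects on an invented recovery response.

    import numpy as np

    # Standard 12-run Plackett-Burman design: cyclic shifts of the generator row
    # plus a final row of all -1 (coded low/high levels).
    gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])
    design = np.vstack([np.roll(gen, i) for i in range(11)] + [-np.ones(11, int)])

    factors = ["temperature", "L/S ratio", "H2SO4", "HNO3", "time"]
    X = design[:, :len(factors)]          # use the first 5 of 11 columns

    # Hypothetical vanadium recoveries (%) for the 12 runs, for illustration only.
    y = np.array([62, 75, 55, 81, 78, 84, 49, 52, 46, 70, 44, 40], float)

    # Main effect of each factor = mean(y at +1) - mean(y at -1).
    effects = {f: y[X[:, j] == 1].mean() - y[X[:, j] == -1].mean()
               for j, f in enumerate(factors)}
    for f, e in sorted(effects.items(), key=lambda kv: -abs(kv[1])):
        print(f"{f:12s} effect = {e:+.1f}")

Factors with the largest effects would then be carried forward into the RSM optimization stage.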
Subha, Bakthavachallam; Song, Young Chae; Woo, Jung Hui
2015-09-15
The present study aims to optimize a slow-release biostimulant ball (BSB) for bioremediation of contaminated coastal sediment using response surface methodology (RSM). Different bacterial communities in the contaminated coastal sediments were evaluated using a pyrosequencing-based approach. The effects of BSB size (1-5 cm), distance (1-10 cm) and time (1-4 months) on changes in chemical oxygen demand (COD) and volatile solids (VS) reduction were determined. Maximum reductions of COD and VS, 89.7% and 78.8%, respectively, were observed at a 3 cm ball size, 5.5 cm distance and 4 months; these values are the optimum conditions for effective treatment of contaminated coastal sediment. Most of the variance in COD and VS (0.9291 and 0.9369, respectively) was explained by the chosen models. BSB is a promising method for COD and VS reduction and for enhancement of SRB diversity. Copyright © 2015 Elsevier Ltd. All rights reserved.
Code System for Performance Assessment Ground-water Analysis for Low-level Nuclear Waste.
DOE Office of Scientific and Technical Information (OSTI.GOV)
MATTHEW,; KOZAK, W.
1994-02-09
Version 00. The PAGAN code system is part of the performance assessment methodology developed for use by the U.S. Nuclear Regulatory Commission in evaluating license applications for low-level waste disposal facilities. In this methodology, PAGAN is used as one candidate approach for analysis of the ground-water pathway. PAGAN, Version 1.1, has the capability to model the source term, vadose-zone transport, and aquifer transport of radionuclides from a waste disposal unit. It combines the two codes SURFACE and DISPERSE, which provide semi-analytical solutions to the convective-dispersion equation. The system uses menu-driven input/output for implementing a simple ground-water transport analysis and incorporates statistical uncertainty functions for handling data uncertainties. The output from PAGAN includes a time- and location-dependent radionuclide concentration at a well in the aquifer, or a time- and location-dependent radionuclide flux into a surface-water body.
Liu, Wenjuan; Ji, Jianlin; Chen, Hua; Ye, Chenyu
2014-01-01
Color is one of the most powerful aspects of a psychological counseling environment. Little scientific research has been conducted on color design and much of the existing literature is based on observational studies. Using design of experiments and response surface methodology, this paper proposes an optimal color design approach for transforming patients' perception into color elements. Six indices, pleasant-unpleasant, interesting-uninteresting, exciting-boring, relaxing-distressing, safe-fearful, and active-inactive, were used to assess patients' impression. A total of 75 patients participated, including 42 for Experiment 1 and 33 for Experiment 2. 27 representative color samples were designed in Experiment 1, and the color sample (L = 75, a = 0, b = -60) was the most preferred one. In Experiment 2, this color sample was set as the 'central point', and three color attributes were optimized to maximize the patients' satisfaction. The experimental results show that the proposed method can get the optimal solution for color design of a counseling room.
Mehmood, Tahir; Ahmad, Asif; Ahmed, Anwaar; Ahmed, Zaheer
2017-08-15
The present study was conducted to prepare co-surfactant-free, olive-oil-based alpha-tocopherol nanoemulsions using a food-grade non-ionic surfactant. Response surface methodology (RSM) was used to determine the effects of the independent variables (ultrasonic homogenization time, olive oil concentration and surfactant content) on different physico-chemical characteristics of the O/W nanoemulsions. The study was carried out using a central composite design. The coefficients of determination were greater than 0.900 for all response variables, and the independent variables had significant effects on all responses. The optimum levels of the independent variables for the preparation of nanoemulsions were 3 min ultrasonic homogenization time, 4% olive oil content and 2.08% surfactant concentration. The physico-chemical responses at these levels were 151.68 nm particle size, 7.17% p-anisidine and 88.64% antioxidant activity. These results will help in the design of nanoemulsions with optimum levels of the independent variables. Copyright © 2017 Elsevier Ltd. All rights reserved.
Matias-Guiu, Pau; Rodríguez-Bencomo, Juan José; Pérez-Correa, José R; López, Francisco
2018-04-15
Developing new distillation strategies can help the spirits industry to improve quality, safety and process efficiency. Batch stills equipped with a packed column and an internal partial condenser are an innovative experimental system, allowing a fast and flexible management of the rectification. In this study, the impact of four factors (heart-cut volume, head-cut volume, pH and cooling flow rate of the internal partial condenser during the head-cut fraction) on 18 major volatile compounds of Muscat spirits was optimized using response surface methodology and desirability function approaches. Results have shown that high rectification at the beginning of the heart-cut enhances the overall positive aroma compounds of the product, reducing off-flavor compounds. In contrast, optimum levels of heart-cut volume, head-cut volume and pH factors varied depending on the process goal. Finally, three optimal operational conditions (head off-flavors reduction, flowery terpenic enhancement and fruity ester enhancement) were evaluated by chemical and sensory analysis. Copyright © 2017 Elsevier Ltd. All rights reserved.
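Because this study combines response surface models with desirability functions, a minimal sketch of the Derringer-type desirability step is given below. It is a generic illustration, not the authors' calculation: two hypothetical fitted responses (a positive aroma score to maximize and an off-flavor level to minimize) are converted to individual desirabilities and merged through their geometric mean, which is then maximized over the coded factor space. All ranges, coefficients and factor names are invented.

    import numpy as np
    from scipy.optimize import differential_evolution

    # Hypothetical fitted response surfaces in coded factors x = (heart-cut, head-cut, pH).
    aroma = lambda x: 70 + 8*x[0] - 5*x[1] + 3*x[2] - 4*x[0]**2 - 2*x[1]*x[2]
    offflavor = lambda x: 20 - 6*x[1] + 4*x[0] + 2*x[2]**2

    def d_max(y, lo, hi):       # desirability for a response to maximize
        return np.clip((y - lo) / (hi - lo), 0, 1)

    def d_min(y, lo, hi):       # desirability for a response to minimize
        return np.clip((hi - y) / (hi - lo), 0, 1)

    def overall_desirability(x):
        d1 = d_max(aroma(x), lo=50, hi=90)        # acceptable-to-ideal aroma range
        d2 = d_min(offflavor(x), lo=5, hi=30)     # ideal-to-unacceptable off-flavor range
        return (d1 * d2) ** 0.5                   # geometric mean of the two

    # Maximize D over the coded factor space [-1, 1]^3.
    res = differential_evolution(lambda x: -overall_desirability(x),
                                 bounds=[(-1, 1)] * 3, seed=0)
    print("best coded settings:", res.x.round(2), "D =", round(-res.fun, 3))

Different process goals (head off-flavor reduction versus aroma enhancement) correspond to different choices of the individual desirability ranges and weights.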
Ko, Wen-Ching; Chang, Chao-Kai; Wang, Hsiu-Ju; Wang, Shian-Jen; Hsieh, Chang-Wei
2015-04-01
The aim of this study was to develop an optimal microencapsulation method for an oil-soluble component (curcumin) using γ-PGA. The results show that Span80 significantly enhances the encapsulation efficiency (EE) of γ-Na(+)-PGA microcapsules. Therefore, the effects of γ-Na(+)-PGA, curcumin and Span80 concentration on EE of γ-Na(+)-PGA microcapsules were studied by means of response surface methodology (RSM). It was found that the optimal microencapsulation process is achieved by using γ-Na(+)-PGA 6.05%, curcumin 15.97% and Span80 0.61% with a high EE% (74.47 ± 0.20%). Furthermore, the models explain 98% of the variability in the responses. γ-Na(+)-PGA seems to be a good carrier for the encapsulation of curcumin. In conclusion, this simple and versatile approach can potentially be applied to the microencapsulation of various oil-soluble components for food applications. Copyright © 2014 Elsevier Ltd. All rights reserved.
Chen, Hua; Ye, Chenyu
2014-01-01
Color is one of the most powerful aspects of a psychological counseling environment. Little scientific research has been conducted on color design and much of the existing literature is based on observational studies. Using design of experiments and response surface methodology, this paper proposes an optimal color design approach for transforming patients’ perception into color elements. Six indices, pleasant-unpleasant, interesting-uninteresting, exciting-boring, relaxing-distressing, safe-fearful, and active-inactive, were used to assess patients’ impression. A total of 75 patients participated, including 42 for Experiment 1 and 33 for Experiment 2. 27 representative color samples were designed in Experiment 1, and the color sample (L = 75, a = 0, b = -60) was the most preferred one. In Experiment 2, this color sample was set as the ‘central point’, and three color attributes were optimized to maximize the patients’ satisfaction. The experimental results show that the proposed method can get the optimal solution for color design of a counseling room. PMID:24594683
Urban land use: Remote sensing of ground-basin permeability
NASA Technical Reports Server (NTRS)
Tinney, L. R.; Jensen, J. R.; Estes, J. E.
1975-01-01
A remote sensing analysis of the amount and type of permeable and impermeable surfaces overlying an urban recharge basin is discussed. An effective methodology for accurately generating this data as input to a safe yield study is detailed and compared to more conventional alternative approaches. The amount of area inventoried, approximately 10 sq. miles, should provide a reliable base against which automatic pattern recognition algorithms, currently under investigation for this task, can be evaluated. If successful, such approaches can significantly reduce the time and effort involved in obtaining permeability data, an important aspect of urban hydrology dynamics.
Recent Advances in the Analysis of Spiral Bevel Gears
NASA Technical Reports Server (NTRS)
Handschuh, Robert F.
1997-01-01
A review of recent progress for the analysis of spiral bevel gears will be described. The foundation of this work relies on the description of the gear geometry of face-milled spiral bevel gears via the approach developed by Litvin. This methodology was extended by combining the basic gear design data with the manufactured surfaces using a differential geometry approach, and provides the data necessary for assembling three-dimensional finite element models. The finite element models have been utilized to conduct thermal and structural analysis of the gear system. Examples of the methods developed for thermal and structural/contact analysis are presented.
Escorihuela, Jorge; Bañuls, María José; García Castelló, Javier; Toccafondo, Veronica; García-Rupérez, Jaime; Puchades, Rosa; Maquieira, Ángel
2012-12-01
Methodology for the functionalization of silicon-based materials employed for the development of photonic label-free nanobiosensors is reported. The studied functionalization based on organosilane chemistry allowed the direct attachment of biomolecules in a single step, maintaining their bioavailability. Using this immobilization approach in probe microarrays, successful specific detection of bacterial DNA is achieved, reaching hybridization sensitivities of 10 pM. The utility of the immobilization approach for the functionalization of label-free nanobiosensors based on photonic crystals and ring resonators was demonstrated using bovine serum albumin (BSA)/anti-BSA as a model system.
Gu, Junsi; Fahrenkrug, Eli; Maldonado, Stephen
2014-09-02
The substrate-overlayer approach has been used to acquire surface enhanced Raman spectra (SERS) during and after electrochemical atomic layer deposition (ECALD) of CdSe, CdTe, and CdS thin films. The collected data suggest that SERS measurements performed with off-resonance (i.e. far from the surface plasmonic wavelength of the underlying SERS substrate) laser excitation do not introduce perturbations to the ECALD processes. Spectra acquired in this way afford rapid insight on the quality of the semiconductor film during the course of an ECALD process. For example, SERS data are used to highlight ECALD conditions that yield crystalline CdSe and CdS films. In contrast, SERS measurements with short wavelength laser excitation show evidence of photoelectrochemical effects that were not germane to the intended ECALD process. Using the semiconductor films prepared by ECALD, the substrate-overlayer SERS approach also affords analysis of semiconductor surface adsorbates. Specifically, Raman spectra of benzenethiol adsorbed onto CdSe, CdTe, and CdS films are detailed. Spectral shifts in the vibronic features of adsorbate bonding suggest subtle differences in substrate-adsorbate interactions, highlighting the sensitivity of this methodology.
Low-Cost Methodology for Skin Strain Measurement of a Flexed Biological Limb.
Lin, Bevin; Moerman, Kevin M; McMahan, Connor G; Pasch, Kenneth A; Herr, Hugh M
2017-12-01
The purpose of this manuscript is to compute skin strain data from a flexed biological limb, using portable, inexpensive, and easily available resources. We apply and evaluate this approach on a person with bilateral transtibial amputations, imaging left and right residual limbs in extended and flexed knee postures. We map 3-D deformations to a flexed biological limb using freeware and a simple point-and-shoot camera. Mean principal strain, maximum shear strain, as well as lines of maximum, minimum, and nonextension are computed from 3-D digital models to inform directional mappings of the strain field for an unloaded residual limb. Peak tensile strains are ∼0.3 on the anterior surface of the knee in the proximal region of the patella, whereas peak compressive strains are ∼ -0.5 on the posterior surface of the knee. Peak maximum shear strains are ∼0.3 on the posterior surface of the knee. The accuracy and precision of this methodology are assessed for a ground-truth model. The mean point location distance is found to be 0.08 cm, and the overall standard deviation for point location difference vectors is 0.05 cm. This low-cost and mobile methodology may prove critical for applications such as the prosthetic socket interface where whole-limb skin strain data are required from patients in the field outside of traditional, large-scale clinical centers. Such data may inform the design of wearable technologies that directly interface with human skin.
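The strain quantities reported above follow from standard continuum-mechanics definitions once a deformation mapping between the extended and flexed surfaces is available. The sketch below is a generic illustration under assumed definitions (Green-Lagrange strain of a single surface patch, with maximum shear taken as half the spread of the principal strains), not the authors' pipeline, and the deformation gradient is made up.

    import numpy as np

    # Hypothetical in-plane deformation gradient F for one skin patch
    # (maps undeformed tangent-plane coordinates to deformed ones).
    F = np.array([[1.25, 0.10],
                  [0.05, 0.70]])

    # Green-Lagrange strain tensor E = 1/2 (F^T F - I).
    C = F.T @ F
    E = 0.5 * (C - np.eye(2))

    # Principal strains and directions from the symmetric strain tensor.
    vals, vecs = np.linalg.eigh(E)
    e_min, e_max = vals                    # eigh returns eigenvalues in ascending order
    max_shear = 0.5 * (e_max - e_min)      # one common in-plane shear measure

    # A line of non-extension exists when the principal strains straddle zero.
    has_nonextension_line = e_min < 0 < e_max

    print(f"principal strains: {e_max:+.3f}, {e_min:+.3f}")
    print(f"max shear strain:  {max_shear:.3f}")
    print("line of non-extension present:", has_nonextension_line)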
NASA Astrophysics Data System (ADS)
Rudrapati, R.; Sahoo, P.; Bandyopadhyay, A.
2016-09-01
The main aim of the present work is to analyse the significance of turning parameters for surface roughness in a computer numerically controlled (CNC) turning operation while machining aluminium alloy material. Spindle speed, feed rate and depth of cut have been considered as the machining parameters. Experimental runs have been conducted according to the Box-Behnken design method. After experimentation, surface roughness is measured using a stylus profilometer. Factor effects have been studied through analysis of variance. Mathematical modelling has been done by response surface methodology to establish relationships between the input parameters and the output response. Finally, process optimization has been performed with the teaching-learning-based optimization (TLBO) algorithm. The predicted turning condition has been validated through a confirmatory experiment.
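The TLBO step can be sketched compactly. The code below is a minimal generic implementation of teaching-learning-based optimization applied to a hypothetical roughness surrogate (an invented quadratic in coded spindle speed, feed and depth of cut); it is not the authors' model, and the bounds and coefficients are assumptions.

    import numpy as np

    def roughness(x):
        """Hypothetical Ra surrogate (um) in coded speed, feed, depth of cut."""
        s, f, d = x
        return 1.8 - 0.4*s + 0.9*f + 0.3*d + 0.5*f**2 + 0.2*s*f + 0.1*d**2

    def tlbo(obj, bounds, n_pop=20, n_iter=100, seed=0):
        rng = np.random.default_rng(seed)
        lo, hi = np.array(bounds).T
        X = rng.uniform(lo, hi, (n_pop, len(bounds)))
        fit = np.apply_along_axis(obj, 1, X)
        for _ in range(n_iter):
            # Teacher phase: move the class toward the best solution.
            teacher = X[fit.argmin()]
            TF = rng.integers(1, 3)                      # teaching factor, 1 or 2
            Xnew = np.clip(X + rng.random(X.shape) * (teacher - TF * X.mean(0)), lo, hi)
            fnew = np.apply_along_axis(obj, 1, Xnew)
            improve = fnew < fit
            X[improve], fit[improve] = Xnew[improve], fnew[improve]
            # Learner phase: each learner interacts with a random partner.
            for i in range(n_pop):
                j = rng.choice([k for k in range(n_pop) if k != i])
                step = (X[i] - X[j]) if fit[i] < fit[j] else (X[j] - X[i])
                cand = np.clip(X[i] + rng.random(len(bounds)) * step, lo, hi)
                fc = obj(cand)
                if fc < fit[i]:
                    X[i], fit[i] = cand, fc
        return X[fit.argmin()], fit.min()

    best_x, best_ra = tlbo(roughness, bounds=[(-1, 1)] * 3)
    print("optimum coded parameters:", best_x.round(3), "predicted Ra:", round(best_ra, 3))

In practice the surrogate would be the RSM model fitted to the Box-Behnken results, and the coded optimum would be mapped back to physical spindle speed, feed and depth of cut before the confirmatory run.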
Surface characterization protocol for precision aspheric optics
NASA Astrophysics Data System (ADS)
Sarepaka, RamaGopal V.; Sakthibalan, Siva; Doodala, Somaiah; Panwar, Rakesh S.; Kotaria, Rajendra
2017-10-01
In advanced optical instrumentation, aspherics provide an effective performance alternative. Aspheric fabrication and surface metrology, together with aspheric design, are complementary iterative processes for precision aspheric development. As in fabrication, a holistic approach to aspheric surface characterization is adopted to evaluate the actual surface error and to deliver aspheric optics with the desired surface quality. Precision optical surfaces are characterized by profilometry or by interferometry. Aspheric profiles are characterized by contact profilometers, through linear surface scans, to analyze their form, figure and finish errors. One must ensure that the surface characterization procedure does not add to the resident profile errors generated during aspheric surface fabrication. This presentation examines the errors introduced after surface generation and during profilometry of aspheric profiles. The effort is to identify sources of error and to optimize the metrology process. Sources of error during profilometry may include profilometer settings, work-piece placement on the profilometer stage, selection of the zenith/nadir points of aspheric profiles, metrology protocols, clear-aperture diameter analysis, computational limitations of the profiler, software issues, etc. At OPTICA, a PGI 1200 FTS contact profilometer (Taylor-Hobson make) is used for this study. Precision optics of various profiles are studied, with due attention to possible sources of error during characterization and with a multi-directional scan approach for uniformity and repeatability of error estimation. This study provides insight into aspheric surface characterization and helps in establishing an optimal aspheric surface production methodology.
Zeng, Shanshan; Wang, Lu; Zhang, Lei; Qu, Haibin; Gong, Xingchu
2013-06-01
An activity-based approach to optimize the ultrasonic-assisted extraction of antioxidants from Pericarpium Citri Reticulatae (Chenpi in Chinese) was developed. Response surface optimization based on a quantitative composition-activity relationship model showed the relationships among product chemical composition, extract antioxidant activity, and extraction process parameters. Three parameters of ultrasonic-assisted extraction, including the ethanol/water ratio, Chenpi amount, and alkaline amount, were investigated to give optimum extraction conditions for antioxidants of Chenpi: ethanol/water 70:30 v/v, Chenpi amount of 10 g, and alkaline amount of 28 mg. The experimental antioxidant yield under the optimum conditions was found to be 196.5 mg/g Chenpi, and the antioxidant activity was 2023.8 μmol Trolox equivalents/g of the Chenpi powder. The results agreed well with the second-order polynomial regression model. The presented approach promises great application potential in both the food and pharmaceutical industries. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Auwal, Shehu Muhammad; Zarei, Mohammad; Abdul-Hamid, Azizah; Saari, Nazamid
2017-01-01
The stone fish is an under-utilized sea cucumber with many nutritional and ethno-medicinal values. This study aimed to establish the conditions for its optimum hydrolysis with bromelain to generate angiotensin I-converting enzyme (ACE)-inhibitory hydrolysates. Response surface methodology (RSM) based on a central composite design was used to model and optimize the degree of hydrolysis (DH) and ACE-inhibitory activity. Process conditions including pH (4–7), temperature (40–70 °C), enzyme/substrate (E/S) ratio (0.5%–2%) and time (30–360 min) were used. A pH of 7.0, temperature of 40 °C, E/S ratio of 2% and time of 240 min were determined using a response surface model as the optimum levels to obtain the maximum ACE-inhibitory activity of 84.26% at 44.59% degree of hydrolysis. Hence, RSM can serve as an effective approach in the design of experiments to improve the antihypertensive effect of stone fish hydrolysates, which can thus be used as a value-added ingredient for various applications in the functional foods industries. PMID:28362352
Calculation of a solid/liquid surface tension: A methodological study
NASA Astrophysics Data System (ADS)
Dreher, T.; Lemarchand, C.; Soulard, L.; Bourasseau, E.; Malfreyt, P.; Pineau, N.
2018-01-01
The surface tension of a model solid/liquid interface, consisting of a graphene sheet surrounded by liquid methane, has been computed using molecular dynamics in the Kirkwood-Buff formalism. We show that, contrary to the fluid/fluid case, the solid/liquid case can lead to different structurings of the first fluid layer, leading to significantly different values of the surface tension. We therefore present a statistical approach that consists in running a series of molecular simulations of similar systems with different initial conditions, leading to a distribution of surface tensions from which an average value and uncertainty can be extracted. Our results suggest that these distributions converge as the system size increases. In addition, we show that the surface tension is not particularly sensitive to the choice of the potential energy cutoff and that long-range corrections can be neglected, contrary to what we observed for liquid/vapour interfaces.
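For reference, the mechanical (Kirkwood-Buff) route to the surface tension of a planar interface is commonly written as the integral of the pressure anisotropy across the interface,

\[
\gamma = \int_{z_1}^{z_2} \big[\, p_N(z) - p_T(z) \,\big]\, \mathrm{d}z ,
\qquad
p_N = p_{zz}, \quad p_T = \tfrac{1}{2}\big(p_{xx} + p_{yy}\big),
\]

where z is the coordinate normal to the interface and the pressure-tensor components are evaluated from the molecular dynamics trajectory; this is the general textbook form, not the specific implementation details of the paper.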
Strategy for determination of LOD and LOQ values--some basic aspects.
Uhrovčík, Jozef
2014-02-01
The paper is devoted to the evaluation of limit of detection (LOD) and limit of quantification (LOQ) values in the concentration domain using four different approaches, namely the 3σ and 10σ approaches, the ULA2 approach, the PBA approach and the MDL approach. Brief theoretical analyses of all of the above-mentioned approaches are given, together with directions for their practical use. Calculations and correct calibration design are exemplified using electrothermal atomic absorption spectrometry for the determination of lead in a drinking water sample. These validation parameters reached 1.6 μg L(-1) (LOD) and 5.4 μg L(-1) (LOQ) using the 3σ and 10σ approaches. To obtain relevant values of analyte concentration, the influence of calibration design and measurement methodology was examined. The preferred technique proved to be preconcentration of the analyte on the surface of the graphite cuvette (boost cycle). © 2013 Elsevier B.V. All rights reserved.
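The 3σ/10σ calculation referred to above is simple enough to show directly. The sketch below uses invented calibration and blank data: the calibration slope comes from a least-squares fit, the blank standard deviation plays the role of σ, and LOD and LOQ follow as 3σ/slope and 10σ/slope.

    import numpy as np

    # Hypothetical calibration: lead standards (ug/L) vs. absorbance.
    conc = np.array([0.0, 2.0, 5.0, 10.0, 20.0, 40.0])
    absorb = np.array([0.002, 0.021, 0.050, 0.101, 0.199, 0.402])

    # Ten replicate measurements of the blank (absorbance units).
    blank = np.array([0.0021, 0.0018, 0.0025, 0.0019, 0.0022,
                      0.0017, 0.0023, 0.0020, 0.0024, 0.0019])

    slope, intercept = np.polyfit(conc, absorb, 1)   # least-squares calibration line
    sigma = blank.std(ddof=1)                        # standard deviation of the blank

    lod = 3 * sigma / slope    # 3-sigma limit of detection, concentration units
    loq = 10 * sigma / slope   # 10-sigma limit of quantification

    print(f"slope = {slope:.4f} AU per ug/L")
    print(f"LOD = {lod:.2f} ug/L, LOQ = {loq:.2f} ug/L")

The other approaches discussed in the paper (ULA2, PBA, MDL) replace the blank standard deviation or the multiplier with different statistical estimates, but follow the same pattern of dividing a noise measure by the calibration sensitivity.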
Rine, J.M.; Berg, R.C.; Shafer, J.M.; Covington, E.R.; Reed, J.K.; Bennett, C.B.; Trudnak, J.E.
1998-01-01
A methodology was developed to evaluate and map the contamination potential or aquifer sensitivity of the upper groundwater flow system of a portion of the General Separations Area (GSA) at the Department of Energy's Savannah River Site (SRS) in South Carolina. A Geographic Information System (GIS) was used to integrate diverse subsurface geologic data, soils data, and hydrology utilizing a stack-unit mapping approach to construct mapping layers. This is the first time that such an approach has been used to delineate the hydrogeology of a coastal plain environment. Unit surface elevation maps were constructed for the tops of six Tertiary units derived from over 200 boring logs. Thickness or isopach maps were created for five hydrogeologic units by differencing top and basal surface elevations. The geologic stack-unit map was created by stacking the five isopach maps and adding codes for each stack-unit polygon. Stacked-units were rated according to their hydrogeologic properties and ranked using a logarithmic approach (utility theory) to establish a contamination potential index. Colors were assigned to help display relative importance of stacked-units in preventing or promoting transport of contaminants. The sensitivity assessment included the effects of surface soils on contaminants which are particularly important for evaluating potential effects from surface spills. Hydrogeologic/hydrologic factors did not exhibit sufficient spatial variation to warrant incorporation into contamination potential assessment. Development of this contamination potential mapping system provides a useful tool for site planners, environmental scientists, and regulatory agencies.
Connor, Thomas H; Smith, Jerome P
2016-09-01
At the present time, the method of choice to determine surface contamination of the workplace with antineoplastic and other hazardous drugs is surface wipe sampling and subsequent sample analysis with a variety of analytical techniques. The purpose of this article is to review current methodology for determining the level of surface contamination with hazardous drugs in healthcare settings and to discuss recent advances in this area. In addition, it provides some guidance for conducting surface wipe sampling and sample analysis for these drugs in healthcare settings. Published studies on the use of wipe sampling to measure hazardous drugs on surfaces in healthcare settings were reviewed. These studies include the use of well-documented chromatographic techniques for sample analysis in addition to newly evolving technology that provides rapid analysis of specific antineoplastic drugs. Methodology for the analysis of surface wipe samples for hazardous drugs is reviewed, including the purposes, technical factors, sampling strategy, materials required, and limitations. The use of lateral flow immunoassay (LFIA) and fluorescence covalent microbead immunosorbent assay (FCMIA) for surface wipe sample evaluation is also discussed. Current recommendations are that all healthcare settings where antineoplastic and other hazardous drugs are handled include surface wipe sampling as part of a comprehensive hazardous drug-safe handling program. Surface wipe sampling may be used as a method to characterize potential occupational dermal exposure risk and to evaluate the effectiveness of implemented controls and the overall safety program. New technology, although currently limited in scope, may make wipe sampling for hazardous drugs more routine, less costly, and provide a shorter response time than classical analytical techniques now in use.
NASA Astrophysics Data System (ADS)
de Lara-Castells, María Pilar; Aguirre, Néstor F.; Stoll, Hermann; Mitrushchenkov, Alexander O.; Mateo, David; Pi, Martí
2015-04-01
An ab-initio-based methodological scheme for He-surface interactions and zero-temperature time-dependent density functional theory for superfluid 4He droplets motion are combined to follow the short-time collision dynamics of the Au@4He300 system with the TiO2(110) surface. This composite approach demonstrates the 4He droplet-assisted sticking of the metal species to the surface at low landing energy (below 0.15 eV/atom), thus providing the first theoretical evidence of the experimentally observed 4He droplet-mediated soft-landing deposition of metal nanoparticles on solid surfaces [Mozhayskiy et al., J. Chem. Phys. 127, 094701 (2007) and Loginov et al., J. Phys. Chem. A 115, 7199 (2011)].
Modeling of a ring Rosen-type piezoelectric transformer by Hamilton's principle.
Nadal, Clément; Pigache, Francois; Erhart, Jiří
2015-04-01
This paper deals with the analytical modeling of a ring Rosen-type piezoelectric transformer. The developed model is based on a Hamiltonian approach, enabling the main parameters to be obtained and the performance to be evaluated for the first radial vibratory modes. The methodology is detailed, and the final results, namely the input admittance and the electric potential distribution on the surface of the secondary part, are compared with numerical and experimental results for discussion and validation.
NASA Astrophysics Data System (ADS)
Herman, K.; Mircescu, N. E.; Szabo, L.; Leopold, L. F.; Chiş, V.; Leopold, N.
2013-05-01
An improved approach for surface-enhanced Raman scattering (SERS) detection of mixture constituents after thin layer chromatography (TLC) separation is presented. A SERS active silver substrate was prepared under open air conditions, directly on the thin silica film by photo-reduction of silver nitrate, allowing the detection of binary mixtures of cresyl violet, bixine, crystal violet, and Cu(II) complex of 4-(2-pyridylazo)resorcinol. The recorded SERS spectrum provides a unique spectral fingerprint for each molecule; therefore the use of analyte standards is avoided, thus rendering the presented procedure advantageous compared to the conventional detection methodology in TLC.
Preparation of water-soluble magnetic nanocrystals using aryl diazonium salt chemistry.
Griffete, Nébéwia; Herbst, Frédéric; Pinson, Jean; Ammar, Souad; Mangeney, Claire
2011-02-16
A novel and facile methodology for the in situ surface functionalization of Fe(3)O(4) nanoparticles is proposed, based on the use of aryl diazonium salt chemistry. The grafting reaction involves the formation of diazoates in a basic medium. These species are unstable and dediazonize along a homolytic pathway to give aryl radicals, which further react with the Fe(3)O(4) NPs during their formation and stop their growth. The advantages of the present approach lie not only in the simplicity, rapidity, and efficiency of the procedure but also in the formation of strong Fe(3)O(4)-aryl surface bonds, which are highly suitable for further applications.
Bottiglione, F; Carbone, G
2015-01-14
The apparent contact angle of large 2D drops with randomly rough self-affine profiles is numerically investigated. The numerical approach is based upon the assumption of a large separation of length scales, i.e. it is assumed that the roughness length scales are much smaller than the drop size, thus making it possible to treat the problem through a mean-field-like approach. The apparent contact angle at equilibrium is calculated in all wetting regimes from full wetting (Wenzel state) to partial wetting (Cassie state). It was found that for very large values of the Wenzel roughness parameter (r_W > -1/cos θ_Y, where θ_Y is the Young contact angle), the interface approaches the perfect non-wetting condition and the apparent contact angle is almost equal to 180°. The results are compared with the case of roughness on one single scale (sinusoidal surface), and it is found that, given the same value of the Wenzel roughness parameter r_W, the apparent contact angle is much larger for the case of a randomly rough surface, proving that the multi-scale character of randomly rough surfaces is a key factor in enhancing superhydrophobicity. Moreover, it is shown that for millimetre-sized drops, the actual drop pressure at static equilibrium weakly affects the wetting regime, which instead seems to be dominated by the roughness parameter. For this reason a methodology to estimate the apparent contact angle is proposed, which relies only upon the micro-scale properties of the rough surface.
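For reference, a small Python sketch of the single-scale Wenzel relation behind the r_W parameter (illustrative values only; the paper's multi-scale, randomly rough case goes beyond this relation):

import numpy as np

def wenzel_apparent_angle(theta_y_deg, r_w):
    # Wenzel relation: cos(theta_app) = r_w * cos(theta_Y), clipped to the physical range.
    cos_app = np.clip(r_w * np.cos(np.radians(theta_y_deg)), -1.0, 1.0)
    return np.degrees(np.arccos(cos_app))

theta_y = 110.0                                  # hypothetical Young contact angle
r_limit = -1.0 / np.cos(np.radians(theta_y))     # threshold discussed in the abstract
print(wenzel_apparent_angle(theta_y, r_limit))   # approaches 180 degrees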
Dong, Jia; Lübberstedt, Marc; Urbaniak, Thomas; Nüssler, Andreas K.N.; Knobeloch, Daniel; Gerlach, Jörg C.; Zeilinger, Katrin
2008-01-01
Optimization of cell culture media based on statistical experimental design methodology is a widely used approach for improving cultivation conditions. We applied this methodology to refine the composition of an established culture medium for growth of a human hepatoma cell line, C3A. A selection of growth factors and nutrient supplements were systematically screened according to standard design of experiments (DoE) procedures. The results of the screening indicated that the medium additives hepatocyte growth factor, oncostatin M, and fibroblast growth factor 4 significantly influenced the metabolic activities of the C3A cell line. Surface response methodology revealed that the optimum levels for these factors were 30 ng/ml for hepatocyte growth factor and 35 ng/ml for oncostatin M. Additional experiments on primary human hepatocyte cultures showed high variance in metabolic activities between cells from different individuals, making determination of optimal levels of factors more difficult. Still, it was possible to conclude that hepatocyte growth factor, epidermal growth factor, and oncostatin M had decisive effects on the metabolic functions of primary human hepatocytes. PMID:19003182
Dong, Jia; Mandenius, Carl-Fredrik; Lübberstedt, Marc; Urbaniak, Thomas; Nüssler, Andreas K N; Knobeloch, Daniel; Gerlach, Jörg C; Zeilinger, Katrin
2008-07-01
Optimization of cell culture media based on statistical experimental design methodology is a widely used approach for improving cultivation conditions. We applied this methodology to refine the composition of an established culture medium for growth of a human hepatoma cell line, C3A. A selection of growth factors and nutrient supplements were systematically screened according to standard design of experiments (DoE) procedures. The results of the screening indicated that the medium additives hepatocyte growth factor, oncostatin M, and fibroblast growth factor 4 significantly influenced the metabolic activities of the C3A cell line. Surface response methodology revealed that the optimum levels for these factors were 30 ng/ml for hepatocyte growth factor and 35 ng/ml for oncostatin M. Additional experiments on primary human hepatocyte cultures showed high variance in metabolic activities between cells from different individuals, making determination of optimal levels of factors more difficult. Still, it was possible to conclude that hepatocyte growth factor, epidermal growth factor, and oncostatin M had decisive effects on the metabolic functions of primary human hepatocytes.
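A minimal Python sketch of the response-surface step only, fitting a second-order model and locating its stationary point; the factor levels, responses, and resulting optimum are synthetic, not the values reported above:

import numpy as np

# Synthetic two-factor data (e.g. two growth-factor levels, ng/ml) vs. a response,
# generated from a quadratic with an interior optimum near (30, 35).
x1g, x2g = np.meshgrid([10.0, 30.0, 50.0], [15.0, 35.0, 55.0])
x1, x2 = x1g.ravel(), x2g.ravel()
y = 1.0 - 4e-4 * (x1 - 30) ** 2 - 3e-4 * (x2 - 35) ** 2 + 5e-5 * (x1 - 30) * (x2 - 35)

# Least-squares fit of the full second-order response surface model.
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
b0, b1, b2, b12, b11, b22 = np.linalg.lstsq(A, y, rcond=None)[0]

# The stationary point of the fitted surface suggests candidate optimum factor levels.
H = np.array([[2 * b11, b12], [b12, 2 * b22]])
optimum = np.linalg.solve(H, -np.array([b1, b2]))
print(optimum)    # approximately [30, 35]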
Analysis of rainfall-induced slope instability using a field of local factor of safety
Lu, Ning; Şener-Kaya, Başak; Wayllace, Alexandra; Godt, Jonathan W.
2012-01-01
Slope-stability analyses are mostly conducted by identifying or assuming a potential failure surface and assessing the factor of safety (FS) of that surface. This approach of assigning a single FS to a potentially unstable slope provides little insight on where the failure initiates or the ultimate geometry and location of a landslide rupture surface. We describe a method to quantify a scalar field of FS based on the concept of the Coulomb stress and the shift in the state of stress toward failure that results from rainfall infiltration. The FS at each point within a hillslope is called the local factor of safety (LFS) and is defined as the ratio of the Coulomb stress at the current state of stress to the Coulomb stress of the potential failure state under the Mohr-Coulomb criterion. Comparative assessment with limit-equilibrium and hybrid finite element limit-equilibrium methods shows that the proposed LFS is consistent with these approaches and yields additional insight into the geometry and location of the potential failure surface and how instability may initiate and evolve with changes in pore water conditions. Quantitative assessments applying the new LFS field method to slopes under infiltration conditions demonstrate that the LFS has the potential to overcome several major limitations in the classical FS methodologies, such as the assumed shape of the failure surface and the inherent underestimation of slope instability. Comparison with infinite-slope methods, including a recent extension to variably saturated conditions, shows further enhancement in assessing shallow landslide occurrence using the LFS methodology. Although we use only a linear elastic solution for the state of stress with no post-failure analysis, which would require more sophisticated elastoplastic or other theories, the LFS provides a new means to quantify the potential instability zones in hillslopes under variably saturated conditions using stress-field based methods.
Agüero, A; Pinedo, P; Simón, I; Cancio, D; Moraleda, M; Trueba, C; Pérez-Sánchez, D
2008-09-15
A methodological approach which includes conceptual developments, methodological aspects and software tools has been developed in the Spanish context, based on the BIOMASS "Reference Biospheres Methodology". The biosphere assessments have to be undertaken with the aim of demonstrating compliance with principles and regulations established to limit the possible radiological impact of radioactive waste disposals on human health and on the environment, and to ensure that future generations will not be exposed to higher radiation levels than those that would be acceptable today. The biosphere in the context of high-level waste disposal is defined as the collection of various radionuclide transfer pathways that may result in releases into the surface environment, transport within and between the biosphere receptors, exposure of humans and biota, and the doses/risks associated with such exposures. The assessments need to take into account the complexity of the biosphere, the nature of the radionuclides released and the long timescales considered. It is also necessary to make assumptions related to the habits and lifestyle of the exposed population, human activities in the long term and possible modifications of the biosphere. A summary of the Spanish methodological approach for biosphere assessment is presented here, as well as its application in a Spanish generic case study. A reference scenario has been developed based on current conditions at a site located in Central-West Spain, to indicate the potential impact to the actual population. In addition, environmental change has been considered qualitatively through the use of interaction matrices and transition diagrams. Unit source terms of (36)Cl, (79)Se, (99)Tc, (129)I, (135)Cs, (226)Ra, (231)Pa, (238)U, (237)Np and (239)Pu have been taken. Two exposure groups of infants and adults have been chosen for dose calculations. Results are presented and their robustness is evaluated through the use of uncertainty and sensitivity analyses.
Tiruta-Barna, Ligia; Fantozzi-Merle, Catherine; de Brauer, Christine; Barna, Radu
2006-11-16
The aim of this paper is the investigation of the leaching behaviour of different porous materials containing organic pollutants (PAH: naphthalene and phenanthrene). The assessment methodology of long term leaching behaviour of inorganic materials was extended to cement solidified organic pollutants. Based on a scenario-approach considering environmental factors, matrix and pollutants specificities, the applied methodology is composed of adapted equilibrium and dynamic leaching tests. The contributions of different physical and chemical mechanisms were identified and the leaching behaviour was modelled. The physical parameters of the analysed reference and polluted materials are similar. A difference in the pore size distribution appears for higher naphthalene content. The solubility of the PAH contained in the material is affected by the ionic strength and by the presence of a co-solvent; the solution pH does not influence PAH solubility. The solubility of the major mineral species is not influenced by the presence of the two PAH nor by the presence of the methanol as co-solvent in the range of the tested material compositions. In the case of the leaching of a monolith material the main transport mechanism is the diffusion in the porous system. For both mineral and organic species we observed at least two dynamic domains. At the beginning of the leaching process the released flux is due to the surface dissolution and to the diffusion of the main quantity dissolved in the initial pore solution. The second period is governed by a stationary regime between dissolution in pore water and diffusion. The model, coupling transport and chemical phenomena in the pore solution, at the monolith surface and in the leachate simulates satisfactory the release for both mineral and organic species.
2012-01-01
Background Response surface methodology by Box–Behnken design employing the multivariate approach enables substantial improvement in method development using fewer experiments, without wasting large volumes of organic solvents, which would otherwise lead to high analysis costs. This methodology has not been employed for development of a method for analysis of atorvastatin calcium (ATR-Ca). Results The present research study describes the use of this methodology in the optimization and validation of a new microwell-based UV-visible spectrophotometric method for the determination of ATR-Ca in its tablets. By the use of quadratic regression analysis, equations were developed to describe the behavior of the response as simultaneous functions of the selected independent variables. Accordingly, the optimum conditions were determined, which included the concentration of 2,3-dichloro-5,6-dicyano-1,4-benzoquinone (DDQ), the time of reaction and the temperature. The absorbance of the colored CT complex was measured at 460 nm by a microwell-plate absorbance reader. The method was validated, in accordance with ICH guidelines, for accuracy, precision, selectivity and linearity (r² = 0.9993) over the concentration range of 20–200 μg/ml. The assay was successfully applied to the analysis of ATR-Ca in its pharmaceutical dosage forms with good accuracy and precision. Conclusion The assay described herein has great practical value in the routine analysis of ATR-Ca in quality control laboratories, as it has high throughput, consumes a minimum volume of organic solvent (thus reducing the exposure of analysts to the toxic effects of organic solvents, an environmentally friendly "green" approach) and reduces the analysis cost by 50-fold. PMID:23146143
Methodology for Outdoor Water Savings Model and Spreadsheet Tool for U.S. and Selected States
DOE Office of Scientific and Technical Information (OSTI.GOV)
Williams, Alison A.; Chen, Yuting; Dunham, Camilla
Green lawns and landscaping are archetypical of the populated American landscape, and typically require irrigation, which corresponds to a significant fraction of residential, commercial, and institutional water use. In North American cities, the estimated portion of residential water used for outdoor purposes ranges from 22-38% in cooler climates up to 59-67% in dry and hot environments, while turfgrass coverage within the United States spans 11.1-20.2 million hectares (Milesi et al. 2009). One national estimate uses satellite and aerial photography data to develop a relationship between impervious surface and lawn surface area, yielding a conservative estimate of 16.4 (± 3.6) million hectares of lawn surface area in the United States—an area three times larger than that devoted to any irrigated crop (Milesi et al. 2005). One approach that holds promise for cutting unnecessary outdoor water use is the increased deployment of “smart” irrigation controllers to increase the water efficiency of irrigation systems. This report describes the methodology and inputs employed in a mathematical model that quantifies the effects of the U.S. Environmental Protection Agency’s WaterSense labeling program for one such type of controller, weather-based irrigation controllers (WBIC). This model builds off that described in “Methodology for National Water Savings Model and Spreadsheet Tool–Outdoor Water Use” and uses a two-tiered approach to quantify outdoor water savings attributable to the WaterSense program for WBIC, as well as net present value (NPV) of that savings. While the first iteration of the model assessed national impacts using averaged national values, this version begins by evaluating impacts in three key large states that make up a sizable portion of the irrigation market: California, Florida, and Texas. These states are considered to be the principal market of “smart” irrigation controllers that may result in the bulk of national savings. Modeled water savings and net present value for these three states should be more accurate and representative than the averaged national values given state-specific inputs such as lot size, water price, and housing stock. To complete the picture of national impacts, the remaining WBIC shipments not assigned to these three states are assessed using the original methodology based on the averaged national values.
NASA Astrophysics Data System (ADS)
Joubert-Doriol, Loïc; Izmaylov, Artur F.
2018-03-01
A new methodology of simulating nonadiabatic dynamics using frozen-width Gaussian wavepackets within the moving crude adiabatic representation with the on-the-fly evaluation of electronic structure is presented. The main feature of the new approach is the elimination of any global or local model representation of electronic potential energy surfaces; instead, the electron-nuclear interaction is treated explicitly using the Gaussian integration. As a result, the new scheme does not introduce any uncontrolled approximations. The employed variational principle ensures the energy conservation and leaves the number of electronic and nuclear basis functions as the only parameter determining the accuracy. To assess the performance of the approach, a model with two electronic and two nuclear spatial degrees of freedom containing conical intersections between potential energy surfaces has been considered. Dynamical features associated with nonadiabatic transitions and nontrivial geometric (or Berry) phases were successfully reproduced within a limited basis expansion.
2016-01-01
A novel method of extracting heart rate and oxygen saturation from a video-based biosignal is described. The method comprises a novel modular continuous wavelet transform approach which includes: performing the transform, undertaking running wavelet archetyping to enhance the pulse information, extraction of the pulse ridge time–frequency information [and thus a heart rate (HRvid) signal], creation of a wavelet ratio surface, projection of the pulse ridge onto the ratio surface to determine the ratio of ratios from which a saturation trending signal is derived, and calibrating this signal to provide an absolute saturation signal (SvidO2). The method is illustrated through its application to a video photoplethysmogram acquired during a porcine model of acute desaturation. The modular continuous wavelet transform-based approach is advocated by the author as a powerful methodology to deal with noisy, non-stationary biosignals in general. PMID:27382479
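For orientation, a small Python sketch of the conventional ratio-of-ratios to saturation calibration step only; the amplitudes and calibration coefficients are hypothetical, and the wavelet-specific processing described above is not reproduced:

# AC/DC pulse amplitudes of two channels (e.g. red and infrared, or two video
# colour channels) extracted along the pulse ridge; values are hypothetical.
ac_1, dc_1 = 0.012, 1.00
ac_2, dc_2 = 0.020, 1.05

ratio_of_ratios = (ac_1 / dc_1) / (ac_2 / dc_2)

# Empirical linear calibration mapping the ratio to saturation (coefficients illustrative).
A, B = 110.0, 25.0
spo2 = A - B * ratio_of_ratios
print(round(spo2, 1))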
Benson, Alex J; Eys, Mark A; Irving, P Gregory
2016-04-01
Many athletes experience a discrepancy between the roles they expect to fulfill and the roles they eventually occupy. Drawing from met expectations theory, we applied response surface methodology to examine how role expectations, in relation to role experiences, influence perceptions of group cohesion among Canadian Interuniversity Sport athletes (N = 153). On the basis of data from two time points, as athletes approached and exceeded their role contribution expectations, they reported higher perceptions of task cohesion. Furthermore, as athletes approached and exceeded their social involvement expectations, they reported higher perceptions of social cohesion. These response surface patterns, pertaining to task and social cohesion, were driven by the positive influence of role experiences. On the basis of the interplay between athletes' role experiences and their perception of the group environment, efforts to improve team dynamics may benefit from focusing on improving the quality of role experiences, in conjunction with developing realistic role expectations.
NASA Astrophysics Data System (ADS)
Alves, Sofia A.; Patel, Sweetu B.; Sukotjo, Cortino; Mathew, Mathew T.; Filho, Paulo N.; Celis, Jean-Pierre; Rocha, Luís A.; Shokuhfar, Tolou
2017-03-01
The modification of surface features such as nano-morphology/topography and chemistry has been employed in the attempt to design titanium oxide surfaces able to overcome current dental implant failures. The main goal of this study is the synthesis of bone-like structured titanium dioxide (TiO2) nanotubes enriched with calcium (Ca) and phosphorus (P), able to enhance osteoblastic cell functions and, simultaneously, display an improved corrosion behavior. To achieve this goal, TiO2 nanotubes were synthesized and doped with Ca and P by means of a novel methodology which relied, firstly, on the synthesis of TiO2 nanotubes by anodization of titanium in an organic electrolyte, followed by reverse polarization and/or anodization in an aqueous electrolyte. Results show that hydrophilic bone-like structured TiO2 nanotubes were successfully synthesized, presenting a highly ordered nano-morphology characterized by non-uniform diameters. The chemical analysis of these nanotubes confirmed the presence of CaCO3, Ca3(PO4)2, CaHPO4 and CaO compounds. The nanotube surfaces submitted to reverse polarization presented improved cell adhesion and proliferation compared to smooth titanium. Furthermore, these surfaces displayed a significantly lower passive current in artificial saliva, and thus have the potential to minimize their bio-degradation through corrosion processes. This study addresses a very simple and promising multidisciplinary approach, bringing new insights for the development of novel methodologies to improve the outcome of osseointegrated implants.
Pulsipher, Abigail; Dutta, Debjit; Luo, Wei; Yousaf, Muhammad N
2014-09-01
We report a strategy to rewire cell surfaces for the dynamic control of ligand composition on cell membranes and the modulation of cell-cell interactions to generate three-dimensional (3D) tissue structures, applied to stem-cell differentiation, cell-surface tailoring, and tissue engineering. We tailored cell surfaces with bioorthogonal chemical groups on the basis of a liposome-fusion and -delivery method to create dynamic, electroactive, and switchable cell-tissue assemblies through chemistry involving chemoselective conjugation and release. Each step of cell-surface modification (activation, conjugation, release, and regeneration) can be monitored and modulated by noninvasive, label-free analytical techniques. We demonstrate the utility of this methodology by the conjugation and release of small molecules to and from cell surfaces and by the generation of 3D coculture spheroids and multilayered cell tissues that can be programmed to undergo assembly and disassembly on demand. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ghosh, Batu; Shirahata, Naoto
2014-01-01
This review describes a series of representative synthesis processes developed in the last two decades to prepare silicon quantum dots (QDs). The methods include both top-down and bottom-up approaches, and their methodological advantages and disadvantages are presented. Considerable efforts in the surface functionalization of QDs can be categorized into (i) two-step processes and (ii) in situ surface derivatization. Photophysical properties of QDs are summarized to highlight the continuous tuning of photoluminescence color from the near-UV through the visible to the near-IR range. The emission features strongly depend on the silicon nanostructures, including QD surface configurations. Possible mechanisms of photoluminescence are summarized to ascertain the future challenges toward industrial use of silicon-based light emitters. PMID:27877634
Yin, Jingjing; Nakas, Christos T; Tian, Lili; Reiser, Benjamin
2018-03-01
This article explores both existing and new methods for the construction of confidence intervals for differences of indices of diagnostic accuracy of competing pairs of biomarkers in three-class classification problems and fills the methodological gaps for both parametric and non-parametric approaches in the receiver operating characteristic surface framework. The most widely used such indices are the volume under the receiver operating characteristic surface and the generalized Youden index. We describe implementation of all methods and offer insight regarding the appropriateness of their use through a large simulation study with different distributional and sample size scenarios. Methods are illustrated using data from the Alzheimer's Disease Neuroimaging Initiative study, where assessment of cognitive function naturally results in a three-class classification setting.
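As a hedged sketch of the volume-under-the-ROC-surface index only (simulated marker values; this is the basic empirical estimator for continuous data, not the confidence-interval constructions compared in the article):

import numpy as np

def empirical_vus(x1, x2, x3):
    # Fraction of triples, one marker value per class, that are correctly ordered x1 < x2 < x3.
    x1, x2, x3 = map(np.asarray, (x1, x2, x3))
    ordered = (x1[:, None, None] < x2[None, :, None]) & (x2[None, :, None] < x3[None, None, :])
    return ordered.mean()

rng = np.random.default_rng(0)
print(empirical_vus(rng.normal(0, 1, 50), rng.normal(1, 1, 60), rng.normal(2, 1, 55)))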
NASA Astrophysics Data System (ADS)
Sato, Shintaro; Takahashi, Masayuki; Ohnishi, Naofumi
2017-05-01
An approach for electrohydrodynamic (EHD) force production is proposed with a focus on a charge cycle on a dielectric surface. The cycle, consisting of positive-charging and neutralizing strokes, is completely different from the conventional methodology, which involves a negative-charging stroke, in that the dielectric surface charge is constantly positive. The two-stroke charge cycle is realized by applying a DC voltage combined with repetitive pulses. Simulation results indicate that the negative pulse eliminates the surface charge accumulated during constant voltage phase, resulting in repetitive EHD force generation. The time-averaged EHD force increases almost linearly with increasing repetitive pulse frequency and becomes one order of magnitude larger than that driven by the sinusoidal voltage, which has the same peak-to-peak voltage.
Towards an Airframe Noise Prediction Methodology: Survey of Current Approaches
NASA Technical Reports Server (NTRS)
Farassat, Fereidoun; Casper, Jay H.
2006-01-01
In this paper, we present a critical survey of the current airframe noise (AFN) prediction methodologies. Four methodologies are recognized. These are the fully analytic method, CFD combined with the acoustic analogy, the semi-empirical method and the fully numerical method. It is argued that for the immediate needs of the aircraft industry, the semi-empirical method based on a recent high-quality acoustic database is the best available method. The method based on CFD and the Ffowcs Williams-Hawkings (FW-H) equation with a penetrable data surface (FW-Hpds) has advanced considerably and much experience has been gained in its use. However, more research is needed in the near future, particularly in the area of turbulence simulation. The fully numerical method will take longer to reach maturity. Based on the current trends, it is predicted that this method will eventually develop into the method of choice. Both the turbulence simulation and propagation methods need to develop further for this method to become useful. Nonetheless, the authors propose that the method based on a combination of numerical and analytical techniques, e.g., CFD combined with the FW-H equation, should also be worked on. In this effort, current symbolic algebra software will allow more analytical approaches to be incorporated into AFN prediction methods.
An approach to accidents modeling based on compounds road environments.
Fernandes, Ana; Neves, Jose
2013-04-01
The most common approach to study the influence of certain road features on accidents has been the consideration of uniform road segments characterized by a unique feature. However, when an accident is related to the road infrastructure, its cause is usually not a single characteristic but rather a complex combination of several characteristics. The main objective of this paper is to describe a methodology developed in order to consider the road as a complete environment by using compound road environments, overcoming the limitations inherent in considering only uniform road segments. The methodology consists of: dividing a sample of roads into segments; grouping them into fairly homogeneous road environments using cluster analysis; and identifying the influence of skid resistance and texture depth on road accidents in each environment by using generalized linear models. The application of this methodology is demonstrated for eight roads. Based on real data from accidents and road characteristics, three compound road environments were established where the pavement surface properties significantly influence the occurrence of accidents. Results have clearly shown that road environments where braking maneuvers are more common, or those with small radii of curvature and high speeds, require higher skid resistance and texture depth as an important contribution to accident prevention. Copyright © 2013 Elsevier Ltd. All rights reserved.
How to Select the most Relevant Roughness Parameters of a Surface: Methodology Research Strategy
NASA Astrophysics Data System (ADS)
Bobrovskij, I. N.
2018-01-01
In this paper, the foundations are laid for a new methodology that addresses the conflict between the huge number of surface texture parameters introduced by new standards and the much smaller set of parameters that can realistically be measured in practice, a restriction driven by the need to reduce measurement complexity. At the moment, there is no single accepted assessment of the importance of individual parameters. The application of the presented methodology to the surfaces of aerospace cluster components creates the necessary foundation for a scientific estimation of surface texture parameters and provides material for investigators of the chosen technological procedure. The methods necessary for further work are selected, creating a fundamental basis for the development of the assessment of the significance of microgeometry parameters as a scientific direction.
Dubey, D P; Tiwari, R N; Dwivedi, Umesh
2006-04-01
Pollution susceptibility of the groundwater of Rewa town, situated on karstified Bhander limestones of the Bhander group, is discussed in this paper. The pollution potential of selected localities in the town has been determined using the DRASTIC index methodology. The pollution potential for these localities varied between 162 and 217. Shallow aquifers in karstified limestones having direct access to surface water were found to be more susceptible to pollution. Accordingly, remedial measures were suggested for minimising pollution.
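For illustration, the DRASTIC index is a weighted sum of seven factor ratings; the weights in the Python sketch below are the standard DRASTIC weights, while the ratings are hypothetical rather than values determined for Rewa town:

# Standard DRASTIC weights: Depth to water, net Recharge, Aquifer media, Soil media,
# Topography, Impact of the vadose zone, hydraulic Conductivity.
weights = {"D": 5, "R": 4, "A": 3, "S": 2, "T": 1, "I": 5, "C": 3}
ratings = {"D": 9, "R": 6, "A": 8, "S": 6, "T": 10, "I": 9, "C": 4}   # hypothetical ratings (1-10)

drastic_index = sum(weights[f] * ratings[f] for f in weights)
print(drastic_index)   # a higher index means greater pollution potential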
Shazman, Shula; Elber, Gershon; Mandel-Gutfreund, Yael
2011-01-01
Protein-nucleic acid interactions play a critical role in all steps of the gene expression pathway. Nucleic acid (NA) binding proteins interact with their partners, DNA or RNA, via distinct regions on their surface that are characterized by an ensemble of chemical, physical and geometrical properties. In this study, we introduce a novel methodology based on differential geometry, commonly used in face recognition, to characterize and predict NA binding surfaces on proteins. Applying the method to experimentally solved three-dimensional structures of proteins, we successfully distinguish double-stranded DNA (dsDNA) binding proteins from single-stranded RNA (ssRNA) binding proteins with 83% accuracy. We show that the method is insensitive to conformational changes that occur upon binding and can be applicable for de novo protein-function prediction. Remarkably, when concentrating on the zinc finger motif, we distinguish successfully between RNA and DNA binding interfaces possessing the same binding motif even within the same protein, as demonstrated for the RNA polymerase transcription factor TFIIIA. In conclusion, we present a novel methodology to characterize protein surfaces, which can accurately tell apart dsDNA binding interfaces from ssRNA binding interfaces. The strength of our method in recognizing fine-tuned differences on NA binding interfaces makes it applicable to many other molecular recognition problems, with potential implications for drug design. PMID:21693557
NASA Astrophysics Data System (ADS)
Moradi, Neshat; Salem, Shiva; Salem, Amin
2018-03-01
This work highlights the effective activation of bentonite paste to produce a nano-porous powder for the removal of cationic dye from wastewater. The effects of activation parameters such as soda and moisture contents, ageing time and temperature were analyzed using response surface methodology (RSM). The significance of the independent variables and their interactions was tested by blending the obtained powders with wastewater and then evaluating the adsorption spectrophotometrically. The experiments were carried out by preparing pastes according to response surface methodology; a central composite design, the standard approach, was used to evaluate the effects and interactions of four factors on the treatment efficiency. RSM was demonstrated to be an appropriate approach for the optimization of alkali activation. The optimal conditions obtained from the desired responses were 5.0 wt% soda and 45.0 wt% moisture, with the powder activation carried out at 150 °C. In order to better understand the role of the nano-structured material in dye removal, the adsorbents were characterized through X-ray diffraction, Fourier transform infrared spectroscopy, scanning electron microscopy and Brunauer-Emmett-Teller surface area measurement. Finally, the analysis clearly demonstrates that dye removal onto the prepared adsorbent is better fitted by the Langmuir isotherm than by the other isotherm models. The low cost of the material and the facile process support further development for commercial applications.
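A minimal Python sketch of the Langmuir isotherm fit mentioned above, using illustrative equilibrium data rather than the study's measurements (SciPy assumed available):

import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, q_max, k_l):
    # Langmuir isotherm: qe = q_max * k_l * ce / (1 + k_l * ce)
    return q_max * k_l * ce / (1.0 + k_l * ce)

# Illustrative equilibrium concentrations (mg/L) and adsorbed amounts (mg/g).
ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
qe = np.array([12.0, 20.0, 30.0, 38.0, 43.0])

popt, _ = curve_fit(langmuir, ce, qe, p0=[50.0, 0.05])
print(popt)   # fitted q_max and k_l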
NASA Astrophysics Data System (ADS)
Tauro, F.; Piscopia, R.; Grimaldi, S.
2017-12-01
Image-based methodologies, such as large scale particle image velocimetry (LSPIV) and particle tracking velocimetry (PTV), have increased our ability to noninvasively conduct streamflow measurements by affording spatially distributed observations at high temporal resolution. However, progress in optical methodologies has not been paralleled by the implementation of image-based approaches in environmental monitoring practice. We attribute this fact to the sensitivity of LSPIV, by far the most frequently adopted algorithm, to visibility conditions and to the occurrence of visible surface features. In this work, we test both LSPIV and PTV on a data set of 12 videos captured in a natural stream wherein artificial floaters are homogeneously and continuously deployed. Further, we apply both algorithms to a video of a high flow event on the Tiber River, Rome, Italy. In our application, we propose a modified PTV approach that only takes into account realistic trajectories. Based on our findings, LSPIV largely underestimates surface velocities with respect to PTV in both favorable (12 videos in a natural stream) and adverse (high flow event in the Tiber River) conditions. On the other hand, PTV is in closer agreement than LSPIV with benchmark velocities in both experimental settings. In addition, the accuracy of PTV estimations can be directly related to the transit of physical objects in the field of view, thus providing tangible data for uncertainty evaluation.
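As a simplified Python sketch of the PTV principle only: the displacement of a tracked surface feature per frame, scaled to metric units (the track, frame rate, and pixel scale below are hypothetical):

import numpy as np

def trajectory_speeds(track_px, dt, metres_per_pixel):
    # Frame-to-frame speeds (m/s) from a tracked feature's pixel positions.
    track_px = np.asarray(track_px, float)          # shape (n_frames, 2)
    displacements = np.diff(track_px, axis=0)       # pixel displacement between frames
    return np.linalg.norm(displacements, axis=1) * metres_per_pixel / dt

# One floater tracked over four frames at 25 fps with a 5 mm/pixel scale.
print(trajectory_speeds([[10, 50], [14, 50], [18, 51], [22, 51]], dt=1/25, metres_per_pixel=0.005))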
Evaluation of subsidence hazard in mantled karst setting: a case study from Val d'Orléans (France)
NASA Astrophysics Data System (ADS)
Perrin, Jérôme; Cartannaz, Charles; Noury, Gildas; Vanoudheusden, Emilie
2015-04-01
Soil subsidence/collapse is a major geohazard occurring in karst region. It occurs as suffosion or dropout sinkholes developing in the soft cover. Less frequently it corresponds to a breakdown of karst void ceiling (i.e., collapse sinkhole). This hazard can cause significant engineering challenges. Therefore decision-makers require the elaboration of methodologies for reliable predictions of such hazards (e.g., karst subsidence susceptibility and hazards maps, early-warning monitoring systems). A methodological framework was developed to evaluate relevant conditioning factors favouring subsidence (Perrin et al. submitted) and then to combine these factors to produce karst subsidence susceptibility maps. This approach was applied to a mantled karst area south of Paris (Val d'Orléans). Results show the significant roles of the overburden lithology (presence/absence of low-permeability layer) and of the karst aquifer piezometric surface position within the overburden. In parallel, an experimental site has been setup to improve the understanding of key processes leading to subsidence/collapse and includes piezometers for measurements of water levels and physico-chemical parameters in both the alluvial and karst aquifers as well as surface deformation monitoring. Results should help in designing monitoring systems to anticipate occurrence of subsidence/collapse. Perrin J., Cartannaz C., Noury G., Vanoudheusden E. 2015. A multicriteria approach to karst subsidence hazard mapping supported by Weights-of-Evidence analysis. Submitted to Engineering Geology.
Gradient-Based Aerodynamic Shape Optimization Using ADI Method for Large-Scale Problems
NASA Technical Reports Server (NTRS)
Pandya, Mohagna J.; Baysal, Oktay
1997-01-01
A gradient-based shape optimization methodology, intended for practical three-dimensional aerodynamic applications, has been developed. It is based on quasi-analytical sensitivities. The flow analysis is rendered by a fully implicit, finite volume formulation of the Euler equations. The aerodynamic sensitivity equation is solved using the alternating-direction-implicit (ADI) algorithm for memory efficiency. A flexible wing geometry model, based on surface parameterization and planform schedules, is utilized. The present methodology and its components have been tested via several comparisons. Initially, the flow analysis for a wing is compared with those obtained using an unfactored, preconditioned conjugate gradient approach (PCG) and an extensively validated CFD code. Then, the sensitivities computed with the present method have been compared with those obtained using the finite-difference and the PCG approaches. Effects of grid refinement and convergence tolerance on the analysis and shape optimization have been explored. Finally, the new procedure has been demonstrated in the design of a cranked-arrow wing at Mach 2.4. Despite the expected increase in the computational time, the results indicate that shape optimization problems, which require large numbers of grid points, can be resolved with a gradient-based approach.
Modeling of nanostructured porous thermoelastic composites with surface effects
NASA Astrophysics Data System (ADS)
Nasedkin, A. V.; Nasedkina, A. A.; Kornievsky, A. S.
2017-01-01
The paper presents an integrated approach for the determination of effective properties of anisotropic porous thermoelastic materials with a nanoscale stochastic porosity structure. This approach includes the effective moduli method of composite mechanics, the simulation of representative volumes and the finite element method. In order to take into account the nanoscale sizes of pores, the Gurtin-Murdoch model of surface stresses and the highly conducting interface model are used at the borders between material and pores. The general methodology for determination of the effective properties of porous composites is demonstrated for a two-phase composite with special conditions for stress and heat flux discontinuities at the phase interfaces. The mathematical statements of the boundary value problems and the resulting formulas to determine the complete set of effective constants of two-phase composites with arbitrary anisotropy and with surface properties are described; the generalized statements are formulated and the finite element approximations are given. It is shown that the homogenization procedures for porous composites with surface effects can be considered as special cases of the corresponding procedures for two-phase composites with interphase stresses and heat fluxes if the moduli of the nanoinclusions are negligibly small. These approaches have been implemented in the finite element package ANSYS for a model of porous material with cubic crystal system for various values of surface moduli, porosity and number of pores. It has been noted that the magnitude of the area of the interphase boundaries has an influence on the effective moduli of porous materials with a nanosized structure.
Genetic Inventory Task Final Report. Volume 2
NASA Technical Reports Server (NTRS)
Venkateswaran, Kasthuri; LaDuc, Myron T.; Vaishampayan, Parag
2012-01-01
Contaminant terrestrial microbiota could profoundly impact the scientific integrity of extraterrestrial life-detection experiments. It is therefore important to know what organisms persist on spacecraft surfaces so that their presence can be eliminated or discriminated from authentic extraterrestrial biosignatures. Although there is a growing understanding of the biodiversity associated with spacecraft and cleanroom surfaces, it remains challenging to assess the risk of these microbes confounding life-detection or sample-return experiments. A key challenge is to provide a comprehensive inventory of microbes present on spacecraft surfaces. To assess the phylogenetic breadth of microorganisms on spacecraft and associated surfaces, the Genetic Inventory team used three technologies: conventional cloning techniques, PhyloChip DNA microarrays, and 454 tag-encoded pyrosequencing, together with a methodology to systematically collect, process, and archive nucleic acids. These three analysis methods yielded considerably different results: Traditional approaches provided the least comprehensive assessment of microbial diversity, while PhyloChip and pyrosequencing illuminated more diverse microbial populations. The overall results stress the importance of selecting sample collection and processing approaches based on the desired target and required level of detection. The DNA archive generated in this study can be made available to future researchers as genetic-inventory-oriented technologies further mature.
Ashengroph, Morahem; Nahvi, Iraj; Amini, Jahanshir
2013-01-01
For all industrial processes, modelling, optimisation and control are the keys to enhancing productivity and ensuring product quality. In the current study, the optimization of process parameters for improving the conversion of isoeugenol to vanillin by Psychrobacter sp. CSW4 was investigated by means of the Taguchi approach and the Box-Behnken statistical design under resting cell conditions. The Taguchi design was employed for screening the significant variables in the bioconversion medium. Sequentially, Box-Behnken design experiments under response surface methodology (RSM) were used for further optimization. Four factors (isoeugenol, NaCl, biomass and Tween 80 initial concentrations), which have significant effects on vanillin yield, were selected from ten variables by the Taguchi experimental design. With the regression coefficient analysis in the Box-Behnken design, a relationship between vanillin production and the four significant variables was obtained, and the optimum levels of the four variables were as follows: initial isoeugenol concentration 6.5 g/L, initial Tween 80 concentration 0.89 g/L, initial NaCl concentration 113.2 g/L and initial biomass concentration 6.27 g/L. Under these optimized conditions, the maximum predicted concentration of vanillin was 2.25 g/L. These optimized values of the factors were validated in a triplicate shaking-flask study, and an average vanillin concentration of 2.19 g/L, corresponding to a molar yield of 36.3%, was obtained after a 24 h bioconversion. The present work is the first to report the application of the Taguchi design and response surface methodology for optimizing the bioconversion of isoeugenol into vanillin under resting cell conditions.
An Agile Course-Delivery Approach
ERIC Educational Resources Information Center
Capellan, Mirkeya
2009-01-01
In the world of software development, agile methodologies have gained popularity thanks to their lightweight methodologies and flexible approach. Many advocates believe that agile methodologies can provide significant benefits if applied in the educational environment as a teaching method. The need for an approach that engages and motivates…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lara-Castells, María Pilar de, E-mail: Pilar.deLara.Castells@csic.es; Aguirre, Néstor F.; Stoll, Hermann
2015-04-07
An ab-initio-based methodological scheme for He-surface interactions and zero-temperature time-dependent density functional theory for superfluid 4He droplets motion are combined to follow the short-time collision dynamics of the Au@4He300 system with the TiO2(110) surface. This composite approach demonstrates the 4He droplet-assisted sticking of the metal species to the surface at low landing energy (below 0.15 eV/atom), thus providing the first theoretical evidence of the experimentally observed 4He droplet-mediated soft-landing deposition of metal nanoparticles on solid surfaces [Mozhayskiy et al., J. Chem. Phys. 127, 094701 (2007) and Loginov et al., J. Phys. Chem. A 115, 7199 (2011)].
Shannon, Robin; Glowacki, David R
2018-02-15
The chemical master equation is a powerful theoretical tool for analyzing the kinetics of complex multiwell potential energy surfaces in a wide range of different domains of chemical kinetics spanning combustion, atmospheric chemistry, gas-surface chemistry, solution phase chemistry, and biochemistry. There are two well-established methodologies for solving the chemical master equation: a stochastic "kinetic Monte Carlo" approach and a matrix-based approach. In principle, the results yielded by both approaches are identical; the decision of which approach is better suited to a particular study depends on the details of the specific system under investigation. In this Article, we present a rigorous method for accelerating stochastic approaches by several orders of magnitude, along with a method for unbiasing the accelerated results to recover the "true" value. The approach we take in this paper is inspired by the so-called "boxed molecular dynamics" (BXD) method, which has previously only been applied to accelerate rare events in molecular dynamics simulations. Here we extend BXD to design a simple algorithmic strategy for accelerating rare events in stochastic kinetic simulations. Tests on a number of systems show that the results obtained using the BXD rare event strategy are in good agreement with unbiased results. To carry out these tests, we have implemented a kinetic Monte Carlo approach in MESMER, which is a cross-platform, open-source, and freely available master equation solver.
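For orientation, a minimal Python sketch of the standard, unaccelerated stochastic (kinetic Monte Carlo) approach for a toy two-well A <-> B system; the BXD-inspired rare-event acceleration and unbiasing described in the article are not implemented here:

import numpy as np

def gillespie_two_well(k_ab, k_ba, n_a, n_b, t_end, seed=1):
    # Gillespie stochastic simulation of A <-> B interconversion.
    rng = np.random.default_rng(seed)
    t, history = 0.0, [(0.0, n_a, n_b)]
    while t < t_end:
        rates = np.array([k_ab * n_a, k_ba * n_b])   # propensities of the two steps
        total = rates.sum()
        if total == 0.0:
            break
        t += rng.exponential(1.0 / total)            # waiting time to the next event
        if rng.random() < rates[0] / total:          # pick which step fires
            n_a, n_b = n_a - 1, n_b + 1
        else:
            n_a, n_b = n_a + 1, n_b - 1
        history.append((t, n_a, n_b))
    return history

trajectory = gillespie_two_well(k_ab=0.5, k_ba=0.1, n_a=100, n_b=0, t_end=20.0)
print(trajectory[-1])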
Ratcliffe, Elizabeth; Hourd, Paul; Guijarro-Leach, Juan; Rayment, Erin; Williams, David J; Thomas, Robert J
2013-01-01
Commercial regenerative medicine will require large quantities of clinical-specification human cells. The cost and quality of manufacture is notoriously difficult to control due to highly complex processes with poorly defined tolerances. As a step to overcome this, we aimed to demonstrate the use of 'quality-by-design' tools to define the operating space for economic passage of a scalable human embryonic stem cell production method with minimal cell loss. Design of experiments response surface methodology was applied to generate empirical models to predict optimal operating conditions for a unit of manufacture of a previously developed automatable and scalable human embryonic stem cell production method. Two models were defined to predict cell yield and cell recovery rate postpassage, in terms of the predictor variables of media volume, cell seeding density, media exchange and length of passage. Predicted operating conditions for maximized productivity were successfully validated. Such 'quality-by-design' type approaches to process design and optimization will be essential to reduce the risk of product failure and patient harm, and to build regulatory confidence in cell therapy manufacturing processes.
Abu, Mary Ladidi; Nooh, Hisham Mohd; Oslan, Siti Nurbaya; Salleh, Abu Bakar
2017-11-10
Pichia guilliermondii was found capable of expressing the recombinant thermostable lipase without methanol under the control of the methanol-dependent alcohol oxidase 1 promoter (AOXp1). In this study, statistical approaches were employed for the screening and optimisation of physical conditions for T1 lipase production in P. guilliermondii. The screening of six physical conditions by Plackett-Burman design identified pH, inoculum size and incubation time as exerting significant effects on lipase production. These three conditions were further optimised using the Box-Behnken design of response surface methodology, which predicted an optimum medium comprising pH 6, 24 h incubation time and 2% inoculum size. T1 lipase activity of 2.0 U/mL was produced with a biomass of OD600 23.0. Optimisation by RSM yielded a 3-fold increase in T1 lipase production over the medium before optimisation. Therefore, this result has proven that T1 lipase can be produced at a higher yield in P. guilliermondii.
Cappato, Leandro P; Martins, Amanda M Dias; Ferreira, Elisa H R; Rosenthal, Amauri
An ascomycete fungus was isolated from the storage brine of green olives of the Arauco cultivar imported from Argentina and identified as Monascus ruber. The combined effects of different concentrations of sodium chloride (3.5-5.5%), sodium benzoate (0-0.1%), potassium sorbate (0-0.05%) and temperature (30-40°C) were investigated on the growth of M. ruber in the brine of stored table olives using a response surface methodology. A full 2⁴ factorial design with three central points was first used in order to screen for the important factors (significant and marginally significant factors), and then a face-centered central composite design was applied. Both preservatives prevented fungal spoilage, but potassium sorbate was the most efficient in controlling fungal growth. The combined use of these preservatives did not show a synergistic effect. The results showed that the use of these salts may not be sufficient to prevent fungal spoilage, and the greatest fungal growth was recorded at 30°C. Copyright © 2017 Sociedade Brasileira de Microbiologia. Published by Elsevier Editora Ltda. All rights reserved.
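A small Python sketch of generating the coded runs of a full 2⁴ factorial screening design (factor names follow the abstract; the -1/+1 coding and run order are illustrative):

from itertools import product

factors = ["NaCl", "sodium_benzoate", "potassium_sorbate", "temperature"]
runs = [dict(zip(factors, levels)) for levels in product([-1, 1], repeat=4)]
print(len(runs))   # 16 factorial runs, to which the three centre points are added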
Dubny, Sabrina; Peluso, Fabio; Masson, Ignacio; Othax, Natalia; González Castelain, José
2018-04-01
Using the USEPA methodology we estimated the probabilistic chronic risks for calves and adult cows due to pesticide exposure through oral intake of contaminated surface and ground waters in Tres Arroyos County (Argentina). Because published data on pesticide toxicity endpoints for cows are scarce, we used threshold levels based on interspecies extrapolation methods. The studied waters showed acceptable quality for cattle production since none of the pesticides were present at high-enough concentrations to potentially affect cow health. Moreover, ground waters had better quality than surface waters, with dieldrin and deltamethrin being the pesticides associated with the highest risk values in the former and the latter water compartments, respectively. Our study presents a novel use of the USEPA risk methodology proving it is useful for water quality evaluation in terms of pesticide toxicity for cattle production. This approach represents an alternative tool for water quality management in the absence of specific cattle pesticide regulatory limits. Copyright © 2018 Elsevier Ltd. All rights reserved.
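As a deterministic simplification of the probabilistic assessment only, a Python sketch of a hazard-quotient style calculation; the intake, body weight, concentration, and toxicity reference value are hypothetical, not values from the study:

def daily_dose(conc_ug_per_l, water_intake_l_per_day, body_weight_kg):
    # Chronic daily dose from drinking water, in ug per kg body weight per day.
    return conc_ug_per_l * water_intake_l_per_day / body_weight_kg

def hazard_quotient(dose, toxicity_reference_value):
    # HQ > 1 flags a potential for adverse effects.
    return dose / toxicity_reference_value

dose = daily_dose(conc_ug_per_l=0.05, water_intake_l_per_day=60.0, body_weight_kg=500.0)
print(hazard_quotient(dose, toxicity_reference_value=0.001))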
NASA Astrophysics Data System (ADS)
Raudah; Zulkifli
2018-03-01
The present research focuses on establishing the optimum conditions for converting coffee husk into a densified biomass fuel using starch as a binding agent. A response surface methodology (RSM) approach using a Box-Behnken experimental design with three levels (-1, 0, and +1) was employed to obtain the optimum level for each parameter. The briquettes were produced by compressing the coffee husk-starch mixture in a piston-and-die assembly at a pressure of 2000 psi. Starch percentage, pyrolysis time, and particle size were the input parameters for the design. A bomb calorimeter was used to determine the higher heating value (HHV) of the solid fuel. The results indicated that a combination of 34.71 mesh particle size, 110.93 min pyrolysis time, and 8% starch concentration was optimal. The HHV and density of the fuel were up to 5644.66 cal g-1 and 0.7069 g cm-3, respectively. The study showed that further research should be conducted to improve the briquette density so that coffee husk can be converted into a commercial solid fuel to reduce dependence on fossil fuel.
Sharma, Praveen; Singh, Lakhvinder; Dilbaghi, Neeraj
2009-01-30
The aim of our research was to study the effects of temperature, pH and initial dye concentration on the decolorization of the diazo dye Acid Red 151 (AR 151) in simulated dye solution using the fungal isolate Aspergillus fumigatus fresenius. The central composite design matrix and response surface methodology (RSM) have been applied to design the experiments to evaluate the interactive effects of the three most important operating variables: temperature (25-35 degrees C), pH (4.0-7.0), and initial dye concentration (100-200 mg/L) on the biodegradation of AR 151. A total of 20 experiments were conducted in the present study to construct a quadratic model. The very high regression coefficient between the variables and the response (R(2)=0.9934) indicated an excellent fit of the experimental data to the second-order polynomial regression model. The RSM indicated that an initial dye concentration of 150 mg/L, pH 5.5 and a temperature of 30 degrees C were optimal for maximum % decolorization of AR 151 in simulated dye solution, and 84.8% decolorization of AR 151 was observed at optimum growth conditions.
Khamanga, Sandile Maswazi; Walker, Roderick B
2012-01-01
Captopril (CPT) microparticles were manufactured by solvent evaporation using acetone (dispersion phase) and liquid paraffin (manufacturing phase) with Eudragit® and Methocel® as coat materials. Design of experiments and response surface methodology (RSM) approaches were used to optimize the process. The microparticles were characterized based on the percent of drug released and yield, microcapsule size, entrapment efficiency and Hausner ratio. Differential scanning calorimetry (DSC), Infrared (IR) spectroscopy, scanning electron microscopy (SEM) and in vitro dissolution studies were conducted. The microcapsules were spherical, free-flowing and IR and DSC thermograms revealed that CPT was stable. The percent drug released was investigated with respect to Eudragit® RS and Methocel® K100M, Methocel® K15M concentrations and homogenizing speed. The optimal conditions for microencapsulation were 1.12 g Eudragit® RS, 0.67 g Methocel® K100M and 0.39 g Methocel® K15M at a homogenizing speed of 1643 rpm and 89% CPT was released. The value of RSM-mediated microencapsulation of CPT was elucidated.
NASA Astrophysics Data System (ADS)
Luo, Xiaobo; Guan, Rongfa; Chen, Xiaoqiang; Tao, Miao; Ma, Jieqing; Zhao, Jin
2014-06-01
The major component in green tea polyphenols, epigallocatechin-3-gallate (EGCG), has been demonstrated to prevent carcinogenesis. To improve the effectiveness of EGCG, liposomes were used as a carrier in this study. The reverse-phase evaporation method combined with response surface methodology is a simple, rapid, and beneficial approach for liposome preparation and optimization. The optimal preparation conditions were as follows: phosphatidylcholine-to-cholesterol ratio of 4.00, EGCG concentration of 4.88 mg/mL, Tween 80 concentration of 1.08 mg/mL, and rotary evaporation temperature of 34.51°C. Under these conditions, the experimental encapsulation efficiency and size of the EGCG nanoliposomes were 85.79% ± 1.65% and 180 nm ± 4 nm, which were close to the predicted values. The malondialdehyde value and the in vitro release test indicated that the prepared EGCG nanoliposomes were stable and suitable for more widespread application. Furthermore, compared with free EGCG, encapsulation of EGCG enhanced its inhibitory effect on tumor cell viability at higher concentrations.
2012-01-01
This paper utilizes a statistical approach, response surface optimization methodology, to determine the optimum conditions for Acid Black 172 dye removal efficiency from aqueous solution by electrocoagulation. The experimental parameters investigated were initial pH: 4–10; initial dye concentration: 0–600 mg/L; applied current: 0.5–3.5 A; and reaction time: 3–15 min. These parameters were varied at five levels according to a central composite design to evaluate their effects on decolorization through analysis of variance. The high R2 value of 94.48% shows a high correlation between the experimental and predicted values and indicates that the second-order regression model is acceptable for Acid Black 172 dye removal efficiency. It was also found that some interaction and squared terms influenced the electrocoagulation performance as well as the selected parameters. An optimum dye removal efficiency of 90.4% was observed experimentally at an initial pH of 7, an initial dye concentration of 300 mg/L, an applied current of 2 A, and a reaction time of 9.16 min, which is close to the model-predicted result (90%). PMID:23369574
NASA Astrophysics Data System (ADS)
Pogorzelski, Stanisław J.; Rochowski, Pawel; Szurkowski, Janusz
2014-02-01
An investigation of water contact angles (CAs) and contact angle hysteresis (CAH) was carried out for 1-year to 4-year old needles (Pinus sylvestris) collected in urban (Gdansk) and rural (Karsin) locations using an original measuring technique based on the geometry of the drop on a vertical filament. Concentrations of air pollutants (SO2, NOx, C6H6, and suspended particulate matter - SPM) currently considered to be most important in causing direct damage to vegetation were simultaneously monitored. A set of surface wettability parameters: the apparent surface free energy γSV, adhesive film tension Π, work of adhesion WA, and work of spreading WS, were determined from CAH data using the approach developed by Chibowski (2003) to quantify the surface energetics of the needle substrata affected by aging and pollution impacts. This formalism relates the total apparent surface free energy of the solid γSV with only three measurable quantities: the surface tension of the probe liquid γLV and its advancing (θA) and receding (θR) contact angles. Since CAH depends on the outermost wax layer surface roughness and the spatial physicochemical heterogeneity of a solid surface, CA data were corrected using surface architecture profiles registered with confocal scanning laser microscopy. It was found that the roughness parameter r is significantly negatively correlated (R = -0.74) with the needle age (collected at Karsin). The needle surface aging process resulted in its surface hydrophilization (CA↓ and CAH↓ with γSV↑ and WA↑). The temporal evolution of the needles' wettability was traced with the data point distribution in the 2D space of CAH plotted versus WS. The wettability parameters were closely correlated to pollutant concentrations as evidenced by Spearman's rank correlation procedure (R = 0.63-0.91; p < 0.05). The aim of the study was to validate the established CA methodology to create a new non-invasive, low-cost technique suitable for monitoring structural changes at interfaces of biological systems.
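For reference, the hysteresis approach attributed to Chibowski (2003) is commonly quoted in the form below, relating the apparent solid surface free energy to the probe liquid surface tension and the advancing and receding contact angles. The abstract does not reproduce the equation itself, so this form is stated here as an assumption about the formalism used.

```latex
% Commonly cited form of the contact-angle-hysteresis (CAH) approach of
% Chibowski (2003); quoted as background, not taken from the abstract.
\gamma_{SV} \;=\; \frac{\gamma_{LV}\,\left(1+\cos\theta_A\right)^{2}}
                       {2+\cos\theta_A+\cos\theta_R}
```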
NASA Technical Reports Server (NTRS)
Joseph, Alicia T.; O'Neil, P. E.; vanderVelde, R.; Gish, T.
2008-01-01
A methodology is presented to correct backscatter (sigma(sup 0)) observations for the effect of vegetation. The proposed methodology is based on the concept that the ratio of the surface scattering over the total amount of scattering (sigma(sup 0)(sub soil)/sigma(sup 0)) is only affected by the vegetation and can be described as a function of the vegetation water content. Backscatter observations sigma(sup 0) from the soil are not influenced by vegetation. Under bare soil conditions (sigma(sup 0)(sub soil)/sigma(sup 0)) equals 1. Under low to moderate biomass and soil moisture conditions, vegetation affects the observed sigma(sup 0) through absorption of the surface scattering and the contribution of direct scattering by the vegetation itself. Therefore, the contribution of the surface scattering is smaller than the observed total amount of scattering and decreases as the biomass increases. For dense canopies, scattering interactions between the soil surface and vegetation elements (e.g. leaves and stems) also become significant. Because these higher order scattering mechanisms are influenced by the soil surface, an increase in (sigma(sup 0)(sub soil)/sigma(sup 0)) may be observed as the biomass increases under densely vegetated conditions. This methodology is applied within the framework of a time series based approach for the retrieval of soil moisture. The data set used for this investigation was collected during a campaign conducted at USDA's Optimizing Production Inputs for Economic and Environmental Enhancement (OPE-3) experimental site in Beltsville, Maryland (USA). This campaign took place during the corn growth cycle from May 10th to October 2nd, 2002. In this period the corn crops reached a vegetation water content of 5.1 kg m(exp -2) at peak biomass and a soil moisture range varying between 0.00 and 0.26 cubic cm/cubic cm. One of the deployed microwave instruments was a multi-frequency (C-band (4.75 GHz) and L-band (1.6 GHz)) quad-polarized (HH, HV, VV, VH) radar mounted on a 20 meter long boom. In the OPE-3 field campaign, radar observations were collected once a week at nominal times of 8 am, 10 am, 12 noon and 2 pm. During each data run the radar acquired sixty independent measurements within an azimuth of 120 degrees from a boom height of 12.2 m and at three different incidence angles (15, 35, and 55 degrees). The sixty observations were averaged to provide one backscatter value for the study area and its accuracy is estimated to be 51.0 dB. For this investigation the C-band observations have been used. Application of the proposed methodology to the selected data set showed a well-defined relationship between (sigma(sup 0)(sub soil)/sigma(sup 0)) and the vegetation water content. It is found that this relationship can be described with two experimentally determined parameters, which depend on the sensing configuration (e.g. incidence angle and polarization). Through application of the proposed vegetation correction methodology and the obtained parameterizations, the soil moisture retrieval accuracy within the framework of a time series based approach is improved from 0.033 to 0.032 cubic cm/cubic cm, from 0.049 to 0.033 cubic cm/cubic cm, and from 0.079 to 0.047 cubic cm/cubic cm for incidence angles of 15, 35, and 55 degrees, respectively. The improvement in soil moisture retrieval due to vegetation correction is greater at larger incidence angles (due to the increased path length and larger vegetation effects on the surface signal at the larger angles).
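A hedged sketch of the correction idea: model the ratio sigma0_soil/sigma0 as a two-parameter function of vegetation water content, fitted per sensing configuration, then use it to recover the bare-soil backscatter from an observed total. The exponential functional form and all numerical values are illustrative assumptions; the abstract does not state the parameterization actually fitted by the authors.

```python
# Hedged sketch: two-parameter vegetation-correction of backscatter.
# Functional form and numbers are assumptions, not the published fit.
import numpy as np
from scipy.optimize import curve_fit

def soil_fraction(vwc, a, b):
    """Assumed two-parameter model of sigma0_soil / sigma0 vs. VWC (kg m^-2)."""
    return a * np.exp(-b * vwc)

# Illustrative "observed" ratios over a corn growth cycle (dummy values)
vwc_obs = np.array([0.0, 0.5, 1.0, 2.0, 3.5, 5.1])
ratio_obs = np.array([1.00, 0.88, 0.74, 0.55, 0.38, 0.27])

(a_fit, b_fit), _ = curve_fit(soil_fraction, vwc_obs, ratio_obs, p0=[1.0, 0.3])

# Correcting a total backscatter observation (linear units) for vegetation
sigma0_total = 0.05          # hypothetical observed sigma0 (m^2/m^2)
vwc_now = 2.5                # hypothetical vegetation water content (kg m^-2)
sigma0_soil = sigma0_total * soil_fraction(vwc_now, a_fit, b_fit)
print(round(a_fit, 3), round(b_fit, 3), round(sigma0_soil, 4))
```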
Use of remote sensing for land use policy formulation
NASA Technical Reports Server (NTRS)
1987-01-01
The overall objectives and strategies of the Center for Remote Sensing remain to provide a center of excellence for multidisciplinary scientific expertise to address land-related global habitability and earth observing systems scientific issues. Specific research projects that were underway during the final contract period include: digital classification of coniferous forest types in Michigan's northern lower peninsula; a physiographic ecosystem approach to remote classification and mapping; land surface change detection and inventory; analysis of radiant temperature data; and development of methodologies to assess possible impacts of man's changes of the land surface on meteorological parameters. Significant progress occurred in each of the five project areas. Summaries of each of the projects are provided.
Numerical study of drop spreading on a flat surface
NASA Astrophysics Data System (ADS)
Wang, Sheng; Desjardins, Olivier
2017-11-01
In this talk, we perform a numerical study of a droplet on a flat surface with special emphasis on capturing the spreading dynamics. The computational methodology employed is tailored for simulating large-scale two-phase flows within complex geometries. It combines a conservative level-set method to capture the liquid-gas interface, a conservative immersed boundary method to represent the solid-fluid interface, and a sub-grid curvature model at the triple-point to implicitly impose the contact angle of the liquid-gas interface. The performance of the approach is assessed in the inertial droplet spreading regime, the viscous spreading regime of high viscosity drops, and with the capillary oscillation of low viscosity droplets.
Subsurface Grain Morphology Reconstruction by Differential Aperture X-ray Microscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eisenlohr, Philip; Shanthraj, Pratheek; Vande Kieft, Brendan R.
A multistep, non-destructive grain morphology reconstruction methodology that is applicable to near-surface volumes is developed and tested on synthetic grain structures. This approach probes the subsurface crystal orientation using differential aperture x-ray microscopy on a sparse grid across the microstructure volume of interest. Resulting orientation data are clustered according to proximity in physical and orientation space and used as seed points for an initial Voronoi tessellation to (crudely) approximate the grain morphology. Curvature-driven grain boundary relaxation, simulated by means of the Voronoi implicit interface method, progressively improves the reconstruction accuracy. The similarity between bulk and readily accessible surface reconstruction error provides an objective termination criterion for boundary relaxation.
Mahjouri, Najmeh; Ardestani, Mojtaba
2011-01-01
In this paper, cooperative and non-cooperative methodologies are developed for a large-scale water allocation problem in Southern Iran. The water shares of the water users and their net benefits are determined using optimization models with economic objectives, subject to the physical and environmental constraints of the system. The results of the two methodologies are compared based on the total economic benefit obtained, and the role of cooperation in utilizing a shared water resource is demonstrated. In both cases, the water quality in the rivers satisfies the standards. Comparing the results of the two approaches shows the importance of acting cooperatively to achieve maximum revenue in utilizing a surface water resource while river water quantity and quality issues are addressed.
Stacked Autoencoders for Outlier Detection in Over-the-Horizon Radar Signals
Protopapadakis, Eftychios; Doulamis, Anastasios; Doulamis, Nikolaos; Dres, Dimitrios; Bimpas, Matthaios
2017-01-01
Detection of outliers in radar signals is a considerable challenge in maritime surveillance applications. High-Frequency Surface-Wave (HFSW) radars have attracted significant interest as potential tools for long-range target identification and outlier detection at over-the-horizon (OTH) distances. However, a number of disadvantages, such as their low spatial resolution and presence of clutter, have a negative impact on their accuracy. In this paper, we explore the applicability of deep learning techniques for detecting deviations from the norm in behavioral patterns of vessels (outliers) as they are tracked from an OTH radar. The proposed methodology exploits the nonlinear mapping capabilities of deep stacked autoencoders in combination with density-based clustering. A comparative experimental evaluation of the approach shows promising results in terms of the proposed methodology's performance. PMID:29312449
Werner, Kent; Bosson, Emma; Berglund, Sten
2006-12-01
Safety assessment related to the siting of a geological repository for spent nuclear fuel deep in the bedrock requires identification of potential flow paths and the associated travel times for radionuclides originating at repository depth. Using the Laxemar candidate site in Sweden as a case study, this paper describes modeling methodology, data integration, and the resulting water flow models, focusing on the Quaternary deposits and the upper 150 m of the bedrock. Example simulations identify flow paths to groundwater discharge areas and flow paths in the surface system. The majority of the simulated groundwater flow paths end up in the main surface waters and along the coastline, even though the particles used to trace the flow paths are introduced with a uniform spatial distribution at a relatively shallow depth. The calculated groundwater travel time, determining the time available for decay and retention of radionuclides, is on average longer to the coastal bays than to other biosphere objects at the site. Further, it is demonstrated how GIS-based modeling can be used to limit the number of surface flow paths that need to be characterized for safety assessment. Based on the results, the paper discusses an approach for coupling the present models to a model for groundwater flow in the deep bedrock.
Airport Surface Network Architecture Definition
NASA Technical Reports Server (NTRS)
Nguyen, Thanh C.; Eddy, Wesley M.; Bretmersky, Steven C.; Lawas-Grodek, Fran; Ellis, Brenda L.
2006-01-01
Currently, airport surface communications are fragmented across multiple types of systems. These communication systems for airport operations at most airports today are based on dedicated and separate architectures that cannot support system-wide interoperability and information sharing. The requirements placed upon the Communications, Navigation, and Surveillance (CNS) systems in airports are rapidly growing, and integration is urgently needed if the future vision of the National Airspace System (NAS) and the Next Generation Air Transportation System (NGATS) 2025 concept are to be realized. To address this and other problems such as airport surface congestion, the Space Based Technologies Project's Surface ICNS Network Architecture team at NASA Glenn Research Center has assessed airport surface communications requirements, analyzed existing and future surface applications, and defined a set of architecture functions that will help design a scalable, reliable and flexible surface network architecture to meet the current and future needs of airport operations. This paper describes the systems approach or methodology to networking that was employed to assess airport surface communications requirements, analyze applications, and define the surface network architecture functions as the building blocks or components of the network. The systems approach used for defining these functions is relatively new to networking. It views the surface network, along with its environment (everything that the surface network interacts with or impacts), as a system. Associated with this system are sets of services that are offered by the network to the rest of the system. Therefore, the surface network is considered as part of the larger system (such as the NAS), with interactions and dependencies between the surface network and its users, applications, and devices. The surface network architecture includes components such as addressing/routing, network management, network performance and security.
Tricio, Jorge A; Montt, Juan E; Ormeño, Andrea P; Del Real, Alberto J; Naranjo, Claudia A
2017-06-01
The aim of this study was to assess, after one year, the impact of faculty development in teaching and learning skills focused on a learner-centered approach on faculty members' perceptions of and approaches to teaching and on their students' learning experiences and approaches. Before training (2014), all 176 faculty members at a dental school in Chile were invited to complete the Approaches to Teaching Inventory (ATI) to assess their teaching approaches (student- vs. teacher-focused). In 2015, all 496 students were invited to complete the Study Process Questionnaire (R-SPQ-2F) to assess their learning approaches (deep or surface) and the Course Experience Questionnaire (CEQ) to measure their teaching quality perceptions. Subsequently, faculty development workshops on student-centered teaching methodologies were delivered, followed by peer observation. In March 2016, all 176 faculty members and 491 students were invited to complete a second ATI (faculty) and R-SPQ-2 and CEQ (students). Before (2014) and after (2016) the training, 114 (65%) and 116 (66%) faculty members completed the ATI, respectively, and 89 (49%) of the then-181 faculty members completed the perceptions of skills development questionnaire in September 2016. In 2015, 373 students (75%) completed the R-SPQ-2F and CEQ; 412 (83%) completed both questionnaires in 2016. In 2014, the faculty results showed that student-focused teaching was significantly higher in preclinical and clinical courses than in the basic sciences. In 2016, teacher-focused teaching fell significantly; basic science teaching improved the most. Students in both the 2015 and 2016 cohorts had lower mean scores for deep learning approaches from year 1 on, while they increased their scores for surface learning. The students' perceptions of faculty members' good teaching, appropriate assessment, clear goals, and e-learning improved significantly, but perception of appropriate workload did not. Teaching and learning skills development produced significant gains in student-centered teaching for these faculty members and in some students' perceptions of teaching quality. However, student workload needs to be considered to support deep learning.
NASA Technical Reports Server (NTRS)
Owe, Manfred; deJeu, Richard; Walker, Jeffrey; Zukor, Dorothy J. (Technical Monitor)
2001-01-01
A methodology for retrieving surface soil moisture and vegetation optical depth from satellite microwave radiometer data is presented. The procedure is tested with historical 6.6 GHz brightness temperature observations from the Scanning Multichannel Microwave Radiometer over several test sites in Illinois. Results using only nighttime data are presented at this time, due to the greater stability of nighttime surface temperature estimation. The methodology uses a radiative transfer model to solve for surface soil moisture and vegetation optical depth simultaneously using a non-linear iterative optimization procedure. It assumes known constant values for the scattering albedo and roughness. Surface temperature is derived by a procedure using high frequency vertically polarized brightness temperatures. The methodology does not require any field observations of soil moisture or canopy biophysical properties for calibration purposes and is totally independent of wavelength. Results compare well with field observations of soil moisture and satellite-derived vegetation index data from optical sensors.
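A minimal sketch, under stated assumptions, of a simultaneous two-parameter retrieval in the spirit of the abstract: a zeroth-order tau-omega radiative transfer model with smooth-surface Fresnel emissivity is inverted for the soil dielectric constant (a soil moisture proxy) and the vegetation optical depth from H- and V-polarized brightness temperatures. The incidence angle, single-scattering albedo, and observation values are invented for illustration, and the dielectric-to-soil-moisture step (a mixing model) is omitted; this is not the authors' exact forward model.

```python
# Hedged sketch: joint retrieval of soil dielectric constant and vegetation
# optical depth from dual-polarization brightness temperatures (tau-omega model).
import numpy as np
from scipy.optimize import least_squares

THETA = np.deg2rad(50.0)   # assumed incidence angle
OMEGA = 0.05               # assumed single-scattering albedo
TS = 295.0                 # effective surface temperature (K), assumed known

def fresnel_emissivity(eps):
    """Smooth-surface H- and V-pol emissivities for real dielectric constant eps."""
    cos_t, sin2_t = np.cos(THETA), np.sin(THETA) ** 2
    root = np.sqrt(eps - sin2_t)
    r_h = ((cos_t - root) / (cos_t + root)) ** 2
    r_v = ((eps * cos_t - root) / (eps * cos_t + root)) ** 2
    return 1.0 - r_h, 1.0 - r_v

def tb(e_soil, tau):
    """Tau-omega brightness temperature for one polarization."""
    gamma = np.exp(-tau / np.cos(THETA))
    return TS * (e_soil * gamma
                 + (1.0 - OMEGA) * (1.0 - gamma) * (1.0 + (1.0 - e_soil) * gamma))

def residuals(params, tb_h_obs, tb_v_obs):
    eps, tau = params
    e_h, e_v = fresnel_emissivity(eps)
    return [tb(e_h, tau) - tb_h_obs, tb(e_v, tau) - tb_v_obs]

# Hypothetical observed brightness temperatures (K)
sol = least_squares(residuals, x0=[10.0, 0.3], args=(240.0, 272.0),
                    bounds=([3.0, 0.0], [40.0, 2.0]))
eps_ret, tau_ret = sol.x
print("retrieved dielectric constant:", round(eps_ret, 2),
      "vegetation optical depth:", round(tau_ret, 2))
```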
NASA Astrophysics Data System (ADS)
Sedov, A. V.; Kalinchuk, V. V.; Bocharova, O. V.
2018-01-01
The evaluation of static stresses and the strength of units and components is a crucial task for increasing the operational reliability of vehicles and equipment and for preventing emergencies, especially in structures made of metal and composite materials. At the stage of creating and commissioning structures, diagnostic methods such as acoustic, ultrasonic, X-ray, and radiation techniques are widely used to control the manufacturing quality of individual elements and components. Using these methods to monitor the residual life and the degree of static stress of units and parts during operation, however, is fraught with great difficulties in both methodology and instrumentation. In this paper, the authors propose an effective approach for the operative control of the degree of static stress of units and parts of mechanical structures in working condition, based on recording changes in the surface wave properties of a system consisting of a sensor and the controlled environment (unit, part). The proposed approach to low-frequency diagnostics of static stresses relies on a new adaptive-spectral analysis of a surface wave created by an external action (impact). This approach makes it possible to estimate implicit structural stresses experimentally.
NASA Astrophysics Data System (ADS)
Hunka, Frantisek; Matula, Jiri
2017-07-01
A transaction-based approach is utilized in some business process modeling methodologies. Essential parts of these transactions are human beings, usually represented by the notion of an agent or actor role. Using a particular example, the paper describes the possibilities of the Design Engineering Methodology for Organizations (DEMO) and the Resource-Event-Agent (REA) methodology. Whereas the DEMO methodology can be regarded as a generic methodology with its foundation in the theory of Enterprise Ontology, the REA methodology is regarded as a domain-specific methodology and has its origin in accountancy systems. The result of comparing these approaches is that the DEMO methodology captures everything that happens in reality with good empirical evidence, whereas the REA methodology captures only changes connected with economic events. Economic events represent either a change of property rights to an economic resource or the consumption or production of economic resources. This follows from the essence of economic events and their connection to economic resources.
2012-03-22
world’s first powered and controlled flying machine. Numerous flight designs and tests were done by scientists, engineers, and flight enthusiasts...conceptual flight and preliminary designs before they could control the craft with three-axis control and the correct airfoil design. These pioneers...analysis support. Although wind tunnel testing can provide data to predict and develop control surface designs, few SUAV operators opt to utilize wind
Geomorphic and landform survey of Northern Appennine Range (NAR)
NASA Technical Reports Server (NTRS)
Marino, C. M. (Principal Investigator); Zilioli, E.
1977-01-01
The author has identified the following significant results. An approach to landslide hazard detection was developed through the analysis of satellite imagery (LANDSAT 2) showing many landslide areas that occur on marine silts and clays in northern Appennine Range in Italy. A landslide risk score was given for large areas by narrowing and extending well defined areas, whose behavior and reflectivity variation was due to upper surface changes. Results show that this methodology allows evolution pattern of clay outflows to be distinguished.
1995-12-01
34 Environmental Science and Technology, 26:1404-1410 (July 1992). 4. Atlas, Ronald M. and Richard Bartha. Microbial Ecology, Fundamentals and Applica...the impact of physical factors on microbial activity. They cite research by Atlas and Bartha observing that low temperatures inhibit microbial activity...mixture. Atlas and Bartha (4:393-394) explain that a typical petroleum mixture includes aliphatics, alicyclics, aromatics and other organics.
1995-12-01
Technology, 26:1404-1410 (July 1992). 4. Atlas, Ronald M. and Richard Bartha. Microbial Ecology, Fundamentals and Applications (3rd Edition). Redwood... microbial metabolic activity. Leahy and Colwell (35:307) note the impact of physical factors on microbial activity. They cite research by Atlas and... Bartha observing that low temperatures inhibit microbial activity and research by Bossert and Bartha observing that higher temperatures increase activity
Hassan, Moinuddin; Ilev, Ilko
2014-10-01
Contamination of medical devices has become a critical and prevalent public health safety concern since medical devices are being increasingly used in clinical practices for diagnostics, therapeutics and medical implants. The development of effective sensing methods for real-time detection of pathogenic contamination is needed to prevent and reduce the spread of infections to patients and the healthcare community. In this study, a hollow-core fiber-optic Fourier transform infrared spectroscopy methodology employing a grazing incidence angle based sensing approach (FO-FTIR-GIA) was developed for detection of various biochemical contaminants on medical device surfaces. We demonstrated the sensitivity of FO-FTIR-GIA sensing approach for non-contact and label-free detection of contaminants such as lipopolysaccharide from various surface materials relevant to medical device. The proposed sensing system can detect at a minimum loading concentration of approximately 0.7 μg/cm(2). The FO-FTIR-GIA has the potential for the detection of unwanted pathogen in real time.
NASA Astrophysics Data System (ADS)
Hassan, Moinuddin; Ilev, Ilko
2014-10-01
Contamination of medical devices has become a critical and prevalent public health safety concern since medical devices are being increasingly used in clinical practices for diagnostics, therapeutics and medical implants. The development of effective sensing methods for real-time detection of pathogenic contamination is needed to prevent and reduce the spread of infections to patients and the healthcare community. In this study, a hollow-core fiber-optic Fourier transform infrared spectroscopy methodology employing a grazing incidence angle based sensing approach (FO-FTIR-GIA) was developed for detection of various biochemical contaminants on medical device surfaces. We demonstrated the sensitivity of FO-FTIR-GIA sensing approach for non-contact and label-free detection of contaminants such as lipopolysaccharide from various surface materials relevant to medical device. The proposed sensing system can detect at a minimum loading concentration of approximately 0.7 μg/cm2. The FO-FTIR-GIA has the potential for the detection of unwanted pathogen in real time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hassan, Moinuddin, E-mail: moinuddin.hassan@fda.hhs.gov; Ilev, Ilko
2014-10-15
Contamination of medical devices has become a critical and prevalent public health safety concern since medical devices are being increasingly used in clinical practices for diagnostics, therapeutics and medical implants. The development of effective sensing methods for real-time detection of pathogenic contamination is needed to prevent and reduce the spread of infections to patients and the healthcare community. In this study, a hollow-core fiber-optic Fourier transform infrared spectroscopy methodology employing a grazing incidence angle based sensing approach (FO-FTIR-GIA) was developed for detection of various biochemical contaminants on medical device surfaces. We demonstrated the sensitivity of FO-FTIR-GIA sensing approach for non-contact and label-free detection of contaminants such as lipopolysaccharide from various surface materials relevant to medical device. The proposed sensing system can detect at a minimum loading concentration of approximately 0.7 μg/cm2. The FO-FTIR-GIA has the potential for the detection of unwanted pathogen in real time.
Enzyme-Mediated Individual Nanoparticle Release Assay
Glass, James R.; Dickerson, Janet C.; Schultz, David A.
2007-01-01
Numerous methods have been developed to measure the presence of macromolecular species in a sample; however, methods that detect functional activity, or modulators of that activity, are more limited. To address this limitation, an approach was developed that utilizes the optical detection of nanoparticles as a measure of enzyme activity. Nanoparticles are increasingly being used as biological labels in static binding assays; here we describe their use in a release assay format where the enzyme-mediated liberation of individual nanoparticles from a surface is measured. A double-stranded fragment of DNA is used as the initial tether to bind the nanoparticles to a solid surface. The nanoparticle spatial distribution and number are determined using dark-field optical microscopy and digital image capture. Site-specific cleavage of the DNA tether results in nanoparticle release. The methodology and validation of this approach for measuring enzyme-mediated, individual DNA cleavage events, rapidly, with high specificity, and in real time is described. This approach was used to detect and discriminate between non-methylated and methylated DNA, and demonstrates a novel platform for high-throughput screening of modulators of enzyme activity. PMID:16620746
NASA Astrophysics Data System (ADS)
Yasin, Sohail; Curti, Massimo; Behary, Nemeshwaree; Perwuelz, Anne; Giraud, Stephane; Rovero, Giorgio; Guan, Jinping; Chen, Guoqiang
The n-methylol dimethyl phosphono propionamide (MDPA) flame retardant compounds are predominantly used for cotton fabric treatments with trimethylol melamine (TMM) to obtain better crosslinking and enhanced flame retardant properties. Nevertheless, such treatments are associated with the toxic issue of cancer-causing formaldehyde release. An eco-friendly finishing was used to achieve formaldehyde-free fixation of the flame retardant to the cotton fabric, using citric acid as a crosslinking agent along with sodium hypophosphite as a catalyst. The process parameters of the treatment were optimized by response surface methodology, using a Box-Behnken statistical experimental design, to enhance flame retardant properties while keeping mechanical loss to the fabric low. The effects of the reagent concentrations on the fabric's properties (flame retardancy and mechanical properties) were evaluated. Regression equations for the prediction of concentrations and the mechanical properties of the fabric were also obtained for the eco-friendly treatment. The R-squared values of all the responses were above 0.95 for the reagents used, indicating a strong relationship between the values predicted by the Box-Behnken design and the actual experimental results. It was also found that the concentration parameters (crosslinking reagents and catalysts) in the treatment formulation play a prime role in the overall performance of flame retardant cotton fabrics.
Electro-optic holography method for determination of surface shape and deformation
NASA Astrophysics Data System (ADS)
Furlong, Cosme; Pryputniewicz, Ryszard J.
1998-06-01
Current demanding engineering analysis and design applications require effective experimental methodologies for characterization of surface shape and deformation. Such characterization is of primary importance in many applications, because these quantities are related to the functionality, performance, and integrity of the objects of interest, especially in view of advances relating to concurrent engineering. In this paper, a new approach to characterization of surface shape and deformation using a simple optical setup is described. The approach consists of a fiber-optic-based electro-optic holography (EOH) system built around an IR, temperature-tuned laser diode, a single-mode fiber optic directional coupler assembly, and a video processing computer. The EOH can be arranged in multiple configurations, which include the three-camera, three-illumination, and speckle correlation modes. In particular, the three-camera mode is described, along with a brief description of the procedures for obtaining quantitative 3D shape and deformation information. A representative application of the three-camera EOH system demonstrates the viability of the approach as an effective engineering tool. A particular feature of this system and the procedure described in this paper is that the 3D quantitative data are written to data files which can be readily interfaced to commercial CAD/CAM environments.
Broadband ground-motion simulation using a hybrid approach
Graves, R.W.; Pitarka, A.
2010-01-01
This paper describes refinements to the hybrid broadband ground-motion simulation methodology of Graves and Pitarka (2004), which combines a deterministic approach at low frequencies (f < 1 Hz) with a semistochastic approach at high frequencies (f > 1 Hz). In our approach, fault rupture is represented kinematically and incorporates spatial heterogeneity in slip, rupture speed, and rise time. The prescribed slip distribution is constrained to follow an inverse wavenumber-squared fall-off, and the average rupture speed is set at 80% of the local shear-wave velocity, which is then adjusted such that the rupture propagates faster in regions of high slip and slower in regions of low slip. We use a Kostrov-like slip-rate function having a rise time proportional to the square root of slip, with the average rise time across the entire fault constrained empirically. Recent observations from large surface rupturing earthquakes indicate a reduction of rupture propagation speed and lengthening of rise time in the near surface, which we model by applying a 70% reduction of the rupture speed and increasing the rise time by a factor of 2 in a zone extending from the surface to a depth of 5 km. We demonstrate the fidelity of the technique by modeling the strong-motion recordings from the Imperial Valley, Loma Prieta, Landers, and Northridge earthquakes.
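A hedged sketch of the kinematic source constraints described above: rise time scaled with the square root of local slip, rupture speed tied to the local shear-wave velocity, and a shallow (above 5 km) slowdown with lengthened rise time. The subfault grid, slip distribution, velocity profile, and the reading of the "70% reduction" as a 0.7 scaling factor are illustrative assumptions, and the slip-dependent rupture-speed perturbation mentioned in the abstract is omitted.

```python
# Hedged sketch of per-subfault kinematic parameters; values are illustrative.
import numpy as np

def kinematic_params(slip, vs, depth, mean_rise_time=1.0):
    """Return per-subfault rise time (s) and rupture speed (km/s)."""
    # Rise time proportional to sqrt(slip), normalized to a target mean.
    rise = np.sqrt(np.maximum(slip, 0.0))
    rise *= mean_rise_time / rise.mean()
    # Rupture speed: 80% of the local shear-wave velocity.
    vrup = 0.8 * vs
    # Near-surface adjustment (depth < 5 km): slower rupture (scaled by 0.7 here,
    # an interpretation of the stated 70% reduction) and doubled rise time.
    shallow = depth < 5.0
    vrup[shallow] *= 0.7
    rise[shallow] *= 2.0
    return rise, vrup

# Dummy subfault grid (depth in km, slip in m, Vs in km/s)
depth = np.tile(np.arange(1.0, 16.0, 2.0), (10, 1)).T
slip = np.random.default_rng(0).lognormal(0.0, 0.7, depth.shape)
vs = 2.5 + 0.08 * depth

rise, vrup = kinematic_params(slip, vs, depth)
print(rise.shape, round(float(rise.mean()), 3),
      round(float(vrup.min()), 3), round(float(vrup.max()), 3))
```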
Xu, Ping; Kang, Leilei; Mack, Nathan H.; ...
2013-10-21
We investigate surface plasmon assisted catalysis (SPAC) reactions of 4-aminothiophenol (4ATP) to and back from 4,4'-dimercaptoazobenzene (DMAB) by single particle surface enhanced Raman spectroscopy, using a self-designed gas flow cell to control the reductive/oxidative environment over the reactions. Conversion of 4ATP into DMAB is induced by energy transfer (plasmonic heating) from surface plasmon resonance to 4ATP, where O2 (as an electron acceptor) is essential and H2O (as a base) can accelerate the reaction. In contrast, hot electron (from surface plasmon decay) induction drives the reverse reaction of DMAB to 4ATP, where H2O (or H2) acts as the hydrogen source. More interestingly, the cyclic redox between 4ATP and DMAB by the SPAC approach has been demonstrated. Finally, this SPAC methodology presents a unique platform for studying chemical reactions that are not possible under standard synthetic conditions.
Fluid-structure interaction of turbulent boundary layer over a compliant surface
NASA Astrophysics Data System (ADS)
Anantharamu, Sreevatsa; Mahesh, Krishnan
2016-11-01
Turbulent flows induce unsteady loads on surfaces in contact with them, which affect material stresses, surface vibrations and far-field acoustics. We are developing a numerical methodology to study the coupled interaction of a turbulent boundary layer with the underlying surface. The surface is modeled as a linear elastic solid, while the fluid follows the spatially filtered incompressible Navier-Stokes equations. An incompressible Large Eddy Simulation finite volume flow approach based on the algorithm of Mahesh et al. is used in the fluid domain. The discrete kinetic energy conserving property of the method ensures robustness at high Reynolds number. The linear elastic model in the solid domain is integrated in space using finite element method and in time using the Newmark time integration method. The fluid and solid domain solvers are coupled using both weak and strong coupling methods. Details of the algorithm, validation, and relevant results will be presented. This work is supported by NSWCCD, ONR.
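A minimal sketch of the Newmark time integration used to advance the solid's linear-elastic equations of motion, M a + C v + K u = f(t), here with the average-acceleration parameters (beta = 1/4, gamma = 1/2). The two-degree-of-freedom matrices and constant load are toy values; in the coupled solver the matrices would come from the finite element discretization and f(t) would carry the fluid loads.

```python
# Minimal Newmark-beta step for M a + C v + K u = f(t); toy 2-DOF system.
import numpy as np

def newmark_step(M, C, K, u, v, a, f_next, dt, beta=0.25, gamma=0.5):
    """One implicit Newmark-beta step; returns (u, v, a) at t + dt."""
    lhs = M / (beta * dt**2) + gamma * C / (beta * dt) + K
    rhs = (f_next
           + M @ (u / (beta * dt**2) + v / (beta * dt) + (0.5 / beta - 1.0) * a)
           + C @ (gamma * u / (beta * dt)
                  + (gamma / beta - 1.0) * v
                  + dt * (gamma / (2.0 * beta) - 1.0) * a))
    u_next = np.linalg.solve(lhs, rhs)
    a_next = (u_next - u) / (beta * dt**2) - v / (beta * dt) - (0.5 / beta - 1.0) * a
    v_next = v + dt * ((1.0 - gamma) * a + gamma * a_next)
    return u_next, v_next, a_next

# Toy mass, stiffness, and (stiffness-proportional) damping matrices
M = np.diag([1.0, 1.0])
K = np.array([[4.0, -2.0], [-2.0, 4.0]])
C = 0.02 * K
f = np.array([1.0, 0.0])                      # constant load (stand-in for fluid forcing)

u = np.zeros(2); v = np.zeros(2)
a = np.linalg.solve(M, f - C @ v - K @ u)     # consistent initial acceleration
for _ in range(100):
    u, v, a = newmark_step(M, C, K, u, v, a, f_next=f, dt=0.01)
print(u)   # approaches the static solution K^-1 f
```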
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karakoti, Ajay S.; Yang, Ping; Wang, Weina
2018-02-15
Ligand functionalized nanoparticles have replaced bare nanoparticles in most biological applications. These applications require tight control over the size and stability of nanoparticles in aqueous medium. Understanding the mechanism of interaction of nanoparticle surfaces with functional groups of different organic ligands such as carboxylic acids has remained confounding despite two decades of research on nanoparticles, because of the inability to characterize their surfaces in their immediate environment. Often the surface interaction is understood by correlating, in a piecemeal approach, surface-sensitive spectroscopic information from the ligands with bulk and surface information from the nanoparticles. In the present study we report the direct interaction of 5-7 nm cerium oxide nanoparticle surfaces with acetic acid. An in-situ XPS study was carried out by freezing the aqueous solution of nanoparticles to liquid nitrogen temperatures. Analysis of data collected concurrently from the ligands as well as the functionalized frozen cerium oxide nanoparticles shows that acetic acid binds to the ceria surface in both dissociated and molecular states with equal population over the surface. The cerium oxide surface was populated predominantly with Ce4+ ions, consistent with the thermal hydrolysis synthesis. DFT calculations reveal that acetate ions bind more strongly to the cerium oxide nanoparticles than water and can replace the hydration sphere of the nanoparticles, resulting in high acetate/acetic acid surface coverage. These findings reveal molecular-level interaction between the nanoparticle surfaces and ligands, giving a better understanding of how materials behave in their immediate aqueous environment. This study also proposes a simple and elegant methodology to directly study the surface functional groups attached to nanoparticles in their immediate aqueous environment.
NASA Astrophysics Data System (ADS)
Guglielmino, F.; Nunnari, G.; Puglisi, G.; Spata, A.
2009-04-01
We propose a new technique, based on elastic theory, to efficiently estimate three-dimensional surface displacement maps by integrating sparse Global Positioning System (GPS) measurements of deformation and Differential Interferometric Synthetic Aperture Radar (DInSAR) maps of movements of the Earth's surface. Previous methodologies in the literature for combining data from GPS and DInSAR surveys require two steps: first, sparse GPS measurements are interpolated to provide GPS displacements on the DInSAR grid; second, the three-dimensional surface displacement maps are estimated using a suitable optimization technique. One of the advantages of the proposed approach is that these two steps are unified. We propose a linear matrix equation that accounts for both GPS and DInSAR data and whose solution provides simultaneously the strain tensor, the displacement field, and the rigid body rotation tensor throughout the entire investigated area. This linear matrix equation is solved using Weighted Least Squares (WLS), which assures both numerical robustness and high computational efficiency. The proposed methodology was tested on both synthetic and experimental data, the latter from GPS and DInSAR measurements carried out on Mt. Etna. The goodness of the results was evaluated using standard errors. These tests also allowed the choice of specific parameters of the algorithm to be optimized. The "open" structure of the method will allow other available data sets, such as additional interferograms or other geodetic data (e.g., levelling, tilt, etc.), to be taken into account in the near future in order to achieve even higher accuracy.
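A hedged sketch of the core weighted-least-squares step: GPS component observations and a DInSAR line-of-sight (LOS) observation are stacked into one linear system G m = d and solved as m = (GᵀWG)⁻¹GᵀWd. In the full method the unknown vector also contains the strain and rigid-rotation tensors estimated from neighbouring stations; here only a local 3-D displacement is estimated, and the LOS vector, data, and weights are invented for illustration.

```python
# Hedged sketch: joint GPS + DInSAR weighted least squares for a local 3-D
# displacement (ue, un, uz). Numbers are illustrative, not Mt. Etna data.
import numpy as np

# Observations: GPS east/north/up (m) and one InSAR LOS displacement (m)
d = np.array([0.012, -0.004, 0.031, 0.024])

# Design matrix: GPS rows pick out components; the InSAR row projects the
# displacement onto an assumed right-looking LOS unit vector (east, north, up).
los = np.array([-0.38, -0.09, 0.92])
G = np.vstack([np.eye(3), los])

# Weights from assumed a-priori standard deviations
sigma = np.array([0.003, 0.003, 0.008, 0.005])   # GPS horiz/vert, InSAR (m)
W = np.diag(1.0 / sigma**2)

m = np.linalg.solve(G.T @ W @ G, G.T @ W @ d)    # WLS estimate
cov = np.linalg.inv(G.T @ W @ G)                 # formal covariance
print("ue, un, uz =", np.round(m, 4))
print("1-sigma errors =", np.round(np.sqrt(np.diag(cov)), 4))
```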
Teamwork in the Terminal Area: Organizational Issues and Solutions
NASA Technical Reports Server (NTRS)
Parke, Bonny K.; Kanki, Barbara G.; Rosekind, Mark (Technical Monitor)
1997-01-01
Dynamic growth and technology advances in commercial aviation have turned the terminal area into a complex, multi-organization workplace which requires the smooth coordination of many operational teams. In addition to pilots, cabin crew, air traffic controllers, and dispatch (who nominally work together throughout a flight), surface operations additionally involve local, ground and ramp controllers, ramp agents, maintenance, dozens of service contractors, and any number of teams who are responsible for airport operations. Under abnormal or emergency conditions, even more teams become actively involved. In order to accommodate growth and to meet productivity and safety challenges, numerous changes are being made in surface operations. Unfortunately, it is often the case that changes in technologies, organizational roles, procedures, and training are developed and implemented in isolated and piecemeal fashion without regard to cross organizational impact. Thus, there is a need for evaluation methodologies which assure integrated system safety for all organizations. Such methodologies should aid the understanding of how organizations work together and how changes in one domain affects the next. In this study, we develop one approach toward addressing these organizational issues. Examples of surface operations in abnormal situations are examined in regard to their impact on personnel in the terminal area. Timelines are given for the responses to incidents, along with the necessary communication links, the specific roles that members of terminal teams have, and any overlapping responsibilities. Suggestions to improve cross-operational teamwork are given. Methods of graphic representation are explored, both in regards to human links and access to information. The outcome of such an approach should enhance the understanding which is critical for resolving organizational conflicts and maximizing system effectiveness.
Rodriguez-Gonzalez, Pablo; Bouchet, Sylvain; Monperrus, Mathilde; Tessier, Emmanuel; Amouroux, David
2013-03-01
The fate of mercury (Hg) and tin (Sn) compounds in ecosystems is strongly determined by their alkylation/dealkylation pathways. However, the experimental determination of those transformations is still not straightforward, and methodologies need to be refined. The purpose of this work is the development of a comprehensive and adaptable tool for an accurate experimental assessment of specific formation/degradation yields and half-lives of elemental species in different aquatic environments. The methodology combines field incubations of coastal waters and surface sediments with the addition of species-specific isotopically enriched tracers and a mathematical approach based on the deconvolution of isotopic patterns. The method has been applied to the study of the environmental reactivity of Hg and Sn compounds in coastal water and surface sediment samples collected in two different coastal ecosystems of the South French Atlantic Coast (Arcachon Bay and Adour Estuary). Both the level of isotopically enriched species and the spiking solution composition were found to alter dibutyltin and monomethylmercury degradation yields, while no significant changes were measurable for tributyltin and Hg(II). For butyltin species, the presence of light was found to be the main source of degradation and removal of these contaminants from surface coastal environments. In contrast, photomediated processes do not significantly influence either the methylation of mercury or the demethylation of methylmercury. The proposed method constitutes an advancement over previous element-specific isotopic tracer approaches, allowing, for instance, discrimination between the extent of net and oxidative Hg demethylation and identification of which debutylation step controls the environmental persistence of butyltin compounds.
Theory and Methodology in Researching Emotions in Education
ERIC Educational Resources Information Center
Zembylas, Michalinos
2007-01-01
Differing theoretical approaches to the study of emotions are presented: emotions as private (psychodynamic approaches); emotions as sociocultural phenomena (social constructionist approaches); and a third perspective (interactionist approaches) transcending these two. These approaches have important methodological implications in studying…
Yu, Xiao-cui; Liu, Gao-feng; Wang, Xin
2011-02-01
To optimize the preparation process of wumeitougu oral liquid (WTOL) by response surface methodology. Based on single-factor tests, the number of extractions, the alcohol precipitation concentration, and the pH value were selected as the three factors for a Box-Behnken central composite design. Response surface methodology was used to optimize the preparation parameters. Under the conditions of an extraction time of 1.5 h, 2.772 extractions, a relative density of 1.12, an alcohol precipitation concentration of 68.704%, and a pH value of 5.0, the theoretical highest content of Asperosaponin VI was 549.908 mg/L. Considering the practical situation, the conditions were amended to three extractions, an alcohol precipitation concentration of 69%, and a pH value of 5.0, and the measured content of Asperosaponin VI was 548.63 mg/L, which was close to the theoretical value. The preparation process of WTOL optimized by response surface methodology is reasonable and feasible.
Methodological Approaches in MOOC Research: Retracing the Myth of Proteus
ERIC Educational Resources Information Center
Raffaghelli, Juliana Elisa; Cucchiara, Stefania; Persico, Donatella
2015-01-01
This paper explores the methodological approaches most commonly adopted in the scholarly literature on Massive Open Online Courses (MOOCs), published during the period January 2008-May 2014. In order to identify trends, gaps and criticalities related to the methodological approaches of this emerging field of research, we analysed 60 papers…
Microengineering neocartilage scaffolds.
Petersen, Erik F; Spencer, Richard G S; McFarland, Eric W
2002-06-30
Advances in micropatterning methodologies have made it possible to create structures with precise architecture on the surface of cell culture substrata. We applied these techniques to fabricate microfeatures (15-65 microm wide; 40 microm deep) on the surface of a flexible, biocompatible polysaccharide gel. The micropatterned polymer gels were subsequently applied as scaffolds for chondrocyte culture and proved effective in maintaining key aspects of the chondrogenic phenotype. These were rounded cell morphology and a positive and statistically significant (p < 0.0001) immunofluorescence assay for the production of type II collagen throughout the maximum culture time of 10 days after cell seeding. Further, cells housed within individual surface features were observed to proliferate, while serial application of chondrocytes resulted in the formation of cellular aggregates. These methods represent a novel approach to the problem of engineering reparative cartilage in vitro. Copyright 2002 Wiley Periodicals, Inc.
Economic method for helical gear flank surface characterisation
NASA Astrophysics Data System (ADS)
Koulin, G.; Reavie, T.; Frazer, R. C.; Shaw, B. A.
2018-03-01
Typically the quality of a gear pair is assessed based on simplified geometric tolerances which do not always correlate with functional performance. In order to identify and quantify functional performance based parameters, further development of the gear measurement approach is required. A methodology for interpolation of the full active helical gear flank surface from sparse line measurements is presented. The method seeks to identify the minimum number of line measurements required to sufficiently characterise an active gear flank. In the form ground gear example presented, a single helix and three profile line measurements were considered acceptable. The resulting surfaces can be used to simulate the meshing engagement of a gear pair and therefore provide insight into functional performance based parameters. Therefore, the assessment of quality can be based on the predicted performance in the context of an application.
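A hedged sketch of one way to interpolate a full flank map from sparse lines: fit a low-order bivariate polynomial to deviations sampled along one helix line and three profile lines in a normalized (profile, face-width) parameter plane, then evaluate it anywhere on the active flank. The parameterization, polynomial degree, and data are illustrative assumptions, not the authors' method.

```python
# Hedged sketch: bivariate polynomial fit of flank deviations from sparse lines.
import numpy as np

def poly_design(u, v, deg=3):
    """Design matrix with terms u^i * v^j for i + j <= deg."""
    cols = [u**i * v**j for i in range(deg + 1) for j in range(deg + 1 - i)]
    return np.column_stack(cols)

# Measurement lines in normalized flank coordinates (u: profile, v: face width)
u_prof = np.linspace(0.0, 1.0, 50)
profiles = [(u_prof, np.full(50, v0)) for v0 in (0.1, 0.5, 0.9)]   # 3 profile lines
v_helix = np.linspace(0.0, 1.0, 50)
helix = [(np.full(50, 0.5), v_helix)]                              # 1 helix line

u = np.concatenate([p[0] for p in profiles + helix])
v = np.concatenate([p[1] for p in profiles + helix])
# Dummy measured deviations (micrometres): gentle twist plus noise
rng = np.random.default_rng(1)
dev = 4.0 * (u - 0.5) * (v - 0.5) + 1.5 * u**2 + rng.normal(0.0, 0.05, u.size)

coef, *_ = np.linalg.lstsq(poly_design(u, v), dev, rcond=None)

# Evaluate the reconstructed flank on a dense grid (e.g. for meshing simulation)
uu, vv = np.meshgrid(np.linspace(0, 1, 80), np.linspace(0, 1, 80))
flank = (poly_design(uu.ravel(), vv.ravel()) @ coef).reshape(uu.shape)
print(flank.shape)
```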
Mechanical modulation method for ultrasensitive phase measurements in photonics biosensing.
Patskovsky, S; Maisonneuve, M; Meunier, M; Kabashin, A V
2008-12-22
A novel polarimetry methodology for phase-sensitive measurements in single reflection geometry is proposed for applications in optical transduction-based biological sensing. The methodology uses alternating step-like chopper-based mechanical phase modulation for the orthogonal s- and p-polarizations of light reflected from the sensing interface and the extraction of phase information at different harmonics of the modulation. We show that even under a relatively simple experimental arrangement, the methodology provides a resolution of phase measurements as low as 0.007 deg. We also examine the proposed approach using Total Internal Reflection (TIR) and Surface Plasmon Resonance (SPR) geometries. For TIR geometry, the response appears to be strongly dependent on the prism material, with the best values for high-refractive-index Si. The detection limit for Si-based TIR is estimated as 10(-5) in terms of Refractive Index Unit (RIU) change. SPR geometry offers a much stronger phase response due to its much sharper phase characteristic. With a detection limit of 3.2*10(-7) RIU, the proposed methodology provides one of the best sensitivities among phase-sensitive SPR devices. Advantages of the proposed method include high sensitivity, simplicity of the experimental setup, and noise immunity as a result of the high-stability modulation.
Gubskaya, Anna V.; Khan, I. John; Valenzuela, Loreto M.; Lisnyak, Yuriy V.; Kohn, Joachim
2013-01-01
The objectives of this work were: (1) to select suitable compositions of tyrosine-derived polycarbonates for controlled delivery of voclosporin, a potent drug candidate to treat ocular diseases, (2) to establish a structure-function relationship between key molecular characteristics of biodegradable polymer matrices and drug release kinetics, and (3) to identify factors contributing in the rate of drug release. For the first time, the experimental study of polymeric drug release was accompanied by a hierarchical sequence of three computational methods. First, suitable polymer compositions used in subsequent neural network modeling were determined by means of response surface methodology (RSM). Second, accurate artificial neural network (ANN) models were built to predict drug release profiles for fifteen polymers located outside the initial design space. Finally, thermodynamic properties and hydrogen-bonding patterns of model drug-polymer complexes were studied using molecular dynamics (MD) technique to elucidate a role of specific interactions in drug release mechanism. This research presents further development of methodological approaches to meet challenges in the design of polymeric drug delivery systems. PMID:24039300
NASA Astrophysics Data System (ADS)
Cheyney, S.; Fishwick, S.; Hill, I. A.; Linford, N. T.
2015-08-01
Despite the development of advanced processing and interpretation tools for magnetic data sets in the fields of mineral and hydrocarbon industries, these methods have not achieved similar levels of adoption for archaeological or very near surface surveys. Using a synthetic data set we demonstrate that certain methodologies and assumptions used to successfully invert more regional-scale data can lead to large discrepancies between the true and recovered depths when applied to archaeological-type anomalies. We propose variations to the current approach, analysing the choice of the depth-weighting function, mesh design and parameter constraints, to develop an appropriate technique for the 3-D inversion of archaeological-scale data sets. The results show a successful recovery of a synthetic scenario, as well as a case study of a Romano-Celtic temple in the UK. For the case study, the final susceptibility model is compared with two coincident ground penetrating radar surveys, showing a high correlation with the comparative depth slices. The new approach takes interpretation of archaeological data sets beyond a simple 2-D visual interpretation based on pattern recognition.
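For background, the depth-weighting function discussed above is commonly written, in Li and Oldenburg-style susceptibility inversion, in the form below; the abstract analyses the choice of this function without reproducing it, so the expression and the typical exponent are stated here as assumptions rather than the authors' final choice.

```latex
% Common depth-weighting form in 3-D susceptibility inversion (after
% Li & Oldenburg). z_j is the cell depth, z_0 an offset related to the
% observation height and cell size, and beta is often taken near 3 for
% magnetic data so the weighting mimics the decay of a dipole field.
w(z_j) \;=\; \frac{1}{\left(z_j + z_0\right)^{\beta/2}}
```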
NASA Astrophysics Data System (ADS)
Vieceli, Nathália; Nogueira, Carlos A.; Pereira, Manuel F. C.; Durão, Fernando O.; Guimarães, Carlos; Margarido, Fernanda
2018-01-01
The recovery of lithium from hard rock minerals has received increased attention given the high demand for this element. Therefore, this study optimized an innovative process, which does not require a high-temperature calcination step, for lithium extraction from lepidolite. Mechanical activation and acid digestion were suggested as crucial process parameters, and experimental design and response-surface methodology were applied to model and optimize the proposed lithium extraction process. The promoting effect of amorphization and the formation of lithium sulfate hydrate on lithium extraction yield were assessed. Several factor combinations led to extraction yields that exceeded 90%, indicating that the proposed process is an effective approach for lithium recovery.
Advancing the integration of spatial data to map human and natural drivers on coral reefs
Gove, Jamison M.; Walecka, Hilary R.; Donovan, Mary K.; Williams, Gareth J.; Jouffray, Jean-Baptiste; Crowder, Larry B.; Erickson, Ashley; Falinski, Kim; Friedlander, Alan M.; Kappel, Carrie V.; Kittinger, John N.; McCoy, Kaylyn; Norström, Albert; Nyström, Magnus; Oleson, Kirsten L. L.; Stamoulis, Kostantinos A.; White, Crow; Selkoe, Kimberly A.
2018-01-01
A major challenge for coral reef conservation and management is understanding how a wide range of interacting human and natural drivers cumulatively impact and shape these ecosystems. Despite the importance of understanding these interactions, a methodological framework to synthesize spatially explicit data of such drivers is lacking. To fill this gap, we established a transferable data synthesis methodology to integrate spatial data on environmental and anthropogenic drivers of coral reefs, and applied this methodology to a case study location–the Main Hawaiian Islands (MHI). Environmental drivers were derived from time series (2002–2013) of climatological ranges and anomalies of remotely sensed sea surface temperature, chlorophyll-a, irradiance, and wave power. Anthropogenic drivers were characterized using empirically derived and modeled datasets of spatial fisheries catch, sedimentation, nutrient input, new development, habitat modification, and invasive species. Within our case study system, resulting driver maps showed high spatial heterogeneity across the MHI, with anthropogenic drivers generally greatest and most widespread on O‘ahu, where 70% of the state’s population resides, while sedimentation and nutrients were dominant in less populated islands. Together, the spatial integration of environmental and anthropogenic driver data described here provides a first-ever synthetic approach to visualize how the drivers of coral reef state vary in space and demonstrates a methodological framework for implementation of this approach in other regions of the world. By quantifying and synthesizing spatial drivers of change on coral reefs, we provide an avenue for further research to understand how drivers determine reef diversity and resilience, which can ultimately inform policies to protect coral reefs. PMID:29494613
A new technique for the measurement of surface shear stress vectors using liquid crystal coatings
NASA Technical Reports Server (NTRS)
Reda, Daniel C.; Muratore, J. J., Jr.
1994-01-01
Research has recently shown that liquid crystal coating (LCC) color-change response to shear depends on both shear stress magnitude and direction. Additional research was thus conducted to extend the LCC method from a flow-visualization tool to a surface shear stress vector measurement technique. A shear-sensitive LCC was applied to a planar test surface and illuminated by white light from the normal direction. A fiber optic probe was used to capture light scattered by the LCC from a point on the centerline of a turbulent, tangential-jet flow. Both the relative shear stress magnitude and the relative in-plane view angle between the sensor and the centerline shear vector were systematically varied. A spectrophotometer was used to obtain scattered-light spectra which were used to quantify the LCC color (dominant wavelength) as a function of shear stress magnitude and direction. At any fixed shear stress magnitude, the minimum dominant wavelength was measured when the shear vector was aligned with and directed away from the observer; changes in the relative in-plane view angle to either side of this vector/observer aligned position resulted in symmetric Gaussian increases in measured dominant wavelength. Based on these results, a vector measurement methodology, involving multiple oblique-view observations of the test surface, was formulated. Under present test conditions, the measurement resolution of this technique was found to be +/- 1 deg for vector orientations and +/- 5% for vector magnitudes. An approach to extend the present methodology to full-surface applications is proposed.
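The vector-recovery idea described above can be sketched as follows: because the dominant wavelength is minimum when the observer looks along the shear vector and rises symmetrically to either side, an inverted-Gaussian fit of wavelength against in-plane view angle yields the vector direction. The functional form and the numbers are an illustrative reading of the abstract, not the paper's calibration.

```python
# Illustrative sketch: recover the shear-vector direction from dominant
# wavelength versus in-plane view angle using an inverted-Gaussian fit.
import numpy as np
from scipy.optimize import curve_fit

def dominant_wavelength(phi_deg, lam_max, amp, phi_vec, width):
    return lam_max - amp * np.exp(-((phi_deg - phi_vec) ** 2) / (2 * width ** 2))

phi_obs = np.arange(-60, 61, 10.0)                        # oblique-view angles (deg)
true = dominant_wavelength(phi_obs, 610.0, 40.0, 12.0, 35.0)
lam_obs = true + np.random.default_rng(1).normal(0, 1.0, phi_obs.size)

p0 = [lam_obs.max(), np.ptp(lam_obs), phi_obs[np.argmin(lam_obs)], 30.0]
popt, _ = curve_fit(dominant_wavelength, phi_obs, lam_obs, p0=p0)
print(f"recovered shear-vector direction: {popt[2]:.1f} deg")
```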
NASA Astrophysics Data System (ADS)
Hazwan, M. H. M.; Shayfull, Z.; Sharif, S.; Nasir, S. M.; Zainal, N.
2017-09-01
In the injection moulding process, quality and productivity are notably important and must be controlled for each product type produced. Quality is measured as the extent of warpage of the moulded parts, while productivity is measured as the duration of the moulding cycle time. To control quality, many researchers have introduced various optimisation approaches which have been proven to enhance the quality of the moulded part produced. To improve the productivity of the injection moulding process, some researchers have proposed the application of conformal cooling channels, which have been proven to reduce the duration of the moulding cycle time. Therefore, this paper presents an application of an alternative optimisation approach, Response Surface Methodology (RSM) with Glowworm Swarm Optimisation (GSO), on a moulded part with straight-drilled and conformal cooling channel moulds. This study examined the warpage of the moulded parts before and after the optimisation work was applied for both cooling channel designs. A front panel housing was selected as the specimen, and the performance of the proposed optimisation approach was analysed for the conventional straight-drilled cooling channels and compared with the Milled Groove Square Shape (MGSS) conformal cooling channels by simulation analysis using Autodesk Moldflow Insight (AMI) 2013. Based on the results, melt temperature is the most significant factor contributing to warpage, and warpage was reduced by 39.1% after optimisation for the straight-drilled cooling channels; cooling time is the most significant factor contributing to warpage, and warpage was reduced by 38.7% after optimisation for the MGSS conformal cooling channels. In addition, the findings show that applying the optimisation work to the conformal cooling channels offers better quality and productivity of the moulded part produced.
Munthe-Kaas, Heather; Bohren, Meghan A; Glenton, Claire; Lewin, Simon; Noyes, Jane; Tunçalp, Özge; Booth, Andrew; Garside, Ruth; Colvin, Christopher J; Wainwright, Megan; Rashidian, Arash; Flottorp, Signe; Carlsen, Benedicte
2018-01-25
The GRADE-CERQual (Confidence in Evidence from Reviews of Qualitative research) approach has been developed by the GRADE (Grading of Recommendations Assessment, Development and Evaluation) Working Group. The approach has been developed to support the use of findings from qualitative evidence syntheses in decision-making, including guideline development and policy formulation. CERQual includes four components for assessing how much confidence to place in findings from reviews of qualitative research (also referred to as qualitative evidence syntheses): (1) methodological limitations, (2) coherence, (3) adequacy of data and (4) relevance. This paper is part of a series providing guidance on how to apply CERQual and focuses on CERQual's methodological limitations component. We developed the methodological limitations component by searching the literature for definitions, gathering feedback from relevant research communities and developing consensus through project group meetings. We tested the CERQual methodological limitations component within several qualitative evidence syntheses before agreeing on the current definition and principles for application. When applying CERQual, we define methodological limitations as the extent to which there are concerns about the design or conduct of the primary studies that contributed evidence to an individual review finding. In this paper, we describe the methodological limitations component and its rationale and offer guidance on how to assess methodological limitations of a review finding as part of the CERQual approach. This guidance outlines the information required to assess methodological limitations component, the steps that need to be taken to assess methodological limitations of data contributing to a review finding and examples of methodological limitation assessments. This paper provides guidance for review authors and others on undertaking an assessment of methodological limitations in the context of the CERQual approach. More work is needed to determine which criteria critical appraisal tools should include when assessing methodological limitations. We currently recommend that whichever tool is used, review authors provide a transparent description of their assessments of methodological limitations in a review finding. We expect the CERQual approach and its individual components to develop further as our experiences with the practical implementation of the approach increase.
Yi, S.; Li, N.; Xiang, B.; Wang, X.; Ye, B.; McGuire, A.D.
2013-01-01
Soil surface temperature is a critical boundary condition for the simulation of soil temperature by environmental models. It is influenced by atmospheric and soil conditions and by vegetation cover. In sophisticated land surface models, it is simulated iteratively by solving surface energy budget equations. In ecosystem, permafrost, and hydrology models, the consideration of soil surface temperature is generally simple. In this study, we developed a methodology for representing the effects of vegetation cover and atmospheric factors on the estimation of soil surface temperature for alpine grassland ecosystems on the Qinghai-Tibetan Plateau. Our approach integrated measurements from meteorological stations with simulations from a sophisticated land surface model to develop an equation set for estimating soil surface temperature. After implementing this equation set into an ecosystem model and evaluating the performance of the ecosystem model in simulating soil temperature at different depths in the soil profile, we applied the model to simulate interactions among vegetation cover, freeze-thaw cycles, and soil erosion to demonstrate potential applications made possible through the implementation of the methodology developed in this study. Results showed that (1) to properly estimate daily soil surface temperature, algorithms should use air temperature, downward solar radiation, and vegetation cover as independent variables; (2) the equation set developed in this study performed better than soil surface temperature algorithms used in other models; and (3) the ecosystem model performed well in simulating soil temperature throughout the soil profile using the equation set developed in this study. Our application of the model indicates that the representation in ecosystem models of the effects of vegetation cover on the simulation of soil thermal dynamics has the potential to substantially improve our understanding of the vulnerability of alpine grassland ecosystems to changes in climate and grazing regimes.
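The abstract above states that daily soil surface temperature should be estimated from air temperature, downward solar radiation, and vegetation cover, but does not reproduce the equation set. The sketch below fits a simple linear form of that kind by least squares; the functional form, synthetic data, and coefficients are placeholders, not the paper's equations.

```python
# Minimal sketch of the kind of regression implied above: daily soil surface
# temperature as a function of air temperature, downward solar radiation and
# vegetation cover. The linear form and all data are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
n = 365
t_air = 5 + 15 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 2, n)    # deg C
rad = 150 + 120 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 20, n)  # W m-2
fveg = np.clip(0.2 + 0.6 * np.sin(2 * np.pi * (np.arange(n) - 60) / 365), 0, 1)  # cover fraction
t_surf = 1.1 * t_air + 0.02 * rad - 3.0 * fveg + rng.normal(0, 1, n)              # synthetic "truth"

A = np.column_stack([np.ones(n), t_air, rad, fveg])
coef, *_ = np.linalg.lstsq(A, t_surf, rcond=None)
print("fitted coefficients [intercept, T_air, R_down, f_veg]:", np.round(coef, 3))
```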
NASA Astrophysics Data System (ADS)
Yi, S.; Li, N.; Xiang, B.; Wang, X.; Ye, B.; McGuire, A. D.
2013-07-01
Soil surface temperature is a critical boundary condition for the simulation of soil temperature by environmental models. It is influenced by atmospheric and soil conditions and by vegetation cover. In sophisticated land surface models, it is simulated iteratively by solving surface energy budget equations. In ecosystem, permafrost, and hydrology models, the consideration of soil surface temperature is generally simple. In this study, we developed a methodology for representing the effects of vegetation cover and atmospheric factors on the estimation of soil surface temperature for alpine grassland ecosystems on the Qinghai-Tibetan Plateau. Our approach integrated measurements from meteorological stations with simulations from a sophisticated land surface model to develop an equation set for estimating soil surface temperature. After implementing this equation set into an ecosystem model and evaluating the performance of the ecosystem model in simulating soil temperature at different depths in the soil profile, we applied the model to simulate interactions among vegetation cover, freeze-thaw cycles, and soil erosion to demonstrate potential applications made possible through the implementation of the methodology developed in this study. Results showed that (1) to properly estimate daily soil surface temperature, algorithms should use air temperature, downward solar radiation, and vegetation cover as independent variables; (2) the equation set developed in this study performed better than soil surface temperature algorithms used in other models; and (3) the ecosystem model performed well in simulating soil temperature throughout the soil profile using the equation set developed in this study. Our application of the model indicates that the representation in ecosystem models of the effects of vegetation cover on the simulation of soil thermal dynamics has the potential to substantially improve our understanding of the vulnerability of alpine grassland ecosystems to changes in climate and grazing regimes.
A novel method to scale up fungal endophyte isolations
USDA-ARS?s Scientific Manuscript database
Estimations of species diversity are influenced by sampling intensity which in turn is influenced by methodology. For fungal endophyte diversity studies, the methodology includes surface-sterilization prior to isolation of endophytes. Surface-sterilization is an essential component of fungal endophy...
Stochastic response surface methodology: A study in the human health area
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveira, Teresa A., E-mail: teresa.oliveira@uab.pt; Oliveira, Amílcar, E-mail: amilcar.oliveira@uab.pt; Centro de Estatística e Aplicações, Universidade de Lisboa
2015-03-10
In this paper we review Stochastic Response Surface Methodology as a tool for modeling uncertainty in the context of Risk Analysis. An application to survival analysis in the breast cancer context is implemented using R software.
NASA Astrophysics Data System (ADS)
Han, Shin-Chan; Razeghi, S. Mahdiyeh
2017-11-01
We present a methodology to invert a regional set of vertical displacement data from Global Positioning System (GPS) to determine the surface mass redistribution. It is assumed that GPS deformation is a result of the Earth's elastic response to the surface mass load of hydrology, atmosphere, and/or ocean. We develop an algorithm to estimate the spectral information of displacements from "regional" GPS data through regional spherical (Slepian) basis functions and apply the load Love numbers to estimate the mass load. The same approach is applied to determine global mass changes from "global" geopotential change data of Gravity Recovery and Climate Experiment (GRACE). We rigorously examine all systematic errors caused by various truncations (spherical harmonic series and Slepian series) and the smoothing constraint applied to the GPS inversion. We demonstrate the technique by processing 16 years of daily vertical motions determined from 114 GPS stations in Australia. The GPS-inverted surface mass changes are validated against GRACE data, atmosphere and ocean models, and a land surface model. Seasonal and interannual terrestrial mass variations from GPS are in good agreement with GRACE data and the water storage models. The GPS recovery compares better with the water storage model around the smaller coastal basins than two different GRACE solutions. The submonthly mass changes from GPS provide meaningful results agreeing with atmospheric mass changes in central Australia. Finally, it is suggested to integrate GPS and GRACE data to draw a comprehensive picture of daily mass changes on different continents.
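The elastic-loading relation underlying the inversion described above is the standard degree-by-degree scaling between a surface-density field and the vertical displacement it induces through the load Love numbers; a hedged sketch follows, with illustrative Love-number values rather than the paper's.

```python
# Sketch of the standard elastic-loading relation: a surface density expanded in
# spherical harmonics, sigma_nm, produces vertical-displacement coefficients
# u_nm = 3 * h_n' / (rho_earth * (2n + 1)) * sigma_nm. The load Love number
# value below is illustrative only.
import numpy as np

RHO_EARTH = 5517.0                      # mean density of the Earth, kg m-3

def vertical_from_load(sigma_nm, h_prime):
    """Scale surface-density SH coefficients (kg m-2) to vertical-displacement
    coefficients (m), degree by degree. sigma_nm[n][m] indexing assumed."""
    u_nm = {}
    for n, coeffs in sigma_nm.items():
        factor = 3.0 * h_prime[n] / (RHO_EARTH * (2 * n + 1))
        u_nm[n] = {m: factor * c for m, c in coeffs.items()}
    return u_nm

# Toy example: a degree-2, order-0 load of 100 kg m-2 (about 10 cm of water)
sigma = {2: {0: 100.0}}
h_prime = {2: -1.001}                   # illustrative load Love number
print(vertical_from_load(sigma, h_prime))
```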
Staphylococcus aureus and Staphylococcus epidermidis infections on implants.
Oliveira, W F; Silva, P M S; Silva, R C S; Silva, G M M; Machado, G; Coelho, L C B B; Correia, M T S
2018-02-01
Infections are one of the main reasons for removal of implants from patients, and usually need difficult and expensive treatments. Staphylococcus aureus and Staphylococcus epidermidis are the most frequently detected pathogens. We reviewed the epidemiology and pathogenesis of implant-related infections. Relevant studies were identified by electronic searching of the following databases: PubMed, ScienceDirect, Academic Google, and CAPES Journal Portal. This review reports epidemiological studies of implant infections caused by S. aureus and S. epidermidis. We discuss some methodologies used in the search for new compounds with antibiofilm activity and the main strategies for biomaterial surface modifications to avoid bacterial plaque formation and consequent infection. S. aureus and S. epidermidis are frequently involved in infections in catheters and orthopaedic/breast implants. Different methodologies have been used to test the potential antibiofilm properties of compounds; for example, crystal violet dye is widely used for in-vitro biofilm quantification due to its low cost and good reproducibility. Changes in the surface biomaterials are necessary to prevent biofilm formation. Some studies have investigated the immobilization of antibiotics on the surfaces of materials used in implants. Other approaches have been used as a way to avoid the spread of bacterial resistance to antimicrobials, such as the functionalization of these surfaces with silver and natural compounds, as well as the electrical treatment of these substrates. Copyright © 2017 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
Foundations and latest advances in replica exchange transition interface sampling.
Cabriolu, Raffaela; Skjelbred Refsnes, Kristin M; Bolhuis, Peter G; van Erp, Titus S
2017-10-21
Nearly 20 years ago, transition path sampling (TPS) emerged as an alternative method to free energy based approaches for the study of rare events such as nucleation, protein folding, chemical reactions, and phase transitions. TPS effectively performs Monte Carlo simulations with relatively short molecular dynamics trajectories, with the advantage of not having to alter the actual potential energy surface nor the underlying physical dynamics. Although the TPS approach also introduced a methodology to compute reaction rates, this approach was for a long time considered theoretically attractive, providing the exact same results as extensively long molecular dynamics simulations, but still expensive for most relevant applications. With the increase of computer power and improvements in the algorithmic methodology, quantitative path sampling is finding applications in more and more areas of research. In particular, the transition interface sampling (TIS) and the replica exchange TIS (RETIS) algorithms have, in turn, improved the efficiency of quantitative path sampling significantly, while maintaining the exact nature of the approach. Also, open-source software packages are making these methods, for which implementation is not straightforward, now available for a wider group of users. In addition, a blooming development takes place regarding both applications and algorithmic refinements. Therefore, it is timely to explore the wide panorama of the new developments in this field. This is the aim of this article, which focuses on the most efficient exact path sampling approach, RETIS, as well as its recent applications, extensions, and variations.
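The quantitative path-sampling rate referred to above is obtained, in transition interface sampling, by factorising the rate constant into the flux through the first interface and a product of conditional crossing probabilities; the sketch below evaluates that product with illustrative numbers, not results from the paper.

```python
# Sketch of the TIS rate expression: k_AB = f_A * prod_i P(lambda_{i+1} | lambda_i),
# i.e. the flux through the first interface times the product of conditional
# crossing probabilities between successive interfaces. Numbers are illustrative.
import numpy as np

flux_first_interface = 2.5e-3       # crossings of lambda_0 per unit time (illustrative)
crossing_probs = np.array([0.31, 0.22, 0.18, 0.25, 0.40])   # P(lambda_{i+1} | lambda_i)

rate_A_to_B = flux_first_interface * np.prod(crossing_probs)
print(f"k_AB = {rate_A_to_B:.3e} per unit time")
```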
Foundations and latest advances in replica exchange transition interface sampling
NASA Astrophysics Data System (ADS)
Cabriolu, Raffaela; Skjelbred Refsnes, Kristin M.; Bolhuis, Peter G.; van Erp, Titus S.
2017-10-01
Nearly 20 years ago, transition path sampling (TPS) emerged as an alternative method to free energy based approaches for the study of rare events such as nucleation, protein folding, chemical reactions, and phase transitions. TPS effectively performs Monte Carlo simulations with relatively short molecular dynamics trajectories, with the advantage of not having to alter the actual potential energy surface nor the underlying physical dynamics. Although the TPS approach also introduced a methodology to compute reaction rates, this approach was for a long time considered theoretically attractive, providing the exact same results as extensively long molecular dynamics simulations, but still expensive for most relevant applications. With the increase of computer power and improvements in the algorithmic methodology, quantitative path sampling is finding applications in more and more areas of research. In particular, the transition interface sampling (TIS) and the replica exchange TIS (RETIS) algorithms have, in turn, improved the efficiency of quantitative path sampling significantly, while maintaining the exact nature of the approach. Also, open-source software packages are making these methods, for which implementation is not straightforward, now available for a wider group of users. In addition, a blooming development takes place regarding both applications and algorithmic refinements. Therefore, it is timely to explore the wide panorama of the new developments in this field. This is the aim of this article, which focuses on the most efficient exact path sampling approach, RETIS, as well as its recent applications, extensions, and variations.
Hwang, Seung Hwan; Kwon, Shin Hwa; Wang, Zhiqiang; Kim, Tae Hyun; Kang, Young-Hee; Lee, Jae-Yong; Lim, Soon Sung
2016-08-26
Protein tyrosine phosphatase, expressed in insulin-sensitive tissues (such as liver, muscle, and adipose tissue), has a key role in the regulation of insulin signaling and pathway activation, making it a promising target for the treatment of type 2 diabetes mellitus and obesity; response surface methodology (RSM) is an effective statistical technique for optimizing complex processes using a multivariate approach. In this study, Zea mays L. (purple corn kernel, PCK) and its constituents were investigated for protein tyrosine phosphatase 1β (PTP1β) inhibitory activity, including an enzyme kinetic study, and four extraction parameters (temperature, time, solid-liquid ratio, and solvent volume) were optimized by RSM to improve the total yields of anthocyanins and polyphenols. Seven polyphenols and five anthocyanins were isolated, guided by the PTP1β assay. Among them, cyanidin-3-(6"-malonylglucoside) and 3'-methoxyhirsutrin showed the highest PTP1β inhibition, with IC50 values of 54.06 and 64.04 μM, respectively. A total polyphenol content (TPC) of 4.52 mg gallic acid equivalent/g (GAE/g) and a total anthocyanin content (TAC) of 43.02 mg cyanidin-3-glucoside equivalent/100 g (C3GE/100 g) were extracted at 40 °C for 8 h with a 33% solid-liquid ratio and a 1:15 solvent volume; these yields were similar to the predicted values of 4.58 mg GAE/g TPC and 42.28 mg C3GE/100 g TAC. These results indicate that PCK, 3'-methoxyhirsutrin and cyanidin-3-(6"-malonylglucoside) may be active natural compounds and that their recovery can be improved by optimizing the extraction process using response surface methodology.
NASA Technical Reports Server (NTRS)
Smith, Andrew; LaVerde, Bruce; Fulcher, Clay; Hunt, Ron
2012-01-01
An approach for predicting the vibration, strain, and force responses of a flight-like vehicle panel assembly to acoustic pressures is presented. Important validation for the approach is provided by comparison to ground test measurements in a reverberant chamber. The test article and the corresponding analytical model were assembled in several configurations to demonstrate the suitability of the approach for response predictions when the vehicle panel is integrated with equipment. Critical choices in the analysis necessary for convergence of the predicted and measured responses are illustrated through sensitivity studies. The methodology includes representation of spatial correlation of the pressure field over the panel surface. Therefore, it is possible to demonstrate the effects of hydrodynamic coincidence in the response. The sensitivity to pressure patch density clearly illustrates the onset of coincidence effects on the panel response predictions.
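The spatial correlation of the pressure field mentioned above is often modelled, for a diffuse reverberant field, by the sinc function of the wavenumber-distance product; the sketch below builds such a correlation matrix over pressure patches. The diffuse-field sinc model is a common choice for reverberant-chamber loading, not necessarily the exact representation used by the authors.

```python
# Sketch of a commonly used diffuse-field model for the spatial correlation of
# the pressure field over the panel: R(r) = sin(k r) / (k r) between patch
# centres, assembled into a correlation matrix for response prediction.
import numpy as np

def diffuse_field_correlation(patch_xy, frequency_hz, c_air=343.0):
    k = 2 * np.pi * frequency_hz / c_air
    d = np.linalg.norm(patch_xy[:, None, :] - patch_xy[None, :, :], axis=-1)
    return np.sinc(k * d / np.pi)        # np.sinc(x) = sin(pi x) / (pi x)

# 4 x 4 grid of pressure patches on a 0.6 m x 0.6 m panel (illustrative)
xv, yv = np.meshgrid(np.linspace(0, 0.6, 4), np.linspace(0, 0.6, 4))
patches = np.column_stack([xv.ravel(), yv.ravel()])
R = diffuse_field_correlation(patches, frequency_hz=500.0)
print(R.shape, round(R[0, 1], 3))
```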
Vander Zanden, Hannah B.; Tucker, Anton D.; Hart, Kristen M.; Lamont, Margaret M.; Fujisaki, Ikuko; Addison, David S.; Mansfield, Katherine L.; Phillips, Katrina F.; Wunder, Michael B.; Bowen, Gabriel J.; Pajuelo, Mariela; Bolten, Alan B.; Bjorndal, Karen A.
2015-01-01
Stable isotope analysis is a useful tool to track animal movements in both terrestrial and marine environments. These intrinsic markers are assimilated through the diet and may exhibit spatial gradients as a result of biogeochemical processes at the base of the food web. In the marine environment, maps to predict the spatial distribution of stable isotopes are limited, and thus determining geographic origin has been reliant upon integrating satellite telemetry and stable isotope data. Migratory sea turtles regularly move between foraging and reproductive areas. Whereas most nesting populations can be easily accessed and regularly monitored, little is known about the demographic trends in foraging populations. The purpose of the present study was to examine migration patterns of loggerhead nesting aggregations in the Gulf of Mexico (GoM), where sea turtles have been historically understudied. Two methods of geographic assignment using stable isotope values in known-origin samples from satellite telemetry were compared: 1) a nominal approach through discriminant analysis and 2) a novel continuous-surface approach using bivariate carbon and nitrogen isoscapes (isotopic landscapes) developed for this study. Tissue samples for stable isotope analysis were obtained from 60 satellite-tracked individuals at five nesting beaches within the GoM. Both methodological approaches for assignment resulted in high accuracy of foraging area determination, though each has advantages and disadvantages. The nominal approach is more appropriate when defined boundaries are necessary, but up to 42% of the individuals could not be considered in this approach. All individuals can be included in the continuous-surface approach, and individual results can be aggregated to identify geographic hotspots of foraging area use, though the accuracy rate was lower than nominal assignment. The methodological validation provides a foundation for future sea turtle studies in the region to inexpensively determine geographic origin for large numbers of untracked individuals. Regular monitoring of sea turtle nesting aggregations with stable isotope sampling can be used to fill critical data gaps regarding habitat use and migration patterns. Probabilistic assignment to origin with isoscapes has not been previously used in the marine environment, but the methods presented here could also be applied to other migratory marine species.
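The continuous-surface assignment described above can be sketched as follows: for each grid cell of a bivariate isoscape, the likelihood of an individual's carbon and nitrogen isotope values is evaluated against the cell's predicted values and normalised into a posterior probability surface. The isoscape arrays and covariance below are hypothetical placeholders, not the isoscapes developed in the study.

```python
# Illustrative sketch of continuous-surface (isoscape) assignment to origin
# using a bivariate normal likelihood over a d13C / d15N grid.
import numpy as np
from scipy.stats import multivariate_normal

ny, nx = 50, 80
d13c_iso = np.linspace(-18, -10, nx)[None, :].repeat(ny, axis=0)   # predicted d13C grid
d15n_iso = np.linspace(4, 12, ny)[:, None].repeat(nx, axis=1)       # predicted d15N grid
cov = np.array([[1.0, 0.2],
                [0.2, 1.5]])             # assumed isoscape + tissue variance

def assignment_surface(d13c_obs, d15n_obs):
    preds = np.dstack([d13c_iso, d15n_iso]).reshape(-1, 2)
    like = multivariate_normal(mean=[d13c_obs, d15n_obs], cov=cov).pdf(preds)
    post = like / like.sum()             # normalise to a probability surface
    return post.reshape(ny, nx)

surface = assignment_surface(-13.2, 7.8)
print("most probable cell:", np.unravel_index(surface.argmax(), surface.shape))
```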
Zanden, Hannah B Vander; Tucker, Anton D; Hart, Kristen M; Lamont, Margaret M; Fujisaki, Ikuko; Addison, David; Mansfield, Katherine L; Phillips, Katrina F; Wunder, Michael B; Bowen, Gabriel J; Pajuelo, Mariela; Bolten, Alan B; Bjorndal, Karen A
2015-03-01
Stable isotope analysis is a useful tool to track animal movements in both terrestrial and marine environments. These intrinsic markers are assimilated through the diet and may exhibit spatial gradients as a result of biogeochemical processes at the base of the food web. In the marine environment, maps to predict the spatial distribution of stable isotopes are limited, and thus determining geographic origin has been reliant upon integrating satellite telemetry and stable isotope data. Migratory sea turtles regularly move between foraging and reproductive areas. Whereas most nesting populations can be easily accessed and regularly monitored, little is known about the demographic trends in foraging populations. The purpose of the present study was to examine migration patterns of loggerhead nesting aggregations in the Gulf of Mexico (GoM), where sea turtles have been historically understudied. Two methods of geographic assignment using stable isotope values in known-origin samples from satellite telemetry were compared: (1) a nominal approach through discriminant analysis and (2) a novel continuous-surface approach using bivariate carbon and nitrogen isoscapes (isotopic landscapes) developed for this study. Tissue samples for stable isotope analysis were obtained from 60 satellite-tracked individuals at five nesting beaches within the GoM. Both methodological approaches for assignment resulted in high accuracy of foraging area determination, though each has advantages and disadvantages. The nominal approach is more appropriate when defined boundaries are necessary, but up to 42% of the individuals could not be considered in this approach. All individuals can be included in the continuous-surface approach, and individual results can be aggregated to identify geographic hotspots of foraging area use, though the accuracy rate was lower than nominal assignment. The methodological validation provides a foundation for future sea turtle studies in the region to inexpensively determine geographic origin for large numbers of untracked individuals. Regular monitoring of sea turtle nesting aggregations with stable isotope sampling can be used to fill critical data gaps regarding habitat use and migration patterns. Probabilistic assignment to origin with isoscapes has not been previously used in the marine environment, but the methods presented here could also be applied to other migratory marine species.
Song, Chuan-xia; Chen, Hong-mei; Dai, Yu; Kang, Min; Hu, Jia; Deng, Yun
2014-11-01
To optimize the cellulase-catalyzed hydrolysis of icariin to baohuoside I using a Plackett-Burman design combined with central composite design (CCD) response surface methodology. The main influencing factors were screened with a Plackett-Burman design, and CCD response surface methodology was then used to optimize the hydrolysis process. Taking substrate concentration, buffer pH and reaction time as independent variables, and the conversion rate of icariin as the dependent variable, a full quadratic response surface was fitted between the independent and dependent variables; the optimum hydrolysis conditions were identified from the 3D response surface plots and confirmed by verification tests and predictive analysis. The best enzymatic hydrolysis conditions were: substrate concentration 8.23 mg/mL, buffer pH 5.12, and reaction time 35.34 h. The optimum process for the cellulase-catalyzed hydrolysis of icariin to baohuoside I was thus determined by Plackett-Burman design combined with CCD response surface methodology. The optimized enzymatic hydrolysis process is simple, convenient, accurate, reproducible and predictable.
Elliptic surface grid generation in three-dimensional space
NASA Technical Reports Server (NTRS)
Kania, Lee
1992-01-01
A methodology for surface grid generation in three dimensional space is described. The method solves a Poisson equation for each coordinate on arbitrary surfaces using successive line over-relaxation. The complete surface curvature terms were discretized and retained within the nonhomogeneous term in order to preserve surface definition; there is no need for conventional surface splines. Control functions were formulated to permit control of grid orthogonality and spacing. A method for interpolation of control functions into the domain was devised which permits their specification not only at the surface boundaries but within the interior as well. An interactive surface generation code which makes use of this methodology is currently under development.
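A simplified planar analogue of the elliptic grid generation described above is sketched below: the Winslow/TTM equations with zero control functions, relaxed by point SOR with fixed boundaries. This is an assumption-laden reduction for illustration; the full method additionally retains the surface curvature terms and control functions in three-dimensional space.

```python
# Simplified planar sketch (not the full surface method): Winslow / TTM elliptic
# grid equations with zero control functions, relaxed by point SOR. Boundary
# points are held fixed; interior (x, y) coordinates are smoothed iteratively.
import numpy as np

ni, nj, omega = 21, 11, 1.5
xi, eta = np.meshgrid(np.linspace(0, 1, ni), np.linspace(0, 1, nj), indexing="ij")
x = xi + 0.2 * eta ** 2                 # algebraic initial grid (deliberately skewed)
y = eta + 0.1 * np.sin(np.pi * xi)

for _ in range(200):
    for i in range(1, ni - 1):
        for j in range(1, nj - 1):
            x_xi, y_xi = (x[i+1, j] - x[i-1, j]) / 2, (y[i+1, j] - y[i-1, j]) / 2
            x_eta, y_eta = (x[i, j+1] - x[i, j-1]) / 2, (y[i, j+1] - y[i, j-1]) / 2
            a = x_eta ** 2 + y_eta ** 2              # alpha
            b = x_xi * x_eta + y_xi * y_eta          # beta
            g = x_xi ** 2 + y_xi ** 2                # gamma
            for f in (x, y):
                cross = f[i+1, j+1] - f[i+1, j-1] - f[i-1, j+1] + f[i-1, j-1]
                new = (a * (f[i+1, j] + f[i-1, j]) + g * (f[i, j+1] + f[i, j-1])
                       - 0.5 * b * cross) / (2 * (a + g))
                f[i, j] += omega * (new - f[i, j])

print("relaxed interior point:", x[10, 5], y[10, 5])
```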
Roy, Uttariya; Sengupta, Shubhalakshmi; Banerjee, Priya; Das, Papita; Bhowal, Avijit; Datta, Siddhartha
2018-06-18
This study focuses on the removal of a textile dye (Reactive Yellow) by a combined approach of sorption integrated with biodegradation, using the low-cost adsorbent fly ash immobilized with Pseudomonas sp. To confirm immobilization of the bacterial species on treated fly ash, fly ash with immobilized bacterial cells was characterized using Fourier transform infrared (FTIR) spectroscopy, scanning electron microscopy (SEM), and fluorescence microscopy. Comparative batch studies were carried out using Pseudomonas sp, fly ash, and Pseudomonas sp immobilized on fly ash, and it was observed that Pseudomonas sp immobilized on fly ash acted as the better decolourizing agent. The optimized pH, temperature, and immobilized adsorbent dosage for the highest percentage of dye removal were observed to be pH 6, 303 K and 1.2 g/L in all cases. At the optimum conditions, the highest percentage of dye removal was found to be 88.51%, 92.62% and 98.72% for sorption (fly ash), biodegradation (Pseudomonas sp) and the integrated approach (Pseudomonas sp on fly ash), respectively. Optimization of the operating parameters of textile dye decolourization was done by response surface methodology (RSM) using Design Expert 7 software. Phytotoxicity evaluation with Cicer arietinum revealed that seeds exposed to untreated dye effluents showed considerably lower growth and inhibited biochemical and enzyme parameters compared with those exposed to treated textile effluents. Thus, this inexpensive immobilization technique could be used for removal of synthetic dyes present in textile wastewater. Copyright © 2018 Elsevier Ltd. All rights reserved.
Geomorphometric analysis of cave ceiling channels mapped with 3-D terrestrial laser scanning
NASA Astrophysics Data System (ADS)
Gallay, Michal; Hochmuth, Zdenko; Kaňuk, Ján; Hofierka, Jaroslav
2016-05-01
The change of hydrological conditions during the evolution of caves in carbonate rocks often results in a complex subterranean geomorphology, which comprises specific landforms such as ceiling channels, anastomosing half tubes, or speleothems organized vertically in different levels. Studying such complex environments traditionally requires tedious mapping; however, this is being replaced with terrestrial laser scanning technology. Laser scanning overcomes the problem of reaching high ceilings, providing new options to map underground landscapes with unprecedented level of detail and accuracy. The acquired point cloud can be handled conveniently with dedicated software, but applying traditional geomorphometry to analyse the cave surface is limited. This is because geomorphometry has been focused on parameterization and analysis of surficial terrain. The theoretical and methodological concept has been based on two-dimensional (2-D) scalar fields, which are sufficient for most cases of the surficial terrain. The terrain surface is modelled with a bivariate function of altitude (elevation) and represented by a raster digital elevation model. However, the cave is a 3-D entity; therefore, a different approach is required for geomorphometric analysis. In this paper, we demonstrate the benefits of high-resolution cave mapping and 3-D modelling to better understand the palaeohydrography of the Domica cave in Slovakia. This methodological approach adopted traditional geomorphometric methods in a unique manner and also new methods used in 3-D computer graphics, which can be applied to study other 3-D geomorphological forms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nardes, Alexandre M.; Ahn, Sungmo; Rourke, Devin
2016-12-01
We introduce a simple methodology to integrate prefabricated nanostructured-electrodes in solution-processed organic photovoltaic (OPV) devices. The tailored 'photonic electrode' nanostructure is used for light management in the device and for hole collection. This approach opens up new possibilities for designing photonically active structures that can enhance the absorption of sub-bandgap photons in the active layer. We discuss the design, fabrication and characterization of photonic electrodes, and the methodology for integrating them to OPV devices using a simple lamination technique. We demonstrate theoretically and experimentally that OPV devices using photonic electrodes show a factor of ca. 5 enhancement in external quantum efficiency (EQE) in the near infrared region. We use simulations to trace this observed efficiency enhancement to surface plasmon polariton modes in the nanostructure.
Terlier, T; Lee, J; Lee, K; Lee, Y
2018-02-06
Technological progress has spurred the development of increasingly sophisticated analytical devices. The full characterization of structures in terms of sample volume and composition is now highly complex. Here, a highly improved solution for 3D characterization of samples, based on an advanced method for 3D data correction, is proposed. Traditionally, secondary ion mass spectrometry (SIMS) provides the chemical distribution of sample surfaces. Combining successive sputtering with 2D surface projections enables a 3D volume rendering to be generated. However, surface topography can distort the volume rendering by necessitating the projection of a nonflat surface onto a planar image. Moreover, the sputtering is highly dependent on the probed material. Local variation of composition affects the sputter yield and the beam-induced roughness, which in turn alters the 3D render. To circumvent these drawbacks, the correlation of atomic force microscopy (AFM) with SIMS has been proposed in previous studies as a solution for the 3D chemical characterization. To extend the applicability of this approach, we have developed a methodology using AFM-time-of-flight (ToF)-SIMS combined with an empirical sputter model, "dynamic-model-based volume correction", to universally correct 3D structures. First, the simulation of 3D structures highlighted the great advantages of this new approach compared with classical methods. Then, we explored the applicability of this new correction to two types of samples, a patterned metallic multilayer and a diblock copolymer film presenting surface asperities. In both cases, the dynamic-model-based volume correction produced an accurate 3D reconstruction of the sample volume and composition. The combination of AFM-SIMS with the dynamic-model-based volume correction improves the understanding of the surface characteristics. Beyond the useful 3D chemical information provided by dynamic-model-based volume correction, the approach permits us to enhance the correlation of chemical information from spectroscopic techniques with the physical properties obtained by AFM.
NASA Astrophysics Data System (ADS)
Rojo, Pilar; Royo, Santiago; Caum, Jesus; Ramírez, Jorge; Madariaga, Ines
2015-02-01
Peripheral refraction, the refractive error present outside the main direction of gaze, has lately attracted interest due to its alleged relationship with the progression of myopia. The ray tracing procedures involved in its calculation need to follow an approach different from those used in conventional ophthalmic lens design, where refractive errors are compensated only in the main direction of gaze. We present a methodology for the evaluation of the peripheral refractive error in ophthalmic lenses, adapting the conventional generalized ray tracing approach to the requirements of the evaluation of peripheral refraction. The nodal point of the eye and a retinal conjugate surface will be used to evaluate the three-dimensional distribution of refractive error around the fovea. The proposed approach enables us to calculate the three-dimensional peripheral refraction induced by any ophthalmic lens at any direction of gaze and to personalize the lens design to the requirements of the user. The complete evaluation process for a given user prescribed with a -5.76D ophthalmic lens for foveal vision is detailed, and comparative results are presented when the geometry of the lens is modified and when the central refractive error is over- or undercorrected. The methodology is also applied to an emmetropic eye to show its application for refractive errors other than myopia.
Modeling solvation effects in real-space and real-time within density functional approaches
NASA Astrophysics Data System (ADS)
Delgado, Alain; Corni, Stefano; Pittalis, Stefano; Rozzi, Carlo Andrea
2015-10-01
The Polarizable Continuum Model (PCM) can be used in conjunction with Density Functional Theory (DFT) and its time-dependent extension (TDDFT) to simulate the electronic and optical properties of molecules and nanoparticles immersed in a dielectric environment, typically liquid solvents. In this contribution, we develop a methodology to account for solvation effects in real-space (and real-time) (TD)DFT calculations. The boundary elements method is used to calculate the solvent reaction potential in terms of the apparent charges that spread over the van der Waals solute surface. In a real-space representation, this potential may exhibit a Coulomb singularity at grid points that are close to the cavity surface. We propose a simple approach to regularize such singularity by using a set of spherical Gaussian functions to distribute the apparent charges. We have implemented the proposed method in the Octopus code and present results for the solvation free energies and solvatochromic shifts for a representative set of organic molecules in water.
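The singularity regularisation described above can be sketched by the potential of a point apparent charge smeared into a spherical Gaussian, which stays finite on the real-space grid; the widths and charges used below are illustrative, not values from the implementation.

```python
# Sketch of the regularised potential of a Gaussian-distributed charge q of
# width sigma: V(r) = q * erf(r / (sqrt(2) * sigma)) / r (atomic units), which
# tends to q * sqrt(2/pi) / sigma at r = 0 instead of diverging.
import numpy as np
from scipy.special import erf

def gaussian_charge_potential(r, q=1.0, sigma=0.5):
    r = np.asarray(r, dtype=float)
    v = np.empty_like(r)
    small = r < 1e-12
    v[~small] = q * erf(r[~small] / (np.sqrt(2) * sigma)) / r[~small]
    v[small] = q * np.sqrt(2.0 / np.pi) / sigma      # analytic r -> 0 limit
    return v

r = np.array([0.0, 0.1, 0.5, 1.0, 3.0])
print(np.round(gaussian_charge_potential(r), 4))
```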
Modeling solvation effects in real-space and real-time within density functional approaches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Delgado, Alain; Centro de Aplicaciones Tecnológicas y Desarrollo Nuclear, Calle 30 # 502, 11300 La Habana; Corni, Stefano
2015-10-14
The Polarizable Continuum Model (PCM) can be used in conjunction with Density Functional Theory (DFT) and its time-dependent extension (TDDFT) to simulate the electronic and optical properties of molecules and nanoparticles immersed in a dielectric environment, typically liquid solvents. In this contribution, we develop a methodology to account for solvation effects in real-space (and real-time) (TD)DFT calculations. The boundary elements method is used to calculate the solvent reaction potential in terms of the apparent charges that spread over the van der Waals solute surface. In a real-space representation, this potential may exhibit a Coulomb singularity at grid points that are close to the cavity surface. We propose a simple approach to regularize such singularity by using a set of spherical Gaussian functions to distribute the apparent charges. We have implemented the proposed method in the OCTOPUS code and present results for the solvation free energies and solvatochromic shifts for a representative set of organic molecules in water.
Reliability Modeling Methodology for Independent Approaches on Parallel Runways Safety Analysis
NASA Technical Reports Server (NTRS)
Babcock, P.; Schor, A.; Rosch, G.
1998-01-01
This document is an adjunct to the final report An Integrated Safety Analysis Methodology for Emerging Air Transport Technologies. That report presents the results of our analysis of the problem of simultaneous but independent approaches of two aircraft on parallel runways (independent approaches on parallel runways, or IAPR). This introductory chapter presents a brief overview and perspective of approaches and methodologies for performing safety analyses for complex systems. Ensuing chapters provide the technical details that underlie the approach that we have taken in performing the safety analysis for the IAPR concept.
NASA Astrophysics Data System (ADS)
Yao, H.; Fang, H.; Li, C.; Liu, Y.; Zhang, H.; van der Hilst, R. D.; Huang, Y. C.
2014-12-01
Ambient noise tomography has provided essential constraints on crustal and uppermost mantle shear velocity structure in global seismology. Recent studies demonstrate that high frequency (e.g., ~ 1 Hz) surface waves between receivers at short distances can be successfully retrieved from ambient noise cross-correlation and then be used for imaging near surface or shallow crustal shear velocity structures. This approach provides important information for strong ground motion prediction in seismically active areas and overburden structure characterization in oil and gas fields. Here we propose a new tomographic method to invert all surface wave dispersion data for 3-D variations of shear wavespeed without the intermediate step of phase or group velocity maps. The method uses frequency-dependent propagation paths and a wavelet-based sparsity-constrained tomographic inversion. A fast marching method is used to compute, at each period, surface wave traveltimes and ray paths between sources and receivers. This avoids the assumption of great-circle propagation that is used in most surface wave tomographic studies, but which is not appropriate in complex media. The wavelet coefficients of the velocity model are estimated with an iteratively reweighted least squares (IRLS) algorithm, and upon iterations the surface wave ray paths and the data sensitivity matrix are updated from the newly obtained velocity model. We apply this new method to determine the 3-D near surface wavespeed variations in the Taipei basin of Taiwan, the Hefei urban area and a shale and gas production field in China using the high-frequency interstation Rayleigh wave dispersion data extracted from ambient noise cross-correlation. The results reveal strong effects of off-great-circle propagation of high-frequency surface waves in these regions, where shear wavespeed variations exceed 30%. The proposed approach is more efficient and robust than the traditional two-step surface wave tomography for imaging complex structures. In the future, approximate 3-D sensitivity kernels for dispersion data will be incorporated to account for the finite-frequency effect of surface wave propagation. In addition, our approach provides a consistent framework for joint inversion of surface wave dispersion and body wave traveltime data for 3-D Vp and Vs structures.
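The sparsity-constrained IRLS step mentioned above can be sketched generically: the model (here standing in for the wavelet coefficients of the velocity model) is re-solved repeatedly with a weight matrix built from the current coefficient magnitudes, which approximates an L1 penalty. The sensitivity matrix and data below are synthetic, not the tomographic system of the study.

```python
# Generic sketch of iteratively reweighted least squares (IRLS) for a
# sparsity-constrained linear inversion: solve (G^T G + lam * W) m = G^T d with
# W = diag(1 / (|m_i| + eps)) recomputed each iteration.
import numpy as np

rng = np.random.default_rng(3)
G = rng.normal(size=(80, 120))                   # sensitivity matrix (synthetic)
m_true = np.zeros(120)
m_true[[5, 40, 77]] = [1.0, -0.7, 0.5]            # sparse "wavelet" model
d = G @ m_true + rng.normal(0, 0.01, 80)

lam, eps = 0.1, 1e-4
m = np.linalg.lstsq(G, d, rcond=None)[0]          # unweighted starting model
for _ in range(20):
    W = np.diag(1.0 / (np.abs(m) + eps))
    m = np.linalg.solve(G.T @ G + lam * W, G.T @ d)

print("largest recovered coefficients at indices:", np.sort(np.argsort(np.abs(m))[-3:]))
```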
Reviewing the methodology of an integrative review.
Hopia, Hanna; Latvala, Eila; Liimatainen, Leena
2016-12-01
Whittemore and Knafl's updated description of methodological approach for integrative review was published in 2005. Since then, the five stages of the approach have been regularly used as a basic conceptual structure of the integrative reviews conducted by nursing researchers. However, this methodological approach is seldom examined from the perspective of how systematically and rigorously the stages are implemented in the published integrative reviews. To appraise the selected integrative reviews on the basis of the methodological approach according to the five stages published by Whittemore and Knafl in 2005. A literature review was used in this study. CINAHL (Cumulative Index to Nursing and Allied Health), PubMed, OVID (Journals@Ovid) and the Cochrane Library databases were searched for integrative reviews published between 2002 and 2014. Papers were included if they used the methodological approach described by Whittemore and Knafl, were published in English and were focused on nursing education or nursing expertise. A total of 259 integrative review publications for potential inclusion were identified. Ten integrative reviews fulfilled the inclusion criteria. Findings from the studies were extracted and critically examined according to the five methodological stages. The reviews assessed followed the guidelines of the stated methodology approach to different extents. The stages of literature search, data evaluation and data analysis were fairly poorly formulated and only partially implemented in the studies included in the sample. The other two stages, problem identification and presentation, followed those described in the methodological approach quite well. Increasing use of research in clinical practice is inevitable, and therefore, integrative reviews can play a greater role in developing evidence-based nursing practices. Because of this, nurse researchers should pay more attention to sound integrative nursing research to systematise the review process and make it more rigorous. © 2016 Nordic College of Caring Science.
NASA Astrophysics Data System (ADS)
Casu, F.; de Luca, C.; Lanari, R.; Manunta, M.; Zinno, I.
2016-12-01
A methodology for computing surface deformation time series and mean velocity maps of large areas is presented. Our approach relies on the availability of a multi-temporal set of Synthetic Aperture Radar (SAR) data collected from ascending and descending orbits over an area of interest, and also permits estimation of the vertical and horizontal (East-West) displacement components of the Earth's surface. The adopted methodology is based on an advanced Cloud Computing implementation of the Differential SAR Interferometry (DInSAR) Parallel Small Baseline Subset (P-SBAS) processing chain which allows the unsupervised processing of large SAR data volumes, from the raw data (level-0) imagery up to the generation of DInSAR time series and maps. The presented solution, which is highly scalable, has been tested on the ascending and descending ENVISAT SAR archives, which have been acquired over a large area of Southern California (US) that extends for about 90,000 km2. Such an input dataset has been processed in parallel by exploiting 280 computing nodes of the Amazon Web Services Cloud environment. Moreover, to produce the final mean deformation velocity maps of the vertical and East-West displacement components of the whole investigated area, we also took advantage of the information available from external GPS measurements, which makes it possible to account for regional trends not easily detectable by DInSAR and to refer the P-SBAS measurements to an external geodetic datum. The presented results clearly demonstrate the effectiveness of the proposed approach, which paves the way to the extensive use of the available ERS and ENVISAT SAR data archives. Furthermore, the proposed methodology can be particularly suitable for dealing with the very large data flow provided by the Sentinel-1 constellation, thus making it possible to extend DInSAR analyses to a nearly global scale. This work is partially supported by: the DPC-CNR agreement, the EPOS-IP project and the ESA GEP project.
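The combination of ascending and descending measurements into vertical and East-West components, as described above, reduces per pixel to a small linear system; a sketch follows, where the line-of-sight projection coefficients (roughly an ENVISAT-like 23 degree incidence, north component neglected) are illustrative assumptions rather than the processed geometry.

```python
# Sketch of per-pixel decomposition of ascending and descending line-of-sight
# (LOS) velocities into East-West and vertical components, neglecting the north
# component. Projection coefficients and input values are illustrative.
import numpy as np

# Each row: [east coefficient, up coefficient] of the LOS unit vector
proj = np.array([[-0.39, 0.92],      # ascending, right-looking (illustrative)
                 [ 0.39, 0.92]])     # descending, right-looking (illustrative)

def decompose(los_asc, los_desc):
    """Solve proj @ [d_east, d_up] = [los_asc, los_desc] for one pixel."""
    d_east, d_up = np.linalg.solve(proj, np.array([los_asc, los_desc]))
    return d_east, d_up

print(decompose(los_asc=-4.0, los_desc=2.0))     # mm/yr, hypothetical values
```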
Fast Object Motion Estimation Based on Dynamic Stixels.
Morales, Néstor; Morell, Antonio; Toledo, Jonay; Acosta, Leopoldo
2016-07-28
The stixel world is a simplification of the world in which obstacles are represented as vertical instances, called stixels, standing on a surface assumed to be planar. In this paper, previous approaches for stixel tracking are extended using a two-level scheme. In the first level, stixels are tracked by matching them between frames using a bipartite graph in which edges represent a matching cost function. Then, stixels are clustered into sets representing objects in the environment. These objects are matched based on the number of stixels paired inside them. Furthermore, a faster, but less accurate approach is proposed in which only the second level is used. Several configurations of our method are compared to an existing state-of-the-art approach to show how our methodology outperforms it in several areas, including an improvement in the quality of the depth reconstruction.
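The first-level matching described above can be sketched with a cost matrix between stixels of consecutive frames solved by the Hungarian algorithm; the cost function used here (weighted column, depth, and height differences) is an illustrative assumption, not the paper's exact matching cost.

```python
# Sketch of bipartite stixel matching between two frames using the Hungarian
# algorithm on an illustrative cost matrix.
import numpy as np
from scipy.optimize import linear_sum_assignment

# Each stixel: [image column, depth (m), height (m)] - hypothetical values
prev = np.array([[100, 12.0, 1.6], [180, 25.0, 0.9], [300, 8.0, 1.8]], float)
curr = np.array([[105, 11.5, 1.6], [295, 8.2, 1.7], [182, 24.0, 1.0]], float)

weights = np.array([0.05, 1.0, 2.0])             # relative weighting of the terms
cost = np.abs(prev[:, None, :] - curr[None, :, :]) @ weights
rows, cols = linear_sum_assignment(cost)
for r, c in zip(rows, cols):
    print(f"stixel {r} in previous frame -> stixel {c} in current frame")
```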
Panel acoustic contribution analysis.
Wu, Sean F; Natarajan, Logesh Kumar
2013-02-01
Formulations are derived to analyze the relative panel acoustic contributions of a vibrating structure. The essence of this analysis is to correlate the acoustic power flow from each panel to the radiated acoustic pressure at any field point. The acoustic power is obtained by integrating the normal component of the surface acoustic intensity, which is the product of the surface acoustic pressure and normal surface velocity reconstructed by using the Helmholtz equation least squares based nearfield acoustical holography, over each panel. The significance of this methodology is that it enables one to analyze and rank relative acoustic contributions of individual panels of a complex vibrating structure to acoustic radiation anywhere in the field based on a single set of the acoustic pressures measured in the near field. Moreover, this approach is valid for both interior and exterior regions. Examples of using this method to analyze and rank the relative acoustic contributions of a scaled vehicle cabin are demonstrated.
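The panel power computation described above reduces, for time-harmonic fields, to integrating the normal surface acoustic intensity 0.5*Re(p*conj(v_n)) over the panel; a minimal patch-sum sketch is given below with hypothetical reconstructed values standing in for the holography output.

```python
# Sketch of the panel acoustic power: integrate the normal surface intensity
# 0.5 * Re(p * conj(v_n)) over the panel area (simple patch-sum quadrature).
import numpy as np

def panel_power(pressure, normal_velocity, patch_area):
    """pressure, normal_velocity: complex arrays over the panel's patches."""
    intensity_n = 0.5 * np.real(pressure * np.conj(normal_velocity))
    return np.sum(intensity_n * patch_area)

rng = np.random.default_rng(4)
p = rng.normal(size=64) + 1j * rng.normal(size=64)                 # Pa, hypothetical
vn = 1e-3 * (rng.normal(size=64) + 1j * rng.normal(size=64))       # m/s, hypothetical
print("panel acoustic power [W]:", panel_power(p, vn, patch_area=0.01))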
A method for the design of unsymmetrical optical systems using freeform surfaces
NASA Astrophysics Data System (ADS)
Reshidko, Dmitry; Sasian, Jose
2017-11-01
Optical systems that do not have axial symmetry can provide useful and unique solutions to certain imaging problems. However, the complexity of the optical design task grows as the degrees of symmetry are reduced and lost: there are more aberration terms to control, and achieving a sharp image over a wide field-of-view at fast optical speeds becomes challenging. Plane-symmetric optical systems represent a large family of practical non-axially symmetric systems that are simple enough to be easily described and thus are well understood. Design methodologies and aberration theory of plane-symmetric optical systems have been discussed in the literature, and various interesting solutions have been reported [1-4]. The technique of confocal systems, little discussed in the literature, is effective for the design of unsymmetrical optics. A confocal unsymmetrical system is constructed in such a way that there is a sharp image, surface after surface, along a given ray (called the optical axis ray, OAR). It is possible to show that such a system can have a reduced number of field aberrations, and that the system will behave closer to an axially symmetric system [5-6]. In this paper, we review a methodology for the design of unsymmetrical optical systems. We utilize an aspherical/freeform surface constructed by superposition of a conic expressed in a coordinate system that is centered on the off-axis surface segment rather than centered on the axis of symmetry, and an XY polynomial. The conic part of the aspherical/freeform surface describes the base shape that is required to achieve stigmatic imaging surface after surface along the OAR. The XY polynomial adds a more refined shape description to the surface sag and provides effective degrees of freedom for higher-order aberration correction. This aspheric/freeform surface profile is able to best model the ideal reflective surface and to allow one to intelligently approach the optical design. Examples of two- and three-mirror unobscured wide field-of-view reflective systems are provided to show how the methods and corresponding aspheric/freeform surface are applied. We also demonstrate how the method can be extended to design a monolithic freeform objective.
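The surface description discussed above (a conic base written in a local frame centred on the off-axis segment, plus an XY polynomial departure) can be sketched as a sag function; the coefficient values below are hypothetical, and for a plane-symmetric system only even powers of x would normally be allowed.

```python
# Sketch of a freeform sag: local conic base shape plus an XY polynomial
# departure, z(x, y) = conic(x, y) + sum A_ij * x**i * y**j. Values are hypothetical.
import numpy as np

def freeform_sag(x, y, c=1 / 200.0, k=-1.2, xy_coeffs=None):
    """Sag z(x, y) in the local frame of the off-axis surface segment."""
    r2 = x ** 2 + y ** 2
    z = c * r2 / (1.0 + np.sqrt(1.0 - (1.0 + k) * c ** 2 * r2))
    for (i, j), a in (xy_coeffs or {}).items():
        z = z + a * x ** i * y ** j
    return z

coeffs = {(0, 2): 1e-4, (2, 1): -2e-6, (0, 3): 5e-7}    # hypothetical XY terms
print(freeform_sag(np.array([0.0, 5.0]), np.array([0.0, 10.0]), xy_coeffs=coeffs))
```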
On the c-Si/SiO2 interface recombination parameters from photo-conductance decay measurements
NASA Astrophysics Data System (ADS)
Bonilla, Ruy S.; Wilshaw, Peter R.
2017-04-01
The recombination of electric charge carriers at semiconductor surfaces continues to be a limiting factor in achieving high performance optoelectronic devices, including solar cells, laser diodes, and photodetectors. The theoretical model and a solution algorithm for surface recombination have been previously reported. However, their successful application to experimental data for a wide range of both minority excess carrier concentrations and dielectric fixed charge densities has not previously been shown. Here, a parametrisation for the semiconductor-dielectric interface charge Q_it is used in a Shockley-Read-Hall extended formalism to describe recombination at the c-Si/SiO2 interface, and estimate the physical parameters relating to the interface trap density D_it, and the electron and hole capture cross-sections σ_n and σ_p. This approach gives an excellent description of the experimental data without the need to invoke a surface damage region in the c-Si/SiO2 system. Band-gap tail states have been observed to limit strongly the effectiveness of field effect passivation. This approach provides a methodology to determine interface recombination parameters in any semiconductor-insulator system using macro scale measuring techniques.
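The single-level Shockley-Read-Hall surface recombination rate underlying the extended formalism referred to above is sketched below; the extended treatment integrates this expression over the D_it(E) distribution, and all numerical values shown are illustrative rather than the fitted parameters of the study.

```python
# Sketch of the single-level SRH surface recombination rate,
# U_s = (n_s*p_s - ni**2) / ((n_s + n1)/S_p0 + (p_s + p1)/S_n0),
# with S_n0 = sigma_n*v_th*N_it and S_p0 = sigma_p*v_th*N_it. Values illustrative.
import numpy as np

def srh_surface_recombination(n_s, p_s, ni=9.65e9, n1=None, p1=None,
                              sigma_n=1e-15, sigma_p=1e-16, N_it=1e10, v_th=1e7):
    """Rate in cm^-2 s^-1; carrier densities in cm^-3; cross-sections in cm^2."""
    n1 = ni if n1 is None else n1            # trap at midgap unless specified
    p1 = ni if p1 is None else p1
    S_n0 = sigma_n * v_th * N_it             # cm/s
    S_p0 = sigma_p * v_th * N_it
    return (n_s * p_s - ni ** 2) / ((n_s + n1) / S_p0 + (p_s + p1) / S_n0)

print(f"U_s = {srh_surface_recombination(n_s=1e15, p_s=1e15):.3e} cm^-2 s^-1")
```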
Montowska, Magdalena; Alexander, Morgan R; Tucker, Gregory A; Barrett, David A
2014-10-21
In this Article, our previously developed ambient LESA-MS methodology is implemented to analyze five types of thermally treated meat species, namely, beef, pork, horse, chicken, and turkey meat, to select and identify heat-stable and species-specific peptide markers. In-solution tryptic digests of cooked meats were deposited onto a polymer surface, followed by LESA-MS analysis and evaluation using multivariate data analysis and tandem electrospray MS. The five types of cooked meat were clearly discriminated using principal component analysis and orthogonal partial least-squares discriminant analysis. 23 heat stable peptide markers unique to species and muscle protein were identified following data-dependent tandem LESA-MS analysis. Surface extraction and direct ambient MS analysis of mixtures of cooked meat species was performed for the first time and enabled detection of 10% (w/w) of pork, horse, and turkey meat and 5% (w/w) of chicken meat in beef, using the developed LESA-MS/MS analysis. The study shows, for the first time, that ambient LESA-MS methodology displays specificity sufficient to be implemented effectively for the analysis of processed and complex peptide digests. The proposed approach is much faster and simpler than other measurement tools for meat speciation; it has potential for application in other areas of meat science or food production.
A validated methodology for the 3D reconstruction of cochlea geometries using human microCT images
NASA Astrophysics Data System (ADS)
Sakellarios, A. I.; Tachos, N. S.; Rigas, G.; Bibas, T.; Ni, G.; Böhnke, F.; Fotiadis, D. I.
2017-05-01
Accurate reconstruction of the inner ear is a prerequisite for the modelling and understanding of inner ear mechanics. In this study, we present a semi-automated methodology for accurate reconstruction of the major inner ear structures (scalae, basilar membrane, stapes and semicircular canals). For this purpose, high resolution microCT images of a human specimen were used. The segmentation methodology is based on an iterative level set algorithm which provides the borders of the structures of interest. An enhanced coupled level set method, which allows simultaneous labeling of multiple structures without any overlapping regions, has been developed for this purpose. The marching cubes algorithm was applied in order to extract the surface from the segmented volume. The reconstructed geometries are then post-processed to improve the basilar membrane geometry so that it realistically represents physiologic dimensions. The final reconstructed model is compared to the available data from the literature. The results show that our generated inner ear structures are in good agreement with the published ones, while our approach is the most realistic in terms of the basilar membrane thickness and width reconstruction.
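A minimal sketch of the marching-cubes step is shown below using scikit-image (in recent versions the function is measure.marching_cubes). The synthetic ellipsoid stands in for one labeled structure from the coupled level-set segmentation, and the voxel spacing is an assumed value, not the study's.

```python
import numpy as np
from skimage import measure

# Synthetic stand-in for one labeled structure from the segmentation
# (here: a filled ellipsoid in a 100^3 voxel block).
zz, yy, xx = np.mgrid[0:100, 0:100, 0:100]
binary = (((xx - 50) / 30.0)**2 + ((yy - 50) / 20.0)**2
          + ((zz - 50) / 15.0)**2 <= 1.0).astype(np.float32)

# Extract a triangulated surface at the 0.5 iso-level; `spacing` carries an
# assumed microCT voxel size (0.02 mm) into the mesh coordinates.
verts, faces, normals, values = measure.marching_cubes(
    binary, level=0.5, spacing=(0.02, 0.02, 0.02))
print(len(verts), "vertices,", len(faces), "triangles")
```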
NASA Astrophysics Data System (ADS)
Campanelli, Alessandra; Bellafiore, Debora; Bensi, Manuel; Bignami, Francesco; Caccamo, Giuseppe; Celussi, Mauro; Del Negro, Paola; Ferrarin, Christian; Marini, Mauro; Paschini, Elio; Zaggia, Luca
2014-05-01
As part of the actions of the flagship project RITMARE (Ricerca ITaliana per il MARE), a daily oceanographic survey was performed on 29th November 2013 in front of the Po River delta (Northern Adriatic Sea). The Po River affects a large part of the Northern Adriatic Sea, with strong implications for the circulation and functionality of the basin. Physical-chemical and biological properties of coastal waters were investigated after a moderate flood that occurred around 25th-27th November. The cruise activities, carried out using a small research boat, were mainly focused on testing a methodological approach to investigate environmental variability after a flood event in the framework of rapid assessment. The effects of the flood on the coastal waters were evaluated in the field using operational forecasts and real-time satellite imagery to assist field measurements and sampling. Surface satellite chlorophyll maps and surface salinity and current maps obtained from a numerical model forced by meteorological forecasts and river data were analyzed to better identify the Po plume dispersion during and after the event and to better locate offshore monitoring stations at sea. Profiles of temperature, salinity, turbidity, fluorescence and Colored Dissolved Organic Matter (CDOM) throughout the water column were collected at 7 stations in front of the Po River delta. Sea surface water samples were also collected for the analysis of nutrients, Dissolved Organic Carbon (DOC) and CDOM (surface and bottom). CDOM regulates the penetration of UV light through the water column and mediates photochemical reactions, playing an important role in many marine biogeochemical processes. Satellite images showed a strong color front separating the higher-chlorophyll coastal water from the more oligotrophic mid-basin and eastern boundary Adriatic waters. In front of the river mouth, the surface layer was characterized by low salinity (14-15), high turbidity (8-11 NTU) and high CDOM (20-22 ppb) values. These parameters showed a strong gradient from the coast to offshore and from the surface to the bottom. The fluorescence values were more variable, since phytoplankton growth does not respond immediately to the load of riverborne materials. The higher fluorescence values (1.8-2 µg l-1) were, in fact, detected offshore and at the bottom. Good correlations between salinity and CDOM (R2=0.84) and between salinity and spectral slope (SCDOM275-295; R2=0.86) were found. These features reveal the role of CDOM as a tracer of freshwater inputs. Chemical analyses of waters affected by the river plume displayed high concentrations of organic carbon (100-160 µmol l-1) and nutrients, confirming this zone as one of the most eutrophic areas of the Mediterranean Sea (Campanelli et al. 2011, Marini et al. 2008). The synergy of actions applied in the test proved useful for analyzing the variability of coastal water characteristics after a river flood. Moreover, a similar methodological approach could reasonably be applied to the rapid assessment of other events (e.g. harmful phytoplankton growth, chemical spills) which can occur in the area or in areas with similar features. The definition of methodologies for rapid assessment of marine processes can be a useful tool for the future integrated management of the coastal zone. References Campanelli, A., F. Grilli, E. Paschini, M. Marini, 2011. The influence of an exceptional Po River flood on the physical and chemical oceanographic properties of the Adriatic Sea. 
Dynam. Atmos. Oceans, 52: 284-297. Marini, M., B.H. Jones, A. Campanelli, F. Grilli & C.M. Lee. 2008. Seasonal variability and Po River plume influence on biochemical properties along western Adriatic coast. J. Geophys. Res., 113: C05S90, doi:10.1029/2007JC004370.
DOT National Transportation Integrated Search
1995-01-01
Prepared ca. 1995. This paper illustrates the use of the simulation-optimization technique of response surface methodology (RSM) in traffic signal optimization of urban networks. It also quantifies the gains of using the common random number (CRN) va...
Magnitude and variability of land evaporation and its components at the global scale
USDA-ARS?s Scientific Manuscript database
A physics-based methodology is applied to estimate global land-surface evaporation from multi-satellite observations. GLEAM (Global Land-surface Evaporation: the Amsterdam Methodology) combines a wide range of remotely sensed observations within a Priestley and Taylor-based framework. Daily actual e...
Rink, Cameron L; Wernke, Matthew M; Powell, Heather M; Tornero, Mark; Gnyawali, Surya C; Schroeder, Ryan M; Kim, Jayne Y; Denune, Jeffrey A; Albury, Alexander W; Gordillo, Gayle M; Colvin, James M; Sen, Chandan K
2017-07-01
Objective: (1) Develop a standardized approach to quantitatively measure residual limb skin health. (2) Report reference residual limb skin health values in people with transtibial and transfemoral amputation. Approach: Residual limb health outcomes in individuals with transtibial (n = 5) and transfemoral (n = 5) amputation were compared to able-limb controls (n = 4) using noninvasive imaging (hyperspectral imaging and laser speckle flowmetry) and probe-based approaches (laser Doppler flowmetry, transcutaneous oxygen, transepidermal water loss, surface electrical capacitance). Results: A standardized methodology that employs noninvasive imaging and probe-based approaches to measure residual limb skin health is described. Compared to able-limb controls, individuals with transtibial and transfemoral amputation have significantly lower transcutaneous oxygen tension, higher transepidermal water loss, and higher surface electrical capacitance in the residual limb. Innovation: Residual limb health, as a critical component of prosthesis rehabilitation for individuals with lower limb amputation, is understudied in part due to a lack of clinical measures. Here, we present a standardized approach to measure residual limb health in people with transtibial and transfemoral amputation. Conclusion: Technology advances in noninvasive imaging and probe-based measures are leveraged to develop a standardized approach to quantitatively measure residual limb health in individuals with lower limb loss. Compared to able-limb controls, resting residual limb physiology in people who have had transfemoral or transtibial amputation is characterized by lower transcutaneous oxygen tension and poorer skin barrier function.
Martins, Rui; Oliveira, Paulo Eduardo; Schmitt, Aurore
2012-06-10
We discuss here the estimation of age at death from two indicators (the pubic symphysis and the sacro-pelvic surface of the ilium) based on four different osteological series from Portugal, Great Britain, South Africa and the USA (European origin). These samples and the scoring system of the two indicators were used by Schmitt et al. (2002), applying the methodology proposed by Lucy et al. (1996). In the present work, the same data were processed using a modification of the empirical method proposed by Lucy et al. (2002). The various probability distributions are estimated from training data by using kernel density procedures and Jackknife methodology. Bayes's theorem is then used to produce the posterior distribution from which point and interval estimates may be made. This statistical approach reduces the bias of the estimates to less than 70% of that obtained with the initial method, and to 52% if the sex of the individual is known, and it produces an age estimate for all individuals, improving age-at-death assessment. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.
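The sketch below illustrates the general flow of such a kernel-density-plus-Bayes estimate: a KDE fitted to reference (age, score) pairs supplies the likelihood, and Bayes' theorem yields a posterior over age for an observed score. The training values, the implicit prior, and the single-indicator setup are invented for illustration and are not the authors' data or exact procedure.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical training data: known ages at death and an ordinal indicator
# score (e.g. a pubic symphysis stage) for each reference skeleton.
ages   = np.array([23, 31, 35, 42, 47, 55, 58, 63, 70, 78], dtype=float)
scores = np.array([ 1,  2,  2,  3,  3,  4,  4,  5,  5,  6], dtype=float)

# Kernel density estimate of the joint (age, score) distribution.
kde = gaussian_kde(np.vstack([ages, scores]))

def posterior_age(observed_score, age_grid=np.arange(18, 90, 1.0)):
    """Posterior over age for one observed score, via Bayes' theorem.
    The implicit prior is the age distribution of the training sample."""
    joint = kde(np.vstack([age_grid, np.full_like(age_grid, observed_score)]))
    post = joint / np.trapz(joint, age_grid)     # normalize to a density
    return age_grid, post

age_grid, post = posterior_age(observed_score=4)
point_estimate = np.trapz(age_grid * post, age_grid)   # posterior mean
```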
NASA Astrophysics Data System (ADS)
Rasera, L. G.; Mariethoz, G.; Lane, S. N.
2017-12-01
Frequent acquisition of high-resolution digital elevation models (HR-DEMs) over large areas is expensive and difficult. Satellite-derived low-resolution digital elevation models (LR-DEMs) provide extensive coverage of Earth's surface but at coarser spatial and temporal resolutions. Although useful for large scale problems, LR-DEMs are not suitable for modeling hydrologic and geomorphic processes at scales smaller than their spatial resolution. In this work, we present a multiple-point geostatistical approach for downscaling a target LR-DEM based on available high-resolution training data and recurrent high-resolution remote sensing images. The method aims at generating several equiprobable HR-DEMs conditioned to a given target LR-DEM by borrowing small scale topographic patterns from an analogue containing data at both coarse and fine scales. An application of the methodology is demonstrated by using an ensemble of simulated HR-DEMs as input to a flow-routing algorithm. The proposed framework enables a probabilistic assessment of the spatial structures generated by natural phenomena operating at scales finer than the available terrain elevation measurements. A case study in the Swiss Alps is provided to illustrate the methodology.
Vega-Garzon, Lina Patricia; Gomez-Miranda, Ingry Natalia; Peñuela, Gustavo A
2018-05-01
Response surface methodology was used to optimize the operating variables of a multi-frequency ultrasound reactor using BP-3 as a model compound. The response variable was the Triclosan degradation percentage after 10 min of sonication. Frequency levels of 574, 856 and 1134 kHz were used. The effects of power density, pulse time (PT), silent time (ST) and the PT/ST ratio were also analyzed. 2² and 2³ experimental designs were used for screening purposes and a central composite design was used for optimization. An optimum value of 79.2% was obtained for a frequency of 574 kHz, a power density of 200 W/L, and a PT/ST ratio of 10. The significant variables were frequency and power density: the first has an optimum value after which degradation decreases, while power density had a strong positive effect over the whole operational range. PT, ST, and the PT/ST ratio were not significant variables, although it was shown that pulsed-mode ultrasound gives better degradation rates than continuous-mode ultrasound, the effect being less significant at higher power levels. Copyright © 2017. Published by Elsevier B.V.
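As a generic illustration of the response-surface step (not the paper's actual design or data), the sketch below fits a second-order model to a two-factor central composite design by least squares and locates the stationary point of the fitted surface; all factor codings and response values are made up.

```python
import numpy as np

# Coded factor levels of a hypothetical two-factor central composite design
# (x1 = frequency, x2 = power density) and measured degradation (%) responses.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
              [-1.414, 0], [1.414, 0], [0, -1.414], [0, 1.414],
              [0, 0], [0, 0], [0, 0]])
y = np.array([55.0, 60.0, 65.0, 72.0, 52.0, 66.0, 58.0, 74.0,
              78.0, 77.5, 78.5])          # illustrative values only

# Second-order model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
x1, x2 = X[:, 0], X[:, 1]
A = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

# Stationary point of the fitted quadratic surface (candidate optimum)
b0, b1, b2, b11, b22, b12 = coeffs
H = np.array([[2 * b11, b12], [b12, 2 * b22]])
x_opt = np.linalg.solve(H, -np.array([b1, b2]))
```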
Flight Control Using Distributed Shape-Change Effector Arrays
NASA Technical Reports Server (NTRS)
Raney, David L.; Montgomery, Raymond C.; Green, Lawrence I.; Park, Michael A.
2000-01-01
Recent discoveries in material science and fluidics have been used to create a variety of novel effector devices that offer great potential to enable new approaches to aerospace vehicle flight control. Examples include small inflatable blisters, shape-memory alloy diaphragms, and piezoelectric patches that may be used to produce distortions or bumps on the surface of an airfoil to generate control moments. Small jets have also been used to produce a virtual shape-change through fluidic means by creating a recirculation bubble on the surface of an airfoil. An advanced aerospace vehicle might use distributed arrays of hundreds of such devices to generate moments for stabilization and maneuver control, either augmenting or replacing conventional ailerons, flaps or rudders. This research demonstrates the design and use of shape-change device arrays for a tailless aircraft in a low-rate maneuvering application. A methodology for assessing the control authority of the device arrays is described, and a suite of arrays is used in a dynamic simulation to illustrate allocation and deployment methodologies. Although the authority of the preliminary shape-change array designs studied in this paper appeared quite low, the simulation results indicate that the effector suite possessed sufficient authority to stabilize and maneuver the vehicle in mild turbulence.
Ojha, Nupur; Das, Nilanjana
2018-02-01
Polyhydroxyalkanoates (PHAs) are a group of biodegradable polymers and attractive substitutes for conventional plastics that help avoid pollution problems. A yeast strain isolated from sugarcane juice, identified as Wickerhamomyces anomalus VIT-NN01, was used for the production of polyhydroxyalkanoates (PHA). Response surface methodology (RSM) with a three-level, six-variable Box-Behnken design (BBD) was employed to optimize the factors; the optimum conditions of pH 8.0, temperature 37°C, and sugarcane molasses (35 g/L) supplemented with the co-substrates palm oil (0.5%) and corn steep liquor (2%) gave the maximum PHA yield (19.50±0.3 g/L) after 96 h of incubation. This was in close agreement with the yield predicted by the RSM model (19.55±0.1 g/L). Characterization of the extracted polymer was done using FTIR, GC-MS, XRD, TGA and AFM analysis. NMR spectroscopic analysis revealed that the biopolymer was poly(3-hydroxybutyrate-co-3-hydroxyvalerate), a copolymer of PHA. This is the first report on the optimization of PHA production using a yeast strain isolated from natural sources. Copyright © 2017 Elsevier B.V. All rights reserved.
Transferring Codified Knowledge: Socio-Technical versus Top-Down Approaches
ERIC Educational Resources Information Center
Guzman, Gustavo; Trivelato, Luiz F.
2008-01-01
Purpose: This paper aims to analyse and evaluate the transfer process of codified knowledge (CK) performed under two different approaches: the "socio-technical" and the "top-down". It is argued that the socio-technical approach supports the transfer of CK better than the top-down approach. Design/methodology/approach: Case study methodology was…
Wilhelm, Jan; Walz, Michael; Stendel, Melanie; Bagrets, Alexei; Evers, Ferdinand
2013-05-14
We present a modification of the standard electron transport methodology based on the (non-equilibrium) Green's function formalism to efficiently simulate STM-images. The novel feature of this method is that it employs an effective embedding technique that allows us to extrapolate properties of metal substrates with adsorbed molecules from quantum-chemical cluster calculations. To illustrate the potential of this approach, we present an application to STM-images of C58-dimers immobilized on Au(111)-surfaces that is motivated by recent experiments.
Ciric, Milica; Moon, Christina D; Leahy, Sinead C; Creevey, Christopher J; Altermann, Eric; Attwood, Graeme T; Rakonjac, Jasna; Gagic, Dragana
2014-05-12
In silico, secretome proteins can be predicted from completely sequenced genomes using various available algorithms that identify membrane-targeting sequences. For metasecretome (collection of surface, secreted and transmembrane proteins from environmental microbial communities) this approach is impractical, considering that the metasecretome open reading frames (ORFs) comprise only 10% to 30% of total metagenome, and are poorly represented in the dataset due to overall low coverage of metagenomic gene pool, even in large-scale projects. By combining secretome-selective phage display and next-generation sequencing, we focused the sequence analysis of complex rumen microbial community on the metasecretome component of the metagenome. This approach achieved high enrichment (29 fold) of secreted fibrolytic enzymes from the plant-adherent microbial community of the bovine rumen. In particular, we identified hundreds of heretofore rare modules belonging to cellulosomes, cell-surface complexes specialised for recognition and degradation of the plant fibre. As a method, metasecretome phage display combined with next-generation sequencing has a power to sample the diversity of low-abundance surface and secreted proteins that would otherwise require exceptionally large metagenomic sequencing projects. As a resource, metasecretome display library backed by the dataset obtained by next-generation sequencing is ready for i) affinity selection by standard phage display methodology and ii) easy purification of displayed proteins as part of the virion for individual functional analysis.
2017-03-06
This work addresses problems in the field of electromagnetic propagation and scattering, with applicability to the design of antenna and radar systems and to energy absorption and scattering by rough surfaces. It has led to significant new methodologies, including the introduction of a certain Windowed Green Function
Fiber Optic Wing Shape Sensing on NASA's Ikhana UAV
NASA Technical Reports Server (NTRS)
Richards, Lance; Parker, Allen R.; Ko, William L.; Piazza, Anthony
2008-01-01
This document discusses the development of fiber optic wing shape sensing on NASA's Ikhana vehicle. The Dryden Flight Research Center's Aerostructures Branch initiated fiber-optic instrumentation development efforts in the mid-1990s. Motivated by a failure to control wing dihedral that resulted in a mishap with the Helios aircraft, new wing displacement techniques were developed. Research objectives for Ikhana included validating fiber optic sensor measurements and real-time wing shape sensing predictions; validating fiber optic mathematical models and design tools; assessing technical viability and, if applicable, developing methodology and approaches to incorporate wing shape measurements within the vehicle flight control system; and developing and flight validating approaches to perform active wing shape control using conventional control surfaces and active material concepts.
Multi-Response Optimization of Laser Micro Marking Process: A Grey-Fuzzy Approach
NASA Astrophysics Data System (ADS)
Shivakoti, I.; Das, P. P.; Kibria, G.; Pradhan, B. B.; Mustafa, Z.; Ghadai, R. K.
2017-07-01
The selection of an optimal parametric combination for efficient machining has always been a challenging issue for manufacturing researchers. The optimal parametric combination provides better machining, which improves productivity and product quality and subsequently reduces production cost and time. This paper presents a hybrid approach of grey relational analysis and fuzzy logic to obtain the optimal parametric combination for laser beam micro marking on Gallium Nitride (GaN) work material. Response surface methodology was implemented for the design of experiments, considering three parameters at five levels each. The parameters current, frequency and scanning speed were considered, with mark width, mark depth and mark intensity as the process responses.
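A minimal sketch of the grey relational analysis step is given below: responses are normalized (larger- or smaller-the-better), deviation sequences are formed, grey relational coefficients are computed with a distinguishing coefficient of 0.5, and an equally weighted grade ranks the runs. The example responses are invented, and the fuzzy-logic stage of the hybrid approach is not included.

```python
import numpy as np

def grey_relational_grade(responses, larger_is_better, xi=0.5):
    """Grey relational analysis for multi-response optimization.

    responses        : (n_runs, n_responses) array of measured responses
    larger_is_better : boolean per response (True = maximize)
    xi               : distinguishing coefficient (commonly 0.5)
    Returns the grey relational grade per experimental run.
    """
    responses = np.asarray(responses, dtype=float)
    norm = np.empty_like(responses)
    for j, larger in enumerate(larger_is_better):
        col = responses[:, j]
        if larger:   # larger-the-better normalization
            norm[:, j] = (col - col.min()) / (col.max() - col.min())
        else:        # smaller-the-better normalization
            norm[:, j] = (col.max() - col) / (col.max() - col.min())
    delta = np.abs(1.0 - norm)                         # deviation sequences
    grc = (delta.min() + xi * delta.max()) / (delta + xi * delta.max())
    return grc.mean(axis=1)                            # equal response weights

# Hypothetical mark width, depth and intensity for four runs
grades = grey_relational_grade(
    [[40, 12, 0.8], [35, 15, 0.9], [45, 10, 0.7], [38, 14, 0.85]],
    larger_is_better=[False, True, True])
best_run = int(np.argmax(grades))
```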
Accounting for Laminar Run & Trip Drag in Supersonic Cruise Performance Testing
NASA Technical Reports Server (NTRS)
Goodsell, Aga M.; Kennelly, Robert A.
1999-01-01
An improved laminar run and trip drag correction methodology for supersonic cruise performance testing was derived. This method required more careful analysis of the flow visualization images which revealed delayed transition particularly on the inboard upper surface, even for the largest trip disks. In addition, a new code was developed to estimate the laminar run correction. Once the data were corrected for laminar run, the correct approach to the analysis of the trip drag became evident. Although the data originally appeared confusing, the corrected data are consistent with previous results. Furthermore, the modified approach, which was described in this presentation, extends prior historical work by taking into account the delayed transition caused by the blunt leading edges.
USDA-ARS?s Scientific Manuscript database
Streptococcus thermophilus normally exhibits different survival rates in different bacteria medium during freeze-drying. In this study, response surface methodology (RSM) was applied on the design of experiments for optimizing the cryoprotective medium. Results showed that the most significant facto...
Guo, F; Zheng, H; Cheng, Y; Song, S; Zheng, Z; Jia, S
2018-02-01
Poly-ε-L-lysine is a natural homo-polyamide of L-lysine with excellent antimicrobial properties, which can be used as a novel preservative and has a wide range of applications. In this paper, the fermentation medium for ε-PL production by Streptomyces diastatochromogenes 6#-7 was optimized by response surface methodology. The results of the Plackett-Burman design showed that glucose, yeast extract and (NH4)2SO4 were the major influencing factors in ε-PL production by S. diastatochromogenes 6#-7. The optimal concentrations of glucose, yeast extract and (NH4)2SO4 were determined to be 60, 7.5 and 7.5 g l⁻¹, respectively, according to the Box-Behnken experiment and regression analysis. Under the optimized conditions, the ε-PL yield in shake-flask fermentation was 0.948 ± 0.030 g l⁻¹, which was in good agreement with the predicted value of 0.970 g l⁻¹. The yield was improved by 43.1% from that with the initial medium. In a 5-l jar fermenter the ε-PL yield reached 25.5 g l⁻¹, an increase of 56.4% over the original medium. In addition, the fermentation time was reduced from 174 to 120 h. Medium optimization is a very practical and valuable tool for the fermentation industry to improve product yield and minimize by-products as well as reduce overall manufacturing costs. Response surface methodology is not new, but it is still a very effective method in medium optimization research. This study used ε-polylysine fermentation as an example to demonstrate how product yield can be significantly increased by medium optimization through response surface methodology. A similar approach can be used in other microbial fermentations such as in the pharmaceutical, food, agricultural and energy industries. As an example, ε-polylysine is one of a few newly approved natural food-grade antimicrobials for food and beverage preservation. Yield improvement is economically beneficial not only to ε-polylysine manufacturers but also to their users and consumers due to lower costs and prices. © 2017 The Society for Applied Microbiology.
Mehri, M
2012-12-01
An artificial neural network (ANN) approach was used to develop feed-forward multilayer perceptron models to estimate the nutritional requirements of digestible lysine (dLys), methionine (dMet), and threonine (dThr) in broiler chicks. Sixty data lines representing response of the broiler chicks during 3 to 16 d of age to dietary levels of dLys (0.88-1.32%), dMet (0.42-0.58%), and dThr (0.53-0.87%) were obtained from literature and used to train the networks. The prediction values of ANN were compared with those of response surface methodology to evaluate the fitness of these 2 methods. The models were tested using R(2), mean absolute deviation, mean absolute percentage error, and absolute average deviation. The random search algorithm was used to optimize the developed ANN models to estimate the optimal values of dietary dLys, dMet, and dThr. The ANN models were used to assess the relative importance of each dietary input on the bird performance using sensitivity analysis. The statistical evaluations revealed the higher accuracy of ANN to predict the bird performance compared with response surface methodology models. The optimization results showed that the maximum BW gain may be obtained with dietary levels of 1.11, 0.51, and 0.78% of dLys, dMet, and dThr, respectively. Minimum feed conversion ratio may be achieved with dietary levels of 1.13, 0.54, 0.78% of dLys, dMet, and dThr, respectively. The sensitivity analysis on the models indicated that dietary Lys is the most important variable in the growth performance of the broiler chicks, followed by dietary Thr and Met. The results of this research revealed that the experimental data of a response-surface-methodology design could be successfully used to develop the well-designed ANN for pattern recognition of bird growth and optimization of nutritional requirements. The comparison between the 2 methods also showed that the statistical methods may have little effect on the ideal ratios of dMet and dThr to dLys in broiler chicks using multivariate optimization.
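The sketch below shows the general shape of such an approach, assuming scikit-learn: a small feed-forward MLP is fitted to (dLys, dMet, dThr) → performance data and then queried by a crude random search over the dietary ranges. The data values, network size, and search settings are placeholders, not the authors' models.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical training data: dietary dLys, dMet, dThr (%) -> BW gain (g).
X = np.array([[0.88, 0.42, 0.53], [1.10, 0.50, 0.70],
              [1.32, 0.58, 0.87], [1.20, 0.54, 0.78],
              [1.00, 0.46, 0.62], [1.15, 0.52, 0.75]])
y = np.array([410., 480., 470., 490., 455., 485.])   # illustrative values

# Feed-forward multilayer perceptron with one small hidden layer.
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,),
                                   max_iter=5000, random_state=0))
model.fit(X, y)

# Crude random search over the dietary ranges, in the spirit of the random
# search algorithm mentioned in the abstract.
rng = np.random.default_rng(0)
candidates = rng.uniform([0.88, 0.42, 0.53], [1.32, 0.58, 0.87], size=(5000, 3))
best = candidates[np.argmax(model.predict(candidates))]
```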
NASA Astrophysics Data System (ADS)
Johnson, Erika; Cowen, Edwin
2013-11-01
The effect of increased bed roughness on the free surface turbulence signature of an open channel flow is investigated with the goal of incorporating the findings into a methodology to remotely monitor volumetric flow rates. Half of a wide (B = 2 m) open channel bed is covered with a 3 cm thick layer of loose gravel (D50 = 0.6 cm). Surface PIV (particle image velocimetry) experiments are conducted for a range of flow depths (B/H = 10-30) and Reynolds numbers (ReH = 10,000-60,000). It is well established that bed roughness in wall-bounded flows enhances the vertical velocity fluctuations (e.g. Krogstad et al. 1992). When the vertical velocity fluctuations approach the free surface they are redistributed (e.g. Cowen et al. 1995) to the surface parallel component directions. It is anticipated and confirmed that the interaction of these two phenomena result in enhanced turbulence at the free surface. The effect of the rough bed on the integral length scales and the second order velocity structure functions calculated at the free surface are investigated. These findings have important implications for developing new technologies in stream gaging.
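For reference, a second-order velocity structure function of the kind mentioned above can be computed from a single surface PIV velocity row as sketched below; the synthetic signal and the sample spacing are placeholders.

```python
import numpy as np

def structure_function_2nd(u, dx):
    """Longitudinal second-order velocity structure function
    D2(r) = <[u(x + r) - u(x)]^2> from a 1-D surface velocity record.

    u  : 1-D array of velocity samples along a line (e.g. one PIV row)
    dx : spatial sample spacing (m)
    Returns separations r and D2(r).
    """
    u = np.asarray(u, dtype=float)
    n = u.size
    seps = np.arange(1, n // 2)
    D2 = np.array([np.mean((u[s:] - u[:-s])**2) for s in seps])
    return seps * dx, D2

# Example with synthetic data standing in for a surface PIV velocity row
rng = np.random.default_rng(1)
u = np.cumsum(rng.normal(size=512)) * 1e-3
r, D2 = structure_function_2nd(u, dx=2e-3)
```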
Cathcart, Nicole; Kitaev, Vladimir
2016-09-08
A powerful approach to augment the diversity of well-defined metal nanoparticle (MNP) morphologies, essential for MNP advanced applications, is symmetry breaking combined with seeded growth. Utilizing this approach enabled the formation of bimorphic silver nanoparticles (bi-AgNPs) consisting of two shapes linked by one regrowth point. Bi-AgNPs were formed by using an adsorbing polymer, poly(acrylic acid), PAA, to block the surface of a decahedral AgNP seed and restricting growth of new silver to a single nucleation point. First, we have realized 2-D growth of platelets attached to decahedra producing nanoscale shapes reminiscent of apples, fishes, mushrooms and kites. 1-D bimorphic growth of rods (with chloride) and 3-D bimorphic growth of cubes and bipyramids (with bromide) were achieved by using halides to induce preferential (100) stabilization over (111) of platelets. Furthermore, the universality of the formation of bimorphic nanoparticles was demonstrated by using different seeds. Bi-AgNPs exhibit strong SERS enhancement due to regular cavities at the necks. Overall, the reported approach to symmetry breaking and bimorphic nanoparticle growth offers a powerful methodology for nanoscale shape design.
NASA Astrophysics Data System (ADS)
Izquierdo, Mª Teresa; de Yuso, Alicia Martínez; Valenciano, Raquel; Rubio, Begoña; Pino, Mª Rosa
2013-01-01
The objective of this study was to evaluate the adsorption capacity of toluene and hexane over activated carbons prepared according to an experimental design, considering as variables the activation temperature, the impregnation ratio and the activation time. Response surface methodology was applied to optimize the adsorption capacity of the carbons with respect to the preparation conditions that determine the physicochemical characteristics of the activated carbons. The preparation methodology produced activated carbons with surface areas and micropore volumes as high as 1128 m2/g and 0.52 cm3/g, respectively. Moreover, the activated carbons exhibit mesoporosity, with the percentage of microporosity ranging from 64.6% to 89.1%. The surface chemistry was characterized by TPD, FTIR and acid-base titration; the different techniques gave different values for the surface groups because of the limitations of each technique, but similar trends were obtained for the activated carbons studied. The exhaustive characterization of the activated carbons allows us to state that the measured surface area does not explain the adsorption capacity for either toluene or n-hexane. On the other hand, the surface chemistry does not explain the adsorption results either. A compromise between physical and chemical characteristics can be obtained from the appropriate activation conditions, and the response surface methodology gives the optimal activated carbon to maximize adsorption capacity. A low activation temperature and an intermediate impregnation ratio lead to high toluene and n-hexane adsorption capacities depending on the activation time, which is a determining factor in maximizing toluene adsorption.
Navigating the grounded theory terrain. Part 1.
Hunter, Andrew; Murphy, Kathy; Grealish, Annmarie; Casey, Dympna; Keady, John
2011-01-01
The decision to use grounded theory is not an easy one and this article aims to illustrate and explore the methodological complexity and decision-making process. It explores the decision making of one researcher in the first two years of a grounded theory PhD study looking at the psychosocial training needs of nurses and healthcare assistants working with people with dementia in residential care. It aims to map out three different approaches to grounded theory: classic, Straussian and constructivist. In nursing research, grounded theory is often referred to but it is not always well understood. This confusion is due in part to the history of grounded theory methodology, which is one of development and divergent approaches. Common elements across grounded theory approaches are briefly outlined, along with the key differences of the divergent approaches. Methodological literature pertaining to the three chosen grounded theory approaches is considered and presented to illustrate the options and support the choice made. The process of deciding on classical grounded theory as the version best suited to this research is presented. The methodological and personal factors that directed the decision are outlined. The relative strengths of Straussian and constructivist grounded theories are reviewed. All three grounded theory approaches considered offer the researcher a structured, rigorous methodology, but researchers need to understand their choices and make those choices based on a range of methodological and personal factors. In the second article, the final methodological decision will be outlined and its research application described.
Estepp, Justin R.; Christensen, James C.
2015-01-01
The passive brain-computer interface (pBCI) framework has been shown to be a very promising construct for assessing cognitive and affective state in both individuals and teams. There is a growing body of work that focuses on solving the challenges of transitioning pBCI systems from the research laboratory environment to practical, everyday use. An interesting issue is what impact methodological variability may have on the ability to reliably identify (neuro)physiological patterns that are useful for state assessment. This work aimed at quantifying the effects of methodological variability in a pBCI design for detecting changes in cognitive workload. Specific focus was directed toward the effects of replacing electrodes over dual sessions (thus inducing changes in placement, electromechanical properties, and/or impedance between the electrode and skin surface) on the accuracy of several machine learning approaches in a binary classification problem. In investigating these methodological variables, it was determined that the removal and replacement of the electrode suite between sessions does not impact the accuracy of a number of learning approaches when trained on one session and tested on a second. This finding was confirmed by comparing to a control group for which the electrode suite was not replaced between sessions. This result suggests that sensors (both neurological and peripheral) may be removed and replaced over the course of many interactions with a pBCI system without affecting its performance. Future work on multi-session and multi-day pBCI system use should seek to replicate this (lack of) effect between sessions in other tasks, temporal time courses, and data analytic approaches while also focusing on non-stationarity and variable classification performance due to intrinsic factors. PMID:25805963
NASA Astrophysics Data System (ADS)
Bouillard, Jacques X.; Vignes, Alexis
2014-02-01
In this paper, an inhalation health and explosion safety risk assessment methodology for nanopowders is described. Since toxicological threshold limit values are still unknown for nanosized substances, detailed risk assessment on specific plants may not be carried out. A simple approach based on occupational hazard/exposure bands expressed in mass concentrations is proposed for nanopowders. This approach is consolidated with an iso-surface toxicological scaling method, which has the merit, although incomplete, of providing concentration threshold levels for which new metrological instruments should be developed for proper air monitoring in order to ensure safety. Whenever the processing or use of nanomaterials introduces a risk to the worker, a specific nano pictogram is proposed to inform the worker. Examples of risk assessment of process equipment (i.e., containment valves) processing various nanomaterials are provided. Explosion risks related to very reactive nanomaterials such as aluminum nanopowders can be assessed using this new analysis methodology adapted to nanopowders. It is nevertheless found that, to formalize and extend this approach, it is absolutely necessary to develop new relevant standard apparatuses and to qualify individual and collective safety barriers with respect to health and explosion risks. In spite of these uncertainties, it appears, as shown in the second paper (Part II), that health and explosion risks, evaluated for given MWCNTs and aluminum nanoparticles, remain manageable in their continuous fabrication mode, considering the current individual and collective safety barriers that can be put in place. The authors would, however, underline that particular attention must be paid to non-continuous modes of operation, such as process equipment cleaning steps, which are often under-analyzed and too often forgotten critical steps that need vigilance in order to minimize potential toxic and explosion risks.
A demand-centered, hybrid life-cycle methodology for city-scale greenhouse gas inventories.
Ramaswami, Anu; Hillman, Tim; Janson, Bruce; Reiner, Mark; Thomas, Gregg
2008-09-01
Greenhouse gas (GHG) accounting for individual cities is confounded by spatial scale and boundary effects that impact the allocation of regional material and energy flows. This paper develops a demand-centered, hybrid life-cycle-based methodology for conducting city-scale GHG inventories that incorporates (1) spatial allocation of surface and airline travel across colocated cities in larger metropolitan regions, and, (2) life-cycle assessment (LCA) to quantify the embodied energy of key urban materials--food, water, fuel, and concrete. The hybrid methodology enables cities to separately report the GHG impact associated with direct end-use of energy by cities (consistent with EPA and IPCC methods), as well as the impact of extra-boundary activities such as air travel and production of key urban materials (consistent with Scope 3 protocols recommended by the World Resources Institute). Application of this hybrid methodology to Denver, Colorado, yielded a more holistic GHG inventory that approaches a GHG footprint computation, with consistency of inclusions across spatial scale as well as convergence of city-scale per capita GHG emissions (approximately 25 mt CO2e/person/year) with state and national data. The method is shown to have significant policy impacts, and also demonstrates the utility of benchmarks in understanding energy use in various city sectors.
Revolution in Field Science: Apollo Approach to Inaccessible Surface Exploration
NASA Astrophysics Data System (ADS)
Clark, P. E.
2010-07-01
The extraordinary challenge that mission designers, scientists, and engineers faced in planning the first human expeditions to the surface of another solar system body led to the development of a distinctive and even revolutionary approach to field work. Not only were those involved required to deal effectively with the extreme limitations on the resources available for, and access to, a target as remote as the lunar surface; they were also required to develop a rigorous approach to science activities ranging from geological field work to deploying field instruments. Principal aspects and keys to the success of the field work are discussed here, including the highly integrated, intensive, and lengthy science planning, simulation, and astronaut training; the development of a systematic scheme for description and documentation of geological sites and samples; and a flexible yet disciplined methodology for site documentation and sample collection. The capability for constant communication with a ‘backroom’ of geological experts who made requests and weighed in on surface operations was innovative and very useful in encouraging rapid dissemination of information to the greater community in general. An extensive archive of Apollo-era science activity documents provides evidence of the principal aspects and keys to the success of the field work. The Apollo Surface Journal allows analysis of the astronauts’ performance in terms of capability for traveling on foot, documentation and sampling of field stations, and manual operation of tools and instruments, all as a function of time. The application of these analyses as ‘lessons learned’ for planning the next generation of human or robotic field science activities on the Moon and elsewhere is considered here as well.
NASA Technical Reports Server (NTRS)
Hidalgo, Homero, Jr.
2000-01-01
An innovative methodology for structural target mode selection based on a specific criterion is presented. An effective approach to single out modes which interact with specific locations on a structure has been developed for the X-33 Launch Vehicle Finite Element Model (FEM). The Root-Sum-Square (RSS) displacement method presented here computes the resultant modal displacement for each mode at selected degrees of freedom (DOF) and sorts the results to locate the modes with the highest values. This method was used to determine the modes which most influenced specific locations/points on the X-33 flight vehicle, such as avionics control components, aero-surface control actuators, propellant valve and engine points, for use in flight control stability analysis and in flight POGO stability analysis. Additionally, the modal RSS method allows primary or global target vehicle modes to be identified in an accurate and efficient manner.
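A minimal sketch of the RSS ranking idea, assuming the mode shapes are available as a DOF-by-mode matrix, is given below; the matrix, DOF indices, and mode count are illustrative, not the X-33 model.

```python
import numpy as np

def rank_modes_by_rss(phi, dof_indices):
    """Rank structural modes by root-sum-square modal displacement
    at selected degrees of freedom.

    phi         : (n_dof, n_modes) mode-shape matrix from the FEM
    dof_indices : indices of the DOF at the location of interest
                  (e.g. an actuator or avionics attachment point)
    Returns mode indices sorted from highest to lowest RSS value.
    """
    phi_sel = phi[dof_indices, :]                    # rows for the chosen DOF
    rss = np.sqrt(np.sum(phi_sel**2, axis=0))        # RSS per mode
    order = np.argsort(rss)[::-1]                    # highest influence first
    return order, rss[order]

# Illustrative 6-DOF, 4-mode example with random mode shapes
phi = np.random.default_rng(2).normal(size=(6, 4))
modes, rss_values = rank_modes_by_rss(phi, dof_indices=[0, 1, 2])
```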
A new technique for simulating composite material
NASA Technical Reports Server (NTRS)
Volakis, John L.
1991-01-01
This project dealt with the development of new methodologies and algorithms for the multi-spectrum electromagnetic characterization of large scale nonmetallic airborne vehicles and structures. A robust, low memory, and accurate methodology was developed which is particularly suited for modern machine architectures. This is a hybrid finite element method that combines two well known numerical solution approaches: the finite element method for modeling volumes, and the boundary integral method, which yields exact boundary conditions for terminating the finite element mesh. In addition, a variety of high frequency results were generated (such as diffraction coefficients for impedance surfaces and material layers) and a class of boundary conditions was developed which holds promise for more efficient simulations. During the course of this project, nearly 25 detailed research reports were generated along with an equal number of journal papers. The reports, papers, and journal articles are listed in the appendices along with their abstracts.
Fogolari, Federico; Moroni, Elisabetta; Wojciechowski, Marcin; Baginski, Maciej; Ragona, Laura; Molinari, Henriette
2005-04-01
The pH-driven opening and closure of beta-lactoglobulin EF loop, acting as a lid and closing the internal cavity of the protein, has been studied by molecular dynamics (MD) simulations and free energy calculations based on molecular mechanics/Poisson-Boltzmann (PB) solvent-accessible surface area (MM/PBSA) methodology. The forms above and below the transition pH differ presumably only in the protonation state of residue Glu89. MM/PBSA calculations are able to reproduce qualitatively the thermodynamics of the transition. The analysis of MD simulations using a combination of MM/PBSA methodology and the colony energy approach is able to highlight the driving forces implied in the transition. The analysis suggests that global rearrangements take place before the equilibrium local conformation is reached. This conclusion may bear general relevance to conformational transitions in all lipocalins and proteins in general. (c) 2005 Wiley-Liss, Inc.
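For context, a commonly used generic form of the MM/PBSA free-energy estimate is written out below; this is a textbook-style decomposition and not necessarily the exact expression used by the authors.

```latex
% Generic MM/PBSA estimate of the free energy of a conformational state,
% averaged over MD snapshots (illustrative form only):
\begin{equation}
  G \;\approx\; \langle E_{\mathrm{MM}} \rangle
      + \langle G_{\mathrm{PB}} \rangle
      + \gamma \,\langle \mathrm{SASA} \rangle + b
      - T\, S_{\mathrm{conf}},
\end{equation}
% where E_MM is the molecular-mechanics energy, G_PB the polar solvation
% energy from the Poisson-Boltzmann equation, gamma*SASA + b the nonpolar
% solvation term based on the solvent-accessible surface area, and
% T*S_conf an (optional) configurational entropy estimate.
```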
A framework for assessing the adequacy and effectiveness of software development methodologies
NASA Technical Reports Server (NTRS)
Arthur, James D.; Nance, Richard E.
1990-01-01
Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.
NASA Astrophysics Data System (ADS)
Moreno, H. A.; Ogden, F. L.; Steinke, R. C.; Alvarez, L. V.
2015-12-01
Triangulated Irregular Networks (TINs) are increasingly popular for terrain representation in high performance surface and hydrologic modeling owing to their ability to capture significant changes in surface form such as topographical summits, slope breaks, ridges, valley floors, pits and cols. This work presents a methodology for estimating slope, aspect and the components of the incoming solar radiation using a vectorial approach within a topocentric coordinate system, by establishing geometric relations between groups of TIN elements and the sun position. A normal vector to the surface of each TIN element describes slope and aspect, while spherical trigonometry allows computing a unit vector defining the position of the sun at each hour and day of year (DOY). Thus, a dot product determines the radiation flux at each TIN element. Remote shading is computed by scanning the projection of groups of TIN elements in the direction of the closest perpendicular plane to the sun vector. Sky view fractions are computed by a simplified scanning algorithm in prescribed directions and are useful to determine diffuse radiation. Finally, remote radiation scattering is computed from the sky view factor complementary functions for prescribed albedo values of the surrounding terrain, only for significant angles above the horizon. This methodology represents an improvement on current algorithms for computing terrain and radiation parameters on TINs in an efficient manner. All terrain features (e.g. slope, aspect, sky view factors and remote sheltering) can be pre-computed and stored for easy access in a subsequent ground surface or hydrologic simulation.
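A minimal sketch of the dot-product step is given below: the facet normal (which encodes slope and aspect) is formed from the triangle vertices, the sun direction is built from solar zenith and azimuth in a local topocentric frame, and their dot product scales an assumed direct-beam irradiance. Shading, sky view fractions, and terrain scattering are not included.

```python
import numpy as np

def direct_flux_on_triangle(v0, v1, v2, sun_zenith, sun_azimuth, S0=1000.0):
    """Direct shortwave flux (W m^-2) on one TIN facet via a dot product.

    v0, v1, v2 : (3,) vertex coordinates of the triangle (x east, y north, z up)
    sun_zenith, sun_azimuth : solar angles in radians (azimuth clockwise from north)
    S0 : direct-beam irradiance on a plane normal to the sun (assumed value)
    """
    # Unit normal of the facet (encodes slope and aspect)
    n = np.cross(np.asarray(v1) - np.asarray(v0), np.asarray(v2) - np.asarray(v0))
    n = n / np.linalg.norm(n)
    if n[2] < 0:                      # enforce an upward-pointing normal
        n = -n
    # Unit vector pointing toward the sun in the local topocentric frame
    s = np.array([np.sin(sun_zenith) * np.sin(sun_azimuth),
                  np.sin(sun_zenith) * np.cos(sun_azimuth),
                  np.cos(sun_zenith)])
    return S0 * max(0.0, float(np.dot(n, s)))   # self-shaded facets get zero

flux = direct_flux_on_triangle([0, 0, 100], [30, 0, 105], [0, 30, 110],
                               sun_zenith=np.radians(40),
                               sun_azimuth=np.radians(180))
```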
NASA Astrophysics Data System (ADS)
Leandro, J.; Schumann, A.; Pfister, A.
2016-04-01
Some of the major challenges in modelling rainfall-runoff in urbanised areas are the complex interaction between the sewer system and the overland surface, and the spatial heterogeneity of the urban key features. The former requires the sewer network and the system of surface flow paths to be solved simultaneously. The latter is still an unresolved issue because the heterogeneity of runoff formation requires highly detailed information and includes a large variety of feature-specific rainfall-runoff dynamics. This paper discloses a methodology for considering the variability of building types and the spatial heterogeneity of land surfaces. The former is achieved by developing a specific conceptual rainfall-runoff model and the latter by defining a fully distributed approach for infiltration processes in urban areas with limited storage capacity, based on OpenStreetMap (OSM) data. The model complexity is increased stepwise by adding components to an existing 2D overland flow model. The different steps are defined as modelling levels. The methodology is applied in a German case study. Results highlight that: (a) spatial heterogeneity of urban features has a medium to high impact on the estimated overland flood-depths, (b) the addition of multiple urban features has a higher cumulative effect due to the dynamic effects simulated by the model, (c) connecting the runoff from buildings to the sewer contributes to the non-linear effects observed on the overland flood-depths, and (d) OSM data are useful in identifying ponding areas (for which infiltration plays a decisive role) and permeable natural surface flow paths (which delay the flood propagation).
Toward a 35-years North American Precipitation and Surface Reanalysis
NASA Astrophysics Data System (ADS)
Gasset, N.; Fortin, V.
2017-12-01
In support of the International Watersheds Initiative (IWI) of the International Joint Commission (IJC), a 35-year precipitation and surface reanalysis covering North America at 3-hour and 15-km resolution is currently being developed at the Canadian Meteorological Centre (CMC). A deterministic reforecast / dynamical downscaling approach is followed in which a global reanalysis (ERA-Interim) is used as the initial condition of the Global Environmental Multi-scale model (GEM). Moreover, the latter is coupled with precipitation and surface data assimilation systems, i.e. the Canadian Precipitation Analysis (CaPA) and the Canadian Land Data Assimilation System (CaLDAS). While optimized to be more computationally efficient in the context of a reforecast experiment, all systems used are closely related to model versions and configurations currently run operationally at CMC, meaning they have undergone a strict and thorough validation procedure. As a proof of concept, and in order to identify the optimal set-up before producing the 35-year reanalysis, several configurations of the approach are evaluated for the years 2010-2014 using both the standard CMC validation methodology and more dedicated scores, such as comparison against currently available products (the North American Regional Reanalysis, MERRA-Land and the newly released ERA5 reanalysis). Special attention is dedicated to the evaluation of analysed variables, i.e. precipitation, snow depth, surface/ground temperature and moisture, over the whole domain of interest. Results from these preliminary samples are very encouraging and the optimal set-up has been identified. The coupled approach, i.e. GEM+CaPA/CaLDAS, always shows clear improvements over classical reforecast and dynamical downscaling where surface observations are present. Furthermore, results are in line with or better than those of currently available products and of the reference CMC operational approach that was operated from 2012 to 2016 (GEM 3.3, 10-km resolution). This reanalysis will allow for bias correction of current estimates and forecasts, and help decision makers understand and communicate how much the current forecasted state of the system differs from the recent past.
Methodology or method? A critical review of qualitative case study reports.
Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia
2014-01-01
Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners.
Chitosan based grey wastewater treatment--a statistical design approach.
Thirugnanasambandham, K; Sivakumar, V; Prakash Maran, J; Kandasamy, S
2014-01-01
In the present study, grey wastewater was treated under different operating conditions such as agitation time (1-3 min), pH (2.5-5.5), chitosan dose (0.3-0.6 g/l) and settling time (10-20 min) using response surface methodology (RSM). A four-factor, three-level Box-Behnken response surface design (BBD) was employed to optimize and investigate the effect of the process variables on the responses, namely turbidity, BOD and COD removal. The results were analyzed by Pareto analysis of variance (ANOVA), and second order polynomial models were developed in order to predict the responses. Under the optimum conditions, the experimental removals of turbidity (96%), BOD (91%) and COD (73%) agreed closely with the predicted values. Copyright © 2013 Elsevier Ltd. All rights reserved.
A method for UV-bonding in the fabrication of glass electrophoretic microchips.
Huang, Z; Sanders, J C; Dunsmor, C; Ahmadzadeh, H; Landers, J P
2001-10-01
This paper presents an approach for the development of methodologies amenable to simple and inexpensive microchip fabrication, potentially applicable to dissimilar materials bonding and chip integration. The method involves a UV-curable glue that can be used for glass microchip fabrication bonding at room temperature. This involves nothing more than fabrication of glue "guide channels" into the microchip architecture that upon exposure to the appropriate UV light source, bonds the etched plate and cover plate together. The microchip performance was verified by capillary zone electrophoresis (CZE) of small fluorescent molecules with no microchannel surface modification carried out, as well as with a DNA fragment separation following surface modification. The performance of these UV-bonded electrophoretic microchips indicates that this method may provide an alternative to high temperature bonding.
The origin of life and its methodological challenge.
Wächtershäuser, G
1997-08-21
The problem of the origin of life is discussed from a methodological point of view as an encounter between the teleological thinking of the historian and the mechanistic thinking of the chemist; and as the Kantian task of replacing teleology by mechanism. It is shown how the Popperian situational logic of historic understanding and the Popperian principle of explanatory power of scientific theories, when jointly applied to biochemistry, lead to a methodology of biochemical retrodiction, whereby common precursor functions are constructed for disparate successor functions. This methodology is exemplified by central tenets of the theory of the chemo-autotrophic origin of life: the proposal of a surface metabolism with a two-dimensional order; the basic polarity of life with negatively charged constituents on positively charged mineral surfaces; the surface-metabolic origin of phosphorylated sugar metabolism and nucleic acids; the origin of membrane lipids and of chemi-osmosis on pyrite surfaces; and the principles of the origin of the genetic machinery. The theory presents the early evolution of life as a process that begins with chemical necessity and winds up in genetic chance.
NASA Astrophysics Data System (ADS)
Ramachandran, C. S.; Balasubramanian, V.; Ananthapadmanabhan, P. V.
2011-03-01
Atmospheric plasma spraying is used extensively to make Thermal Barrier Coatings of 7-8% yttria-stabilized zirconia powders. The main problem faced in the manufacture of yttria-stabilized zirconia coatings by the atmospheric plasma spraying process is the selection of the optimum combination of input variables for achieving the required qualities of coating. This problem can be solved by the development of empirical relationships between the process parameters (input power, primary gas flow rate, stand-off distance, powder feed rate, and carrier gas flow rate) and the coating quality characteristics (deposition efficiency, tensile bond strength, lap shear bond strength, porosity, and hardness) through effective and strategic planning and the execution of experiments by response surface methodology. This article highlights the use of response surface methodology by designing a five-factor five-level central composite rotatable design matrix with full replication for planning, conduction, execution, and development of empirical relationships. Further, response surface methodology was used for the selection of optimum process parameters to achieve desired quality of yttria-stabilized zirconia coating deposits.
Perdana, Jimmy; Bereschenko, Ludmila; Roghair, Mark; Fox, Martijn B; Boom, Remko M; Kleerebezem, Michiel; Schutyser, Maarten A I
2012-11-01
Survival of probiotic bacteria during drying is not trivial. Survival percentages are very specific for each probiotic strain and can be improved by careful selection of drying conditions and proper drying carrier formulation. An experimental approach is presented, comprising a single-droplet drying method and a subsequent novel screening methodology, to assess the microbial viability within single particles. The drying method involves the drying of a single droplet deposited on a flat, hydrophobic surface under well-defined drying conditions and carrier formulations. Semidried or dried particles were subjected to rehydration, fluorescence staining, and live/dead enumeration using fluorescence microscopy. The novel screening methodology provided accurate survival percentages in line with conventional plating enumeration and was evaluated in single-droplet drying experiments with Lactobacillus plantarum WCFS1 as a model probiotic strain. Parameters such as bulk air temperatures and the carrier matrices (glucose, trehalose, and maltodextrin DE 6) were varied. Following the experimental approach, the influence on the viability as a function of the drying history could be monitored. Finally, the applicability of the novel viability assessment was demonstrated for samples obtained from drying experiments at a larger scale.
Supervised classification of continental shelf sediment off western Donegal, Ireland
NASA Astrophysics Data System (ADS)
Monteys, X.; Craven, K.; McCarron, S. G.
2017-12-01
Managing human impacts on marine ecosystems requires natural regions to be identified and mapped over a range of hierarchically nested scales. In recent years (2000-present) the Irish National Seabed Survey (INSS) and the Integrated Mapping for the Sustainable Development of Ireland's Marine Resources programme (INFOMAR) (Geological Survey Ireland and Marine Institute collaborations) have provided unprecedented quantities of high-quality data on Ireland's offshore territories. The increasing availability of large, detailed digital representations of these environments requires the application of objective and quantitative analyses. This study presents results of a new approach to seafloor sediment mapping based on an integrated analysis of INFOMAR multibeam bathymetric data (including the derivatives slope and relative position), backscatter data (including derivatives of angular response analysis) and sediment ground-truthing over the continental shelf west of Donegal. It applies a Geographic Object-Based Image Analysis software package to provide a supervised classification of the surface sediment. This approach can provide a statistically robust, high-resolution classification of the seafloor. Initial results display a differentiation of sediment classes and a reduction in artefacts compared with previously applied methodologies. These results indicate a methodology that could be used during physical habitat mapping and classification of marine environments.
Spline approximation, Part 1: Basic methodology
NASA Astrophysics Data System (ADS)
Ezhov, Nikolaj; Neitzel, Frank; Petrovic, Svetozar
2018-04-01
In engineering geodesy point clouds derived from terrestrial laser scanning or from photogrammetric approaches are almost never used as final results. For further processing and analysis a curve or surface approximation with a continuous mathematical function is required. In this paper the approximation of 2D curves by means of splines is treated. Splines offer quite flexible and elegant solutions for interpolation or approximation of "irregularly" distributed data. Depending on the problem they can be expressed as a function or as a set of equations that depend on some parameter. Many different types of splines can be used for spline approximation and all of them have certain advantages and disadvantages depending on the approximation problem. In a series of three articles spline approximation is presented from a geodetic point of view. In this paper (Part 1) the basic methodology of spline approximation is demonstrated using splines constructed from ordinary polynomials and splines constructed from truncated polynomials. In the forthcoming Part 2 the notion of B-spline will be explained in a unique way, namely by using the concept of convex combinations. The numerical stability of all spline approximation approaches as well as the utilization of splines for deformation detection will be investigated on numerical examples in Part 3.
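As a companion to the description above, a least-squares fit of a spline constructed from truncated polynomials (the truncated-power basis treated in Part 1) can be sketched as follows; the data and knot placement are hypothetical, not taken from the article:

    import numpy as np

    def truncated_power_basis(x, knots, degree=3):
        """Design matrix of the truncated-power spline basis:
        1, x, ..., x^p, (x - k_1)_+^p, ..., (x - k_m)_+^p."""
        cols = [x ** d for d in range(degree + 1)]
        cols += [np.clip(x - k, 0.0, None) ** degree for k in knots]
        return np.column_stack(cols)

    # Hypothetical noisy 2D curve standing in for a scanned profile.
    rng = np.random.default_rng(1)
    x = np.linspace(0.0, 10.0, 200)
    y = np.sin(x) + 0.05 * rng.normal(size=x.size)

    knots = np.linspace(1.0, 9.0, 8)               # interior knots chosen a priori
    A = truncated_power_basis(x, knots)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares spline approximation
    y_fit = A @ coef
    print("RMS residual:", np.sqrt(np.mean((y - y_fit) ** 2)))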
ERIC Educational Resources Information Center
Vázquez-Alonso, Ángel; Manassero-Mas, María-Antonia; García-Carmona, Antonio; Montesano de Talavera, Marisa
2016-01-01
This study applies a new quantitative methodological approach to diagnose epistemology conceptions in a large sample. The analyses use seven multiple-rating items on the epistemology of science drawn from the item pool Views on Science-Technology-Society (VOSTS). The bases of the new methodological diagnostic approach are the empirical…
AERIS: An Integrated Domain Information System for Aerospace Science and Technology
ERIC Educational Resources Information Center
Hatua, Sudip Ranjan; Madalli, Devika P.
2011-01-01
Purpose: The purpose of this paper is to discuss the methodology in building an integrated domain information system with illustrations that provide proof of concept. Design/methodology/approach: The present work studies the usual search engine approach to information and its pitfalls. A methodology was adopted for construction of a domain-based…
Kubař, Tomáš; Elstner, Marcus
2013-04-28
In this work, a fragment-orbital density functional theory-based method is combined with two different non-adiabatic schemes for the propagation of the electronic degrees of freedom. This allows us to perform unbiased simulations of electron transfer processes in complex media, and the computational scheme is applied to the transfer of a hole in solvated DNA. It turns out that the mean-field approach, where the wave function of the hole is driven into a superposition of adiabatic states, leads to over-delocalization of the hole charge. This problem is avoided using a surface hopping scheme, resulting in a smaller rate of hole transfer. The method is highly efficient due to the on-the-fly computation of the coarse-grained DFT Hamiltonian for the nucleobases, which is coupled to the environment using a QM/MM approach. The computational efficiency and partial parallel character of the methodology make it possible to simulate electron transfer in systems of relevant biochemical size on a nanosecond time scale. Since standard non-polarizable force fields are applied in the molecular-mechanics part of the calculation, a simple scaling scheme was introduced into the electrostatic potential in order to simulate the effect of electronic polarization. It is shown that electronic polarization has an important effect on the features of charge transfer. The methodology is applied to two kinds of DNA sequences, illustrating the features of transfer along a flat energy landscape as well as over an energy barrier. The performance and relative merit of the mean-field scheme and the surface hopping for this application are discussed.
NASA Astrophysics Data System (ADS)
Gosselin, Jeremy M.; Dosso, Stan E.; Cassidy, John F.; Quijano, Jorge E.; Molnar, Sheri; Dettmer, Jan
2017-10-01
This paper develops and applies a Bernstein-polynomial parametrization to efficiently represent general, gradient-based profiles in nonlinear geophysical inversion, with application to ambient-noise Rayleigh-wave dispersion data. Bernstein polynomials provide a stable parametrization in that small perturbations to the model parameters (basis-function coefficients) result in only small perturbations to the geophysical parameter profile. A fully nonlinear Bayesian inversion methodology is applied to estimate shear wave velocity (VS) profiles and uncertainties from surface wave dispersion data extracted from ambient seismic noise. The Bayesian information criterion is used to determine the appropriate polynomial order consistent with the resolving power of the data. Data error correlations are accounted for in the inversion using a parametric autoregressive model. The inversion solution is defined in terms of marginal posterior probability profiles for VS as a function of depth, estimated using Metropolis-Hastings sampling with parallel tempering. This methodology is applied to synthetic dispersion data as well as data processed from passive array recordings collected on the Fraser River Delta in British Columbia, Canada. Results from this work are in good agreement with previous studies, as well as with co-located invasive measurements. The approach considered here is better suited than `layered' modelling approaches in applications where smooth gradients in geophysical parameters are expected, such as soil/sediment profiles. Further, the Bernstein polynomial representation is more general than smooth models based on a fixed choice of gradient type (e.g. power-law gradient) because the form of the gradient is determined objectively by the data, rather than by a subjective parametrization choice.
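A sketch of the Bernstein-polynomial parametrization described above, evaluating a smooth shear-wave velocity profile from basis-function coefficients; the coefficients and depths below are hypothetical, not the Fraser River Delta results:

    import numpy as np
    from math import comb

    def bernstein_profile(coeffs, depth, z_max):
        """Evaluate a Bernstein-polynomial parametrization of a profile
        (e.g., shear-wave velocity vs. depth). coeffs are the basis-function
        coefficients inferred in the inversion; t = depth / z_max lies in [0, 1]."""
        n = len(coeffs) - 1
        t = np.asarray(depth) / z_max
        basis = np.array([comb(n, k) * t ** k * (1 - t) ** (n - k) for k in range(n + 1)])
        return coeffs @ basis

    # Hypothetical coefficients for a smooth VS gradient (m/s); illustrative only.
    coeffs = np.array([150.0, 220.0, 300.0, 420.0, 500.0])
    z = np.linspace(0.0, 40.0, 5)
    print(bernstein_profile(coeffs, z, z_max=40.0))

Small perturbations of the coefficients perturb the profile only slightly, which is the stability property the parametrization is chosen for.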
Final Technical Report: Distributed Controls for High Penetrations of Renewables
DOE Office of Scientific and Technical Information (OSTI.GOV)
Byrne, Raymond H.; Neely, Jason C.; Rashkin, Lee J.
2015-12-01
The goal of this effort was to apply four potential control analysis/design approaches to the design of distributed grid control systems to address the impact of latency and communications uncertainty with high penetrations of photovoltaic (PV) generation. The four techniques considered were: optimal fixed structure control; Nyquist stability criterion; vector Lyapunov analysis; and Hamiltonian design methods. A reduced order model of the Western Electricity Coordinating Council (WECC) developed for the Matlab Power Systems Toolbox (PST) was employed for the study, as well as representative smaller systems (e.g., a two-area, three-area, and four-area power system). Excellent results were obtained with the optimal fixed structure approach, and the methodology we developed was published in a journal article. This approach is promising because it offers a method for designing optimal control systems with the feedback signals available from Phasor Measurement Unit (PMU) data as opposed to full state feedback or the design of an observer. The Nyquist approach inherently handles time delay and incorporates performance guarantees (e.g., gain and phase margin). We developed a technique that works for moderate sized systems, but the approach does not scale well to extremely large systems because of computational complexity. The vector Lyapunov approach was applied to a two area model to demonstrate the utility for modeling communications uncertainty. Application to large power systems requires a method to automatically expand/contract the state space and partition the system so that communications uncertainty can be considered. The Hamiltonian Surface Shaping and Power Flow Control (HSSPFC) design methodology was selected to investigate grid systems for energy storage requirements to support high penetration of variable or stochastic generation (such as wind and PV) and loads. This method was applied to several small system models.
[Methodological approaches to the creation of healthy food].
Kornen, N N; Viktorova, E P; Evdokimova, O V
2015-01-01
The paper substantiates the need for the creation of healthy food products and proposes their classification. Methodological approaches to the creation of healthy foods are formulated for enriched, functional and specialized-purpose products.
MetaSort untangles metagenome assembly by reducing microbial community complexity
Ji, Peifeng; Zhang, Yanming; Wang, Jinfeng; Zhao, Fangqing
2017-01-01
Most current approaches to analyse metagenomic data rely on reference genomes. Novel microbial communities extend far beyond the coverage of reference databases, and de novo metagenome assembly from complex microbial communities remains a great challenge. Here we present a novel experimental and bioinformatic framework, metaSort, for effective construction of bacterial genomes from metagenomic samples. MetaSort provides a sorted mini-metagenome approach based on flow cytometry and single-cell sequencing methodologies, and employs new computational algorithms to efficiently recover high-quality genomes from the sorted mini-metagenome through its complementarity with the original metagenome. Through extensive evaluations, we demonstrated that metaSort has an excellent and unbiased performance on genome recovery and assembly. Furthermore, we applied metaSort to an unexplored microflora colonizing the surface of marine kelp and successfully recovered 75 high-quality genomes at one time. This approach will greatly improve access to microbial genomes from complex or novel communities. PMID:28112173
Developing comparative criminology and the case of China: an introduction.
Liu, Jianhong
2007-02-01
Although comparative criminology has made significant development during the past decade or so, systematic empirical research has only developed along a few topics. Comparative criminology has never occupied a central position in criminology. This article analyzes the major theoretical and methodological impediments in the development of comparative criminology. It stresses a need to shift methodology from a conventional primary approach that uses the nation as the unit of analysis to an in-depth case study method as a primary methodological approach. The article maintains that case study method can overcome the limitation of its descriptive tradition and become a promising methodological approach for comparative criminology.
Surface composition of Mercury from reflectance spectrophotometry
NASA Technical Reports Server (NTRS)
Vilas, Faith
1988-01-01
The controversies surrounding the existing spectra of Mercury are discussed together with the various implications for interpretations of Mercury's surface composition. Special attention is given to the basic procedure used for reducing reflectance spectrophotometry data, the factors that must be accounted for in the reduction of these data, and the methodology for defining the portion of the surface contributing the greatest amount of light to an individual spectrum. The application of these methodologies to Mercury's spectra is presented.
3DSEM++: Adaptive and intelligent 3D SEM surface reconstruction.
Tafti, Ahmad P; Holz, Jessica D; Baghaie, Ahmadreza; Owen, Heather A; He, Max M; Yu, Zeyun
2016-08-01
Structural analysis of microscopic objects is a longstanding topic in several scientific disciplines, such as the biological, mechanical, and materials sciences. The scanning electron microscope (SEM), as a promising imaging instrument, has been used for decades to determine the surface properties (e.g., composition or geometry) of specimens, achieving increased magnification, contrast, and resolution greater than one nanometer. Whereas SEM micrographs remain two-dimensional (2D), many research and educational questions truly require knowledge of the three-dimensional (3D) structures they depict. 3D surface reconstruction from SEM images leads to remarkable understanding of microscopic surfaces, allowing informative and qualitative visualization of the samples being investigated. In this contribution, we integrate several computational technologies, including machine learning, the a contrario methodology, and epipolar geometry, to design and develop a novel and efficient method called 3DSEM++ for multi-view 3D SEM surface reconstruction in an adaptive and intelligent fashion. Experiments performed on real and synthetic data show that the approach achieves significant precision in both SEM extrinsic calibration and 3D surface modeling.
A variable-gain output feedback control design methodology
NASA Technical Reports Server (NTRS)
Halyo, Nesim; Moerder, Daniel D.; Broussard, John R.; Taylor, Deborah B.
1989-01-01
A digital control system design technique is developed in which the control system gain matrix varies with the plant operating point parameters. The design technique is obtained by formulating the problem as an optimal stochastic output feedback control law with variable gains. This approach provides a control theory framework within which the operating range of a control law can be significantly extended. Furthermore, the approach avoids the major shortcomings of the conventional gain-scheduling techniques. The optimal variable gain output feedback control problem is solved by embedding the Multi-Configuration Control (MCC) problem, previously solved at ICS. An algorithm to compute the optimal variable gain output feedback control gain matrices is developed. The algorithm is a modified version of the MCC algorithm improved so as to handle the large dimensionality which arises particularly in variable-gain control problems. The design methodology developed is applied to a reconfigurable aircraft control problem. A variable-gain output feedback control problem was formulated to design a flight control law for an AFTI F-16 aircraft which can automatically reconfigure its control strategy to accommodate failures in the horizontal tail control surface. Simulations of the closed-loop reconfigurable system show that the approach produces a control design which can accommodate such failures with relative ease. The technique can be applied to many other problems including sensor failure accommodation, mode switching control laws and super agility.
NASA Astrophysics Data System (ADS)
Schöttl, Peter; Bern, Gregor; van Rooyen, De Wet; Heimsath, Anna; Fluri, Thomas; Nitz, Peter
2017-06-01
A transient simulation methodology for cavity receivers for Solar Tower Central Receiver Systems with molten salt as heat transfer fluid is described. Absorbed solar radiation is modeled with ray tracing and a sky discretization approach to reduce computational effort. Solar radiation re-distribution in the cavity as well as thermal radiation exchange are modeled based on view factors, which are also calculated with ray tracing. An analytical approach is used to represent convective heat transfer in the cavity. Heat transfer fluid flow is simulated with a discrete tube model, where the boundary conditions at the outer tube surface mainly depend on inputs from the previously mentioned modeling aspects. A specific focus is put on the integration of optical and thermo-hydraulic models. Furthermore, aiming point and control strategies are described, which are used during the transient performance assessment. Eventually, the developed simulation methodology is used for the optimization of the aperture opening size of a PS10-like reference scenario with cavity receiver and heliostat field. The objective function is based on the cumulative gain of one representative day. Results include optimized aperture opening size, transient receiver characteristics and benefits of the implemented aiming point strategy compared to a single aiming point approach. Future work will include annual simulations, cost assessment and optimization of a larger range of receiver parameters.
An Investigation of Land-Atmosphere Coupling from Local to Regional Scales
NASA Astrophysics Data System (ADS)
Brunsell, N. A.; Van Vleck, E.; Rahn, D. A.
2017-12-01
The exchanges of mass and energy between the surface and atmosphere have been shown to depend upon both local and regional climatic influences. However, the degree of control exerted by the land surface on the coupling metrics is not well understood. In particular, we lack an understanding of the relationship between the local microclimate of a site and the regional forces responsible for land-atmosphere coupling. To address this, we investigate a series of metrics calculated from eddy covariance data and ceilometer data, land surface modeling and remotely sensed observations in the central United States to diagnose these interactions and predict the change from one coupling regime (e.g. wet/dry coupling) to another state. The stability of the coupling is quantified using a Lyapunov-exponent-based methodology. Through the use of a wavelet information theoretic approach, we isolate the roles of local energy partitioning, as well as of the temperature and moisture gradients, in controlling and changing the coupling regime. Taking a multi-scale observational approach, we first examine the relationship at the tower scale. Using land surface models, we quantify to what extent current models are capable of properly diagnosing the dynamics of the coupling regime. In particular, we focus on the role of surface moisture and vegetation in initiating and maintaining precipitation feedbacks. We extend this analysis to the regional scale by utilizing reanalysis and remotely sensed observations. Thus, we are able to quantify the changes in observed coupling patterns with linkages to local interactions to address the question of the local control that the surface exerts over the maintenance of land-atmosphere coupling.
ERIC Educational Resources Information Center
Ornellas, Adriana; Muñoz Carril, Pablo César
2014-01-01
This article outlines a methodological approach to the creation, production and dissemination of online collaborative audio-visual projects, using new social learning technologies and open-source video tools, which can be applied to any e-learning environment in higher education. The methodology was developed and used to design a course in the…
Kansei, surfaces and perception engineering
NASA Astrophysics Data System (ADS)
Rosen, B.-G.; Eriksson, L.; Bergman, M.
2016-09-01
The aesthetic and pleasing properties of a product are important and add significantly to its meaning and relevance. Customer sensation and perception are largely about psychological factors. There has been a strong industrial and academic need for, and interest in, methods and tools to quantify and link product properties to the human response, but there has been a lack of studies on the impact of surfaces. In this study, affective surface engineering is used to illustrate and model the link between customer expectations and perception and controllable product surface properties. The results highlight the use of the soft metrology concept for linking the physical and human factors contributing to the perception of products. Examples of surface applications of the Kansei methodology are presented from the sauna bath, health care, architectural and hygiene tissue application areas to illustrate, discuss and confirm the strength of the methodology. In the conclusions of the study, future research in soft metrology is proposed to allow the understanding and modelling of product perception and sensations, in combination with further development of the Kansei surface engineering methodology and software tools.
NASA Astrophysics Data System (ADS)
Bacchi, Vito; Duluc, Claire-Marie; Bertrand, Nathalie; Bardet, Lise
2017-04-01
In recent years, in the context of hydraulic risk assessment, much effort has been put into the development of sophisticated numerical model systems able to reproduce the surface flow field. These numerical models are based on a deterministic approach and the results are presented in terms of measurable quantities (water depths, flow velocities, etc.). However, the modelling of surface flows involves numerous uncertainties associated with the numerical structure of the model, with the knowledge of the physical parameters which force the system and with the randomness inherent to natural phenomena. As a consequence, dealing with uncertainties can be a difficult task for both modelers and decision-makers [Ioss, 2011]. In the context of nuclear safety, IRSN assesses studies conducted by operators for different reference flood situations (local rain, small or large watershed flooding, sea levels, etc.), which are defined in the guide ASN N°13 [ASN, 2013]. The guide provides some recommendations for dealing with uncertainties, proposing a specific conservative approach to cover hydraulic modelling uncertainties. Depending on the situation, the influencing parameter might be the Strickler coefficient, levee behavior, simplified topographic assumptions, etc. Obviously, identifying the most influencing parameter and giving it a penalizing value is challenging and usually questionable. In this context, IRSN has conducted cooperative research activities (with Compagnie Nationale du Rhone, the I-CiTy laboratory of Polytech'Nice, the Atomic Energy Commission and the Bureau de Recherches Géologiques et Minières) since 2011 in order to investigate the feasibility and benefits of Uncertainty Analysis (UA) and Global Sensitivity Analysis (GSA) applied to hydraulic modelling. A specific methodology was tested using the computational environment Promethee, developed by IRSN, which allows uncertainty propagation studies to be carried out. This methodology was applied with various numerical models and in different contexts, such as river flooding on the Rhône River (Nguyen et al., 2015) and on the Garonne River, the study of local rainfall (Abily et al., 2016), and tsunami generation in the framework of the ANR research project TANDEM. The feedback from these previous studies is analyzed (technical problems, limitations, interesting results, etc.), and perspectives and a discussion of how a probabilistic treatment of uncertainties could improve the current deterministic methodology for risk assessment (and other engineering applications) are finally given.
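The core of such an uncertainty propagation study can be illustrated with a Monte Carlo sketch in Python. The "hydraulic model" below is a deliberately crude normal-depth relation standing in for a full 2D solver, and the input distributions are hypothetical, not those used in the IRSN studies:

    import numpy as np
    from scipy.stats import spearmanr

    def water_depth(strickler, discharge, slope, width):
        """Toy wide-rectangular-channel Manning-Strickler relation:
        Q = K * B * h^(5/3) * sqrt(S)  =>  h = (Q / (K * B * sqrt(S)))^(3/5)."""
        return (discharge / (strickler * width * np.sqrt(slope))) ** 0.6

    rng = np.random.default_rng(42)
    n = 10_000
    # Hypothetical input distributions for the uncertain parameters:
    K = rng.uniform(20.0, 40.0, n)        # Strickler roughness coefficient
    Q = rng.normal(3000.0, 300.0, n)      # upstream discharge, m3/s
    S = rng.uniform(1e-4, 5e-4, n)        # bed slope
    h = water_depth(K, Q, S, width=300.0)

    print("mean depth %.2f m, 95%% quantile %.2f m" % (h.mean(), np.quantile(h, 0.95)))
    # A crude global sensitivity indicator: rank correlation of each input with the output.
    for name, x in [("K", K), ("Q", Q), ("S", S)]:
        print(name, "Spearman rho = %.2f" % spearmanr(x, h).correlation)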
NASA Astrophysics Data System (ADS)
Aleina, Sara Cresto; Viola, Nicole; Fusaro, Roberta; Saccoccia, Giorgio
2017-10-01
Exploration technology roadmaps have been developed by ESA in the past few years and the latest edition was released in 2015. The scope of these technology roadmaps, elaborated in consultation with the different ESA stakeholders (e.g. European industries and research entities), is to provide a powerful tool for strategic, programmatic and technical decisions in support of the European role within an international space exploration context. In the context of preparation for possible future European Moon exploration initiatives, the technology roadmaps have been used to highlight the role of technology within the Missions, Building Blocks and Operational Capabilities of relevance. In particular, as part of reference missions to the Moon that would fit in the time frame 2020 to 2030, ESA has addressed the definition of lunar surface exploration missions in line with its space exploration strategy, with the common mission goals of returning samples from the Moon and Mars and expanding human presence to these destinations in a step-wise approach. The roadmaps for the procurement of technologies required for the first mission elements of the above strategy have been elaborated through their main building blocks, i.e. visual navigation, hazard detection and avoidance; sample acquisition, processing and containment systems; surface mobility elements; tele-robotic and autonomous control systems; and storable propulsion modules and equipment. Technology prioritization methodologies have been developed in support of the ESA Exploration Technology Roadmaps, in order to provide logical and quantitative instruments to verify prioritization choices that would otherwise rest on important but non-quantitative factors. These methodologies, which are thoroughly described in the first part of the paper, proceed through subsequent steps: first, technology prioritization criteria are selected; then decision trees are developed to highlight all feasible paths of combination of the prioritization criteria and to assess the final achievement of each path, i.e. its cost-effectiveness. The risk associated with each path is also evaluated. In the second part of the paper, these prioritization methodologies are applied to some of the building blocks of relevance for the mission concepts under evaluation at ESA (such as tele-robotic and autonomous control systems, and storable propulsion modules and equipment), and the results are presented to highlight the approach for an effective TRL increase. Finally, the main conclusions are drawn.
Junyong Zhu; G.S. Wang; X.J. Pan; Roland Gleisner
2009-01-01
Sieving methods have been used almost exclusively for feedstock size-reduction characterization in the biomass refining literature. This study demonstrates a methodology to properly characterize the specific surface of biomass substrates through two-dimensional measurement of each fiber of the substrate using a wet imaging technique. The methodology provides more...
Jae-Won Lee; Rita C.L.B. Rodrigues; Thomas W. Jeffries
2009-01-01
Response surface methodology was used to evaluate optimal time, temperature and oxalic acid concentration for simultaneous saccharification and fermentation (SSF) of corncob particles by Pichia stipitis CBS 6054. Fifteen different conditions for pretreatment were examined in a 2^3 full factorial design with six axial points. Temperatures ranged from 132 to 180º...
Conforto, Egle; Joguet, Nicolas; Buisson, Pierre; Vendeville, Jean-Eudes; Chaigneau, Carine; Maugard, Thierry
2015-02-01
The aim of this paper is to describe an optimized methodology for studying the surface characteristics and internal structure of biopolymer capsules using scanning electron microscopy (SEM) in environmental mode. The main advantage of this methodology is that no preparation is required and, significantly, no metallic coating is deposited on the surface of the specimen, thus preserving the original capsule shape and its surface morphology. This avoids introducing preparation artefacts which could modify the capsule surface and mask information concerning important features such as porosity or roughness. Using this method, gelatin and, in particular, fatty coatings, which are difficult to analyze with standard SEM techniques, unambiguously show fine details of their surface morphology without damage. Furthermore, chemical contrast is preserved in backscattered electron images of unprepared samples, allowing visualization of the internal organization of the capsule, the quality of the envelope, etc. This study provides pointers on how to obtain optimal conditions for the analysis of biological or sensitive material, which is not always studied using appropriate techniques. A reliable evaluation of the parameters used in capsule elaboration for research and industrial applications, as well as of capsule functionality, is provided by this methodology, which is essential for technological progress in this domain.
Minasian-Batmanian, Laura C; Lingard, Jennifer; Prosser, Michael
2005-11-01
Student approaches to learning vary from surface approaches to meaningful, deep learning practices. Differences in approach may be related to students' conceptions of the subject, perceptions of the learning environment, prior study experiences and performance on assessment. This study aims to explore entering students' conceptions of the unit they are about to study and how they intend to approach their studies. It involved a survey of 203 (of 250) first year students in a cross disciplinary unit in the Faculty of Health Sciences. They were asked to complete an open-ended response survey, including questions on what they thought they needed to do to learn biochemistry and what they thought the study of biochemistry was about. A phenomenographic methodology was used to identify categories of description for the questions. The paper will describe the categories in detail, the structural relationship between the categories and the distribution of responses within categories. The study reports a relationship between conception of the topic and approaches to learning. Students with more complex and coherent conceptions of the topic report that they were more likely to adopt deeper approaches to study than those with more fragmented conceptions. However, compared to previous studies, a surprisingly high proportion of students with more cohesive conceptions still intended to adopt more surface approaches. This may reflect the particular context of their learning, namely in a compulsory unit involving material for which most students have minimal background and difficulty seeing its relevance. Implications for teaching such foundation material are discussed.
NASA Astrophysics Data System (ADS)
Ayad, G.; Song, J.; Barriere, T.; Liu, B.; Gelin, J. C.
2007-05-01
The paper is concerned with the optimization and parametric identification of the Powder Injection Molding process, which consists first of the injection of a powder mixture with a polymer binder and then of the sintering of the resulting powder parts by solid-state diffusion. The first part describes an original methodology to optimize the injection stage based on the combination of Design of Experiments and adaptive Response Surface Modeling. The second part of the paper describes the identification strategy proposed for the sintering stage, using the identification of sintering parameters from dilatometer curves followed by the optimization of the sintering process. The proposed approaches are applied to the optimization of the manufacturing of a ceramic femoral implant, and it is demonstrated that they give satisfactory results.
Photonic Resins: Designing Optical Appearance via Block Copolymer Self-Assembly.
Song, Dong-Po; Jacucci, Gianni; Dundar, Feyza; Naik, Aditi; Fei, Hua-Feng; Vignolini, Silvia; Watkins, James J
2018-03-27
Although a huge variety of methodologies has been proposed to produce photonic structures by self-assembly, the lack of an effective fabrication approach has hindered their practical use. These approaches are typically limited by poor control of both optical and mechanical properties. Here we report photonic thermosetting polymeric resins obtained through brush block copolymer (BBCP) self-assembly. We demonstrate that control of the interplay between order and disorder in the obtained photonic structure offers a powerful toolbox for designing the optical appearance of the polymer resins in terms of reflected wavelength and scattering properties. The obtained materials exhibit excellent mechanical properties, with hardness up to 172 MPa and Young's modulus over 2.9 GPa, indicating great potential for practical use as photonic coatings on a variety of surfaces.
Ye, Liu; Ni, Bing-Jie; Law, Yingyu; Byers, Craig; Yuan, Zhiguo
2014-01-01
The quantification of nitrous oxide (N2O) emissions from open-surface wastewater treatment systems with surface aerators is difficult, as emissions from the surface aerator zone cannot be easily captured by floating hoods. In this study, we propose and demonstrate a novel methodology to estimate N2O emissions from such systems through determination of the N2O transfer coefficient (kLa) induced by the surface aerators, based on an oxygen balance for the entire system. The methodology is demonstrated through its application to a full-scale open oxidation ditch wastewater treatment plant with surface aerators. The kLa profile estimated from a month-long measurement campaign for the oxygen balance, intensive monitoring of dissolved N2O profiles along the oxidation ditch over a period of four days, and mathematical modelling enabled the N2O emission factor of this treatment plant to be determined (0.52 ± 0.16%). The majority of the N2O emission was found to occur in the surface aerator zone and would be missed if the gas hood method were applied alone.
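The emission estimate in such a transfer-coefficient approach reduces to a simple mass-transfer calculation once kLa is known. A minimal Python sketch with illustrative numbers (not the plant data); the diffusivity-ratio scaling from O2 to N2O is an assumption of penetration theory, and all concentrations and volumes below are hypothetical:

    # kLa for O2 is obtained from the oxygen mass balance over the aerated zone;
    # kLa for N2O is scaled by the square root of the ratio of liquid diffusivities.
    kla_o2 = 4.0                        # 1/h, assumed value from an oxygen balance
    D_n2o, D_o2 = 1.8e-9, 2.1e-9        # m2/s, approximate liquid diffusivities near 20 C
    kla_n2o = kla_o2 * (D_n2o / D_o2) ** 0.5

    C_liquid = 0.05                     # g N2O-N / m3, measured dissolved N2O (hypothetical)
    C_star = 0.0003                     # g N2O-N / m3, equilibrium with ambient air (~negligible)
    V_zone = 800.0                      # m3, liquid volume assigned to the surface aerators

    emission = kla_n2o * (C_liquid - C_star) * V_zone * 24.0   # g N2O-N per day
    print("N2O emission ~ %.0f g N2O-N per day" % emission)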
Minjares-Fuentes, R; Femenia, A; Garau, M C; Meza-Velázquez, J A; Simal, S; Rosselló, C
2014-06-15
An ultrasound-assisted procedure for the extraction of pectins from grape pomace with citric acid as the extracting agent was established. A Box-Behnken design (BBD) was employed to optimize the extraction temperature (X1: 35-75°C), extraction time (X2: 20-60 min) and pH (X3: 1.0-2.0) in order to obtain a high yield of pectins with high average molecular weight (MW) and degree of esterification (DE) from grape pomace. Analysis of variance showed that the contribution of a quadratic model was significant for the pectin extraction yield and for pectin MW, whereas the DE of the pectins was more influenced by a linear model. An optimization study using response surface methodology was performed and 3D response surfaces were plotted from the mathematical model. According to the RSM model, the highest pectin yield (∼32.3%) can be achieved when the UAE process is carried out at 75°C for 60 min using a citric acid solution of pH 2.0. These pectic polysaccharides, composed mainly of galacturonic acid units (<97% of total sugars), have an average MW of 163.9 kDa and a DE of 55.2%. Close agreement between experimental and predicted values was found. These results suggest that ultrasound-assisted extraction could be a good option for the extraction of functional pectins with citric acid from grape pomace at the industrial level.
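The coded run matrix of a three-factor Box-Behnken design like the one above is easy to construct by hand. A short Python sketch (the factor names in the comment map to the study above; the center-point count is an assumption):

    from itertools import combinations
    import numpy as np

    def box_behnken(n_factors, n_center=3):
        """Coded Box-Behnken design: for every pair of factors take the four
        (+/-1, +/-1) combinations with the remaining factors at 0, then append
        center points."""
        runs = []
        for i, j in combinations(range(n_factors), 2):
            for a in (-1, 1):
                for b in (-1, 1):
                    row = [0] * n_factors
                    row[i], row[j] = a, b
                    runs.append(row)
        runs += [[0] * n_factors] * n_center
        return np.array(runs, dtype=float)

    # Three coded factors (temperature, time, pH); mapping the -1/0/+1 levels to
    # the real ranges (35-75 C, 20-60 min, pH 1.0-2.0) is then a linear rescaling.
    design = box_behnken(3)
    print(design.shape)   # (15, 3): 12 edge points + 3 center points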
Barik, Anwesha; Banerjee, Satarupa; Dhara, Santanu; Chakravorty, Nishant
2017-04-01
Complexities in full-genome expression studies hinder the extraction of tracker genes for analyzing the course of biological events. In this study, we demonstrate the application of supervised machine learning methods to reduce the irrelevance in microarray data series and thereby extract robust molecular markers to track biological processes. The methodology is illustrated by analyzing whole-genome expression studies on bone-implant integration (osseointegration). Being a biological process, osseointegration is known to leave a trail of genetic footprints during its course. In spite of the enormous amount of raw data in public repositories, researchers still do not have access to a panel of genes that can definitively track osseointegration. The results from our study revealed that panels comprising matrix metalloproteinase and collagen genes were able to track osseointegration on implant surfaces (MMP9 and COL1A2 on micro-textured surfaces; MMP12 and COL6A3 on superimposed nano-textured surfaces) with 100% classification accuracy, specificity and sensitivity. Further, our analysis showed the importance of the progression of time in the establishment of the mechanical connection at the bone-implant surface. The findings from this study are expected to be useful to researchers investigating the osseointegration of novel implant materials, especially at the early stage. The methodology demonstrated can be easily adapted by scientists in different fields to analyze large databases for other biological processes.
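The generic workflow of such marker extraction (rank genes by importance under a supervised classifier, then cross-validate a small panel) can be sketched in a few lines of Python with scikit-learn. The expression matrix and labels below are synthetic stand-ins, not the microarray series analyzed in the study:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # Hypothetical expression matrix: rows = samples, columns = genes;
    # labels distinguish two stages of a biological process.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 500))
    y = np.repeat([0, 1], 20)
    X[y == 1, :2] += 2.0            # make the first two "genes" informative

    clf = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, y)
    ranked = np.argsort(clf.feature_importances_)[::-1]
    panel = ranked[:2]              # candidate tracker-gene panel

    score = cross_val_score(RandomForestClassifier(n_estimators=300, random_state=0),
                            X[:, panel], y, cv=5).mean()
    print("panel:", panel, "cross-validated accuracy: %.2f" % score)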
Recording polarization gratings with a standing spiral wave
NASA Astrophysics Data System (ADS)
Vernon, Jonathan P.; Serak, Svetlana V.; Hakobyan, Rafik S.; Aleksanyan, Artur K.; Tondiglia, Vincent P.; White, Timothy J.; Bunning, Timothy J.; Tabiryan, Nelson V.
2013-11-01
A scalable and robust methodology for writing cycloidal modulation patterns of optical axis orientation in photosensitive surface alignment layers is demonstrated. Counterpropagating circularly polarized beams, generated by reflection of the input beam from a cholesteric liquid crystal, direct local surface orientation in a photosensitive surface. Purposely introducing a slight angle between the input beam and the photosensitive surface normal introduces a grating period/orientation that is readily controlled and templated. The resulting cycloidal diffractive waveplates offer utility in technologies requiring diffraction over a broad range of angles/wavelengths. This simple methodology of forming polarization gratings offers advantages over conventional fabrication techniques.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guerrero, R. D., E-mail: rdguerrerom@unal.edu.co; Arango, C. A., E-mail: caarango@icesi.edu.co; Reyes, A., E-mail: areyesv@unal.edu.co
We recently proposed a Quantum Optimal Control (QOC) method constrained to build pulses from analytical pulse shapes [R. D. Guerrero et al., J. Chem. Phys. 143(12), 124108 (2015)]. This approach was applied to control the dissociation channel yields of the diatomic molecule KH, considering three potential energy curves and one degree of freedom. In this work, we utilized this methodology to study the strong field control of the cis-trans photoisomerization of 11-cis retinal. This more complex system was modeled with a Hamiltonian comprising two potential energy surfaces and two degrees of freedom. The resulting optimal pulse, made of 6 linearly chirped pulses, was capable of controlling the population of the trans isomer on the ground electronic surface for nearly 200 fs. The simplicity of the pulse generated with our QOC approach offers two clear advantages: a direct analysis of the sequence of events occurring during the driven dynamics, and its reproducibility in the laboratory with current laser technologies.
Controlled grafting of vinylic monomers on polyolefins: a robust mathematical modeling approach.
Saeb, Mohammad Reza; Rezaee, Babak; Shadman, Alireza; Formela, Krzysztof; Ahmadi, Zahed; Hemmati, Farkhondeh; Kermaniyan, Tayebeh Sadat; Mohammadi, Yousef
2017-01-01
Experimental and mathematical modeling analyses were used to control the melt free-radical grafting of vinylic monomers on polyolefins and, thereby, to reduce the disturbance of undesired cross-linking of the polyolefins. Response surface, desirability function, and artificial intelligence methodologies were blended for the modeling and optimization of the grafting reaction in terms of vinylic monomer content, peroxide initiator concentration, and melt-processing time. An in-house code was developed based on an artificial neural network that learns and mimics the processing torque and the grafting of glycidyl methacrylate (GMA), a typical vinylic monomer, on high-density polyethylene (HDPE). The application of response surface and desirability function methodologies enabled concurrent optimization of processing torque and GMA grafting on HDPE, through which we quantified for the first time the competition between parallel reactions taking place during melt processing: (i) desirable grafting of GMA on HDPE; (ii) undesirable cross-linking of HDPE. The proposed robust mathematical modeling approach can precisely learn the behavior of the grafting reaction of vinylic monomers on polyolefins and be put into practice to find the exact operating conditions needed for efficient grafting of reactive monomers on polyolefins.
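The desirability-function step used for such concurrent optimization combines individual desirabilities into a single score. A minimal Python sketch of the Derringer-Suggett form; the response values and target ranges are hypothetical, not the paper's data:

    import numpy as np

    def desirability_larger_is_better(y, y_min, y_max, weight=1.0):
        """One-sided desirability: 0 below y_min, 1 above y_max, ramp in between."""
        d = np.clip((y - y_min) / (y_max - y_min), 0.0, 1.0)
        return d ** weight

    def overall_desirability(d_values):
        """Geometric mean of the individual desirabilities."""
        d = np.asarray(d_values, dtype=float)
        return float(np.prod(d) ** (1.0 / d.size))

    # Hypothetical responses for one melt-processing condition:
    grafting_pct = 1.8    # GMA grafting on HDPE, wt% (to be maximized)
    torque_stab = 0.6     # normalized torque stability, higher = less cross-linking

    d1 = desirability_larger_is_better(grafting_pct, y_min=0.5, y_max=2.5)
    d2 = desirability_larger_is_better(torque_stab, y_min=0.2, y_max=1.0)
    print("overall desirability D = %.2f" % overall_desirability([d1, d2]))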
NASA Astrophysics Data System (ADS)
Vignesh, S.; Dinesh Babu, P.; Surya, G.; Dinesh, S.; Marimuthu, P.
2018-02-01
The ultimate goal of all production entities is to select the process parameters that yield maximum strength and minimum wear and friction. Friction and wear are serious problems in most industries; they are influenced by the working set of parameters, the oxidation characteristics and the mechanism of wear formation. The experimental input parameters, namely sliding distance, applied load, and temperature, are used to find the optimized solution for the desired output responses, namely coefficient of friction, wear rate, and volume loss. The optimization is performed with the help of the Elitist Non-dominated Sorting Genetic Algorithm (NSGA-II), an evolutionary method. The regression equations obtained using Response Surface Methodology (RSM) are used in determining the optimum process parameters. Further, the results achieved through the desirability approach in RSM are compared with the optimized solution obtained through NSGA-II. The results show that the proposed evolutionary technique is more effective and faster than the desirability approach.
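The core operation inside NSGA-II is non-dominated (Pareto) sorting of candidate solutions. A small Python sketch of extracting the first front for a two-objective minimization; the objective values are random placeholders, not predictions from the study's regression equations:

    import numpy as np

    def first_pareto_front(F):
        """Return indices of the non-dominated points for a minimization problem.
        F is an (n_points, n_objectives) array, e.g. [coefficient of friction, wear rate]."""
        n = F.shape[0]
        dominated = np.zeros(n, dtype=bool)
        for i in range(n):
            for j in range(n):
                if i != j and np.all(F[j] <= F[i]) and np.any(F[j] < F[i]):
                    dominated[i] = True
                    break
        return np.flatnonzero(~dominated)

    # Hypothetical objective values for 50 candidate parameter settings
    # (load, sliding distance, temperature), evaluated by some surrogate model:
    rng = np.random.default_rng(3)
    F = rng.uniform(size=(50, 2))
    print("Pareto-optimal candidates:", first_pareto_front(F))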
Karpf, Christian; Krebs, Peter
2011-05-01
The management of sewer systems requires information about the discharge and variability of typical wastewater sources in urban catchments. In particular, the infiltration of groundwater and the inflow of surface water (I/I) are important for making decisions about the rehabilitation and operation of sewer networks. This paper presents a methodology to identify I/I and estimate its quantity. For each flow fraction in the sewer network, an individual model approach is formulated whose parameters are optimised by the method of least squares. The method was applied to estimate the contributions to the wastewater flow in the sewer system of the City of Dresden (Germany), where data availability is good. Absolute flows of I/I and their temporal variations are estimated. Further information on the characteristics of infiltration is gained by clustering and grouping sewer pipes according to the attributes construction year and groundwater influence and relating the resulting classes to infiltration behaviour. Further, it is shown that condition classes based on CCTV data can be used to estimate the infiltration potential of sewer pipes.
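The flow-fraction separation idea can be illustrated with a least-squares sketch in Python: represent the measured flow as a weighted sum of simple component patterns and fit the weights. The component shapes and synthetic data below are illustrative choices, not the models calibrated for Dresden:

    import numpy as np

    t = np.arange(0, 7 * 24, dtype=float)                 # hourly steps, one week
    rain = np.zeros_like(t); rain[60:66] = 3.0            # hypothetical rain event (mm/h)

    diurnal = 1.0 + 0.4 * np.sin(2 * np.pi * (t % 24) / 24.0 - 1.5)  # wastewater pattern
    infiltration = np.ones_like(t)                        # slowly varying groundwater term
    inflow = np.convolve(rain, np.exp(-np.arange(12) / 3.0), mode="full")[: t.size]

    # Synthetic "measured" flow built from known weights plus noise:
    Q = 120 * diurnal + 45 * infiltration + 8 * inflow
    Q += np.random.default_rng(0).normal(0, 3, t.size)

    A = np.column_stack([diurnal, infiltration, inflow])
    w, *_ = np.linalg.lstsq(A, Q, rcond=None)             # least-squares fraction weights
    print("wastewater, infiltration, inflow weights:", np.round(w, 1))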
Souagui, Y; Tritsch, D; Grosdemange-Billiard, C; Kecha, M
2015-06-01
Optimization of the medium components and physicochemical parameters for antifungal production by an alkaliphilic and salt-tolerant actinomycete, designated Streptomyces sp. SY-BS5 and isolated from an arid region in the south of Algeria, was carried out. The strain showed broad-spectrum activity against pathogenic and toxinogenic fungi. Identification of the actinomycete strain was based on 16S rRNA gene sequencing. Antifungal production was optimized following one-factor-at-a-time (OFAT) and response surface methodology (RSM) approaches. The most suitable medium for growth and antifungal production was found using the one-factor-at-a-time methodology. The individual and interaction effects of three nutritional variables, the carbon source (glucose), the nitrogen source (yeast extract) and sodium chloride (NaCl), were optimized by a Box-Behnken design. Finally, the culture conditions for antifungal production, pH and temperature, were studied and determined. Analysis of the 16S rRNA gene sequence (1454 nucleotides) assigned this strain to the genus Streptomyces, with 99% similarity to Streptomyces cyaneofuscatus JCM4364(T), the most closely related species. The results of the optimization study show that concentrations of 3.476 g/L glucose, 3.876 g/L yeast extract and 41.140 g/L NaCl are responsible for the enhancement of antifungal production by Streptomyces sp. SY-BS5. The preferred culture conditions for antifungal production were pH 10 and a temperature of 30°C for 9 days. This study shows that RSM is a useful and powerful tool for the optimization of antifungal production from actinomycetes.
Methodology or method? A critical review of qualitative case study reports
Hyett, Nerida; Kenny, Amanda; Dickson-Swift, Virginia
2014-01-01
Despite on-going debate about credibility, and reported limitations in comparison to other approaches, case study is an increasingly popular approach among qualitative researchers. We critically analysed the methodological descriptions of published case studies. Three high-impact qualitative methods journals were searched to locate case studies published in the past 5 years; 34 were selected for analysis. Articles were categorized as health and health services (n=12), social sciences and anthropology (n=7), or methods (n=15) case studies. The articles were reviewed using an adapted version of established criteria to determine whether adequate methodological justification was present, and if study aims, methods, and reported findings were consistent with a qualitative case study approach. Findings were grouped into five themes outlining key methodological issues: case study methodology or method, case of something particular and case selection, contextually bound case study, researcher and case interactions and triangulation, and study design inconsistent with methodology reported. Improved reporting of case studies by qualitative researchers will advance the methodology for the benefit of researchers and practitioners. PMID:24809980
Gusev, E Yu; Chereshnev, V A
2013-01-01
Theoretical and methodological approaches to the description of systemic inflammation as a general pathological process are discussed. It is shown that a wide range of research types needs to be integrated in order to develop a model of systemic inflammation.
Beyond Composite Scores and Cronbach's Alpha: Advancing Methodological Rigor in Recreation Research
ERIC Educational Resources Information Center
Gagnon, Ryan J.; Stone, Garrett A.; Garst, Barry A.
2017-01-01
Critically examining common statistical approaches and their strengths and weaknesses is an important step in advancing recreation and leisure sciences. To continue this critical examination and to inform methodological decision making, this study compared three approaches to determine how alternative approaches may result in contradictory…
Recent Developments and Applications of the MMPBSA Method
Wang, Changhao; Greene, D'Artagnan; Xiao, Li; Qi, Ruxi; Luo, Ray
2018-01-01
The Molecular Mechanics Poisson-Boltzmann Surface Area (MMPBSA) approach has been widely applied as an efficient and reliable free energy simulation method to model molecular recognition, such as for protein-ligand binding interactions. In this review, we focus on recent developments and applications of the MMPBSA method. The methodology review covers solvation terms, the entropy term, extensions to membrane proteins and high-speed screening, and new automation toolkits. Recent applications in various important biomedical and chemical fields are also reviewed. We conclude with a few future directions aimed at making MMPBSA a more robust and efficient method. PMID:29367919
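For reference, the free energy decomposition that the MMPBSA approach evaluates is commonly written as follows (a standard textbook form, not quoted from the review); here $\gamma$ and $b$ are empirical nonpolar-solvation coefficients and SASA is the solvent-accessible surface area:

    \Delta G_{\text{bind}} = \langle G_{\text{complex}} \rangle - \langle G_{\text{receptor}} \rangle - \langle G_{\text{ligand}} \rangle ,
    \qquad
    G = E_{\text{MM}} + G_{\text{PB}} + \gamma\,\text{SASA} + b - T S_{\text{conf}} ,

where $E_{\text{MM}}$ collects the molecular-mechanics bonded, van der Waals and electrostatic terms, $G_{\text{PB}}$ is the polar solvation free energy from the Poisson-Boltzmann equation, and $-T S_{\text{conf}}$ is the (optional) configurational entropy contribution.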
Geometric morphometrics and virtual anthropology: advances in human evolutionary studies.
Rein, Thomas R; Harvati, Katerina
2014-01-01
Geometric morphometric methods have been increasingly used in paleoanthropology in the last two decades, lending greater power to the analysis and interpretation of the human fossil record. More recently the advent of the wide use of computed tomography and surface scanning, implemented in combination with geometric morphometrics (GM), characterizes a new approach, termed Virtual Anthropology (VA). These methodological advances have led to a number of developments in human evolutionary studies. We present some recent examples of GM and VA related research in human evolution with an emphasis on work conducted at the University of Tübingen and other German research institutions.
A transonic-small-disturbance wing design methodology
NASA Technical Reports Server (NTRS)
Phillips, Pamela S.; Waggoner, Edgar G.; Campbell, Richard L.
1988-01-01
An automated transonic design code has been developed which modifies an initial airfoil or wing in order to generate a specified pressure distribution. The design method uses an iterative approach that alternates between a potential-flow analysis and a design algorithm that relates changes in surface pressure to changes in geometry. The analysis code solves an extended small-disturbance potential-flow equation and can model a fuselage, pylons, nacelles, and a winglet in addition to the wing. A two-dimensional option is available for airfoil analysis and design. Several two- and three-dimensional test cases illustrate the capabilities of the design code.
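The alternating analysis/design loop described above can be caricatured in a few lines of Python. The "analysis" below is a crude thin-airfoil-like surrogate, not the extended small-disturbance solver, and the target pressure distribution is hypothetical; the sketch only illustrates how a pressure mismatch is folded back into a geometry correction until the loop converges:

    import numpy as np

    x = np.linspace(0.0, 1.0, 101)
    dx = x[1] - x[0]

    def analyze(surface):
        """Stand-in for the potential-flow analysis: Cp ~ -2 * d(surface)/dx."""
        return -2.0 * np.gradient(surface, dx)

    target_cp = -0.4 * np.sin(np.pi * x)      # hypothetical target pressure distribution
    surface = np.zeros_like(x)                # flat starting geometry

    for it in range(50):                      # alternate analysis and design steps
        cp = analyze(surface)
        # Design step: translate the pressure mismatch into a surface-slope correction
        # and integrate it into a (relaxed) geometry update.
        dslope = -(target_cp - cp) / 2.0
        surface += 0.5 * np.cumsum(dslope) * dx
        if np.max(np.abs(target_cp - cp)) < 1e-3:
            break

    print("iterations:", it + 1,
          "max Cp error:", np.max(np.abs(target_cp - analyze(surface))))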
DOE Office of Scientific and Technical Information (OSTI.GOV)
Estevez, Ivan; Concept Scientific Instruments, ZA de Courtaboeuf, 2 rue de la Terre de Feu, 91940 Les Ulis; Chrétien, Pascal
2014-02-24
On the basis of a home-made nanoscale impedance measurement device associated with a commercial atomic force microscope, a specific operating procedure is proposed in order to improve absolute (in the sense of "non-relative") capacitance imaging by drastically reducing the parasitic effects due to stray capacitance, surface topography, and sample tilt. The method, combining a two-pass image acquisition with the exploitation of approach curves, has been validated on sets of calibration samples consisting of square parallel-plate capacitors for which theoretical capacitance values were numerically calculated.
Land cover change mapping using MODIS time series to improve emissions inventories
NASA Astrophysics Data System (ADS)
López-Saldaña, Gerardo; Quaife, Tristan; Clifford, Debbie
2016-04-01
MELODIES is an FP7-funded project to develop innovative and sustainable services, based upon Open Data, for users in research, government, industry and the general public in a broad range of societal and environmental benefit areas. Understanding and quantifying land surface changes is necessary for estimating greenhouse gas and ammonia emissions, and for meeting air quality limits and targets. More sophisticated inventory methodologies, at least for key emission sources, are needed due to policy-driven air quality directives. Quantifying land cover changes on an annual basis requires greater spatial and temporal disaggregation of input data. The main aim of this study is to develop a methodology for using Earth Observations (EO) to identify annual land surface changes that will improve emissions inventories from agriculture and land use/land use change and forestry (LULUCF) in the UK. The first goal is to find the best sets of input features that accurately describe the surface dynamics. In order to identify annual and inter-annual land surface changes, a time series of surface reflectance was used to capture seasonal variability. Daily surface reflectance images from the Moderate Resolution Imaging Spectroradiometer (MODIS) at 500 m resolution were used to invert a Bidirectional Reflectance Distribution Function (BRDF) model to create the seamless time series. Given the limited number of cloud-free observations, a BRDF climatology was used to constrain the model inversion and, where no high-quality observations were available at all, as a gap filler. The Land Cover Map 2007 (LC2007) produced by the Centre for Ecology & Hydrology (CEH) was used for training and testing purposes. A land cover product was created for 2003 to 2015 and a Bayesian approach was developed to identify land cover changes. We present the results of the time series development and the first exercises in creating the land cover and land cover change products.
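The climatology-constrained BRDF inversion described above amounts to a regularized linear least-squares problem; the sketch below is illustrative only, with hypothetical kernel values, reflectances, prior weights, and regularization strength rather than actual MODIS inputs.

```python
# Minimal sketch of a kernel-driven BRDF inversion with a climatology prior
# (illustrative only; the kernel matrix, reflectances, and prior below are
# hypothetical placeholders, not actual MODIS Ross-Li kernel values).
import numpy as np

def invert_brdf(K, rho, f_prior, lam=0.1):
    """Regularized least squares: minimize ||K f - rho||^2 + lam ||f - f_prior||^2.
    K       : (n_obs, 3) kernel matrix [1, K_vol, K_geo] per observation
    rho     : (n_obs,)   observed surface reflectances
    f_prior : (3,)       climatology (prior) kernel weights"""
    A = K.T @ K + lam * np.eye(K.shape[1])
    b = K.T @ rho + lam * f_prior
    return np.linalg.solve(A, b)

# Hypothetical observations for one pixel/band over a compositing window
K = np.array([[1.0, -0.02, -1.10],
              [1.0,  0.05, -0.95],
              [1.0,  0.11, -0.80],
              [1.0,  0.03, -1.02]])
rho = np.array([0.045, 0.052, 0.058, 0.049])
f_prior = np.array([0.05, 0.02, 0.01])     # climatological iso/vol/geo weights

f = invert_brdf(K, rho, f_prior, lam=0.5)
print("retrieved kernel weights:", f)      # with no valid observations, f -> prior (gap filling)
```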
Mogaji, Kehinde Anthony; Lim, Hwee San
2017-07-01
This study integrates the application of Dempster-Shafer-driven evidential belief function (DS-EBF) methodology with remote sensing and geographic information system techniques to analyze surface and subsurface data sets for the spatial prediction of groundwater potential in Perak Province, Malaysia. The study used additional data obtained from the records of the groundwater yield rate of approximately 28 bore well locations. The processed surface and subsurface data produced sets of groundwater potential conditioning factors (GPCFs) from which multiple surface hydrologic and subsurface hydrogeologic parameter thematic maps were generated. The bore well location inventories were partitioned randomly into a ratio of 70% (19 wells) for model training to 30% (9 wells) for model testing. Application of the DS-EBF relationship model algorithms to the surface- and subsurface-based GPCF thematic maps and the bore well locations produced two groundwater potential prediction (GPP) maps based on surface hydrologic and subsurface hydrogeologic characteristics, which established that more than 60% of the study area falls within the moderate-high groundwater potential zones and less than 35% falls within the low potential zones. The estimated uncertainty values, within the range of 0 to 17% for the predicted potential zones, were quantified using the uncertainty algorithm of the model. The validation results of the GPP maps using the relative operating characteristic curve method yielded 80 and 68% success rates and 89 and 53% prediction rates for the subsurface hydrogeologic factor (SUHF)- and surface hydrologic factor (SHF)-based GPP maps, respectively. The study results revealed that the SUHF-based GPP map delineated groundwater potential zones more accurately than the SHF-based GPP map. However, significant information on the low degree of uncertainty of the predicted potential zones established the suitability of the two GPP maps for future development of groundwater resources in the area. The overall results proved the efficacy of the data mining model and the geospatial technology in groundwater potential mapping.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
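A minimal sketch of the underlying idea, propagating parameter and modeling-accuracy uncertainties through an engineering fatigue model by Monte Carlo, is given below; the Basquin-type model and all distributions are hypothetical and this is not the documented PFA software.

```python
# Minimal sketch of the probabilistic-failure-assessment idea (illustrative
# only): propagate uncertain inputs of a simple fatigue model through Monte
# Carlo sampling to estimate a failure probability over a specified life.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Hypothetical uncertain inputs of a Basquin-type fatigue model N = C * S**(-m)
log10_C = rng.normal(14.0, 0.3, n)                # material/model scatter
m = rng.normal(3.0, 0.1, n)                       # S-N exponent uncertainty
stress = rng.lognormal(np.log(180.0), 0.08, n)    # stress amplitude (MPa)
model_error = rng.lognormal(0.0, 0.2, n)          # modeling-accuracy factor

cycles_to_failure = model_error * 10.0**log10_C * stress**(-m)
service_cycles = 1.0e6                            # assumed service life (cycles)

p_fail = np.mean(cycles_to_failure < service_cycles)
print(f"estimated failure probability over service life: {p_fail:.2e}")
```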
ERIC Educational Resources Information Center
Lauckner, Heidi; Paterson, Margo; Krupa, Terry
2012-01-01
Often, research projects are presented as final products with the methodologies cleanly outlined and little attention paid to the decision-making processes that led to the chosen approach. Limited attention paid to these decision-making processes perpetuates a sense of mystery about qualitative approaches, particularly for new researchers who will…
ERIC Educational Resources Information Center
Metcalf, Heather
2016-01-01
This research methods Essay details the usefulness of critical theoretical frameworks and critical mixed-methodological approaches for life sciences education research on broadening participation in the life sciences. First, I draw on multidisciplinary research to discuss critical theory and methodologies. Then, I demonstrate the benefits of these…
Paradigms, pragmatism and possibilities: mixed-methods research in speech and language therapy.
Glogowska, Margaret
2011-01-01
After the decades of the so-called 'paradigm wars' in social science research methodology and the controversy about the relative place and value of quantitative and qualitative research methodologies, 'paradigm peace' appears to have now been declared. This has come about as many researchers have begun to take a 'pragmatic' approach in the selection of research methodology, choosing the methodology best suited to answering the research question rather than conforming to a methodological orthodoxy. With the differences in the philosophical underpinnings of the two traditions set to one side, an increasing awareness, and valuing, of the 'mixed-methods' approach to research is now present in the fields of social, educational and health research. This paper explores what is meant by mixed-methods research and the ways in which quantitative and qualitative methodologies and methods can be combined and integrated, particularly in the broad field of health services research and the narrower one of speech and language therapy. The paper discusses the ways in which methodological approaches have already been combined and integrated in health services research and speech and language therapy, highlighting the suitability of mixed-methods research for answering the typically multifaceted questions arising from the provision of complex interventions. The challenges of combining and integrating quantitative and qualitative methods and the barriers to the adoption of mixed-methods approaches are also considered. The questions about healthcare, as it is being provided in the 21st century, call for a range of methodological approaches. This is particularly the case for human communication and its disorders, where mixed-methods research offers a wealth of possibilities. In turn, speech and language therapy research should be able to contribute substantively to the future development of mixed-methods research. © 2010 Royal College of Speech & Language Therapists.
Shi, Yajuan; Wang, Ruoshi; Lu, Yonglong; Song, Shuai; Johnson, Andrew C; Sweetman, Andrew; Jones, Kevin
2016-09-01
Ecological risk assessment (ERA) has been widely applied in characterizing the risk of chemicals to organisms and ecosystems. The paucity of toxicity data on local biota living in the different compartments of an ecosystem and the absence of a suitable methodology for multi-compartment spatial risk assessment at the regional scale have held back this field. The major objective of this study was to develop a methodology to quantify and distinguish the spatial distribution of risk to ecosystems at a regional scale. A framework for regional multi-compartment probabilistic ecological risk assessment (RMPERA) was constructed and corroborated using a bioassay of a local species. The risks from cadmium (Cd) pollution in river water, river sediment, coastal water, coastal surface sediment and soil in the northern Bohai Rim were examined. The results indicated that the local organisms in soil, river, coastal water, and coastal sediment were affected by Cd. The greatest impacts from Cd were identified in the Tianjin and Huludao areas. The overall multi-compartment risk was 31.4% in the region. The methodology provides a new approach for regional multi-compartment ecological risk assessment. Copyright © 2016 Elsevier Ltd. All rights reserved.
Kagkadis, K A; Rekkas, D M; Dallas, P P; Choulis, N H
1996-01-01
In this study a complex of ibuprofen and β-hydroxypropylcyclodextrin was prepared employing a freeze-drying method. The production parameters and the final specifications of this product were optimized by using response surface methodology. The results show that the freeze-dried complex meets the requirements for solubility to be considered as a possible injectable form.
Taheri-Garavand, Amin; Karimi, Fatemeh; Karimi, Mahmoud; Lotfi, Valiullah; Khoobbakht, Golmohammad
2018-06-01
The aim of the study is to fit models for predicting response surfaces, using the response surface methodology and an artificial neural network, and to find the conditions of maximum acceptability using the desirability function methodology in a hot-air drying process of banana slices. The drying air temperature, air velocity, and drying time were chosen as independent factors, and moisture content, drying rate, energy efficiency, and exergy efficiency were the dependent variables, or responses, in the drying process. A rotatable central composite design was used as an adequate method to develop models for the responses in the response surface methodology. Moreover, isoresponse contour plots were useful for predicting the results by performing only a limited set of experiments. The optimum operating conditions obtained from the artificial neural network models were moisture content 0.14 g/g, drying rate 1.03 g water/g h, energy efficiency 0.61, and exergy efficiency 0.91, when the air temperature, air velocity, and drying time values were equal to -0.42 (74.2 ℃), 1.00 (1.50 m/s), and -0.17 (2.50 h) in coded units, respectively.
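A minimal sketch of the general workflow, fitting a second-order response surface in coded units and maximizing a simple one-sided desirability, is shown below; the design points and responses are synthetic stand-ins, not the banana-drying data.

```python
# Minimal sketch of fitting a second-order response surface in coded units and
# maximizing a simple desirability over a grid (illustrative only; the design
# points and responses below are synthetic).
import numpy as np
from itertools import combinations, product

def quad_terms(X):
    """Full second-order model columns: 1, x_i, x_i*x_j (i<j), x_i^2."""
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]
    cols += [X[:, i] ** 2 for i in range(k)]
    return np.column_stack(cols)

rng = np.random.default_rng(1)
X = rng.uniform(-1.68, 1.68, size=(20, 3))          # stand-in for CCD points
true = lambda x: 0.9 - 0.1 * (x[:, 0] - 0.3) ** 2 - 0.05 * x[:, 1] ** 2 + 0.02 * x[:, 2]
y = true(X) + rng.normal(0, 0.01, len(X))            # synthetic response values

beta, *_ = np.linalg.lstsq(quad_terms(X), y, rcond=None)
predict = lambda Xnew: quad_terms(Xnew) @ beta

def desirability(yhat, y_lo=0.7, y_hi=0.95):
    """One-sided desirability: larger is better, scaled between y_lo and y_hi."""
    return np.clip((yhat - y_lo) / (y_hi - y_lo), 0.0, 1.0)

grid = np.array(list(product(*[np.linspace(-1, 1, 21)] * 3)))
d = desirability(predict(grid))
best = grid[np.argmax(d)]
print("optimum (coded units):", best, "predicted response:", predict(best[None, :])[0])
```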
Zhou, Ying; Zhu, Nengwu; Kang, Naixin; Cao, Yanlan; Shi, Chaohong; Wu, Pingxiao; Dang, Zhi; Zhang, Xiaoping; Qin, Benqian
2017-02-01
Enhancement of the biosorption capacity for gold is highly desirable for the biorecovery of secondary gold resources. In this study, polyethylenimine (PEI) was grafted on the Shewanella haliotis surface through a layer-by-layer assembly approach so as to improve the biosorption capacity for Au(III). Results showed that the relative contribution of the amino group to the biosorption of Au(III) was the largest (about 44%). After successfully grafting 1-, 2- and 3-layer PEI on the surface of the biomass, the biosorption capacity was significantly enhanced from 143.8 mg/g to 597.1, 559.1, and 536.8 mg/g, respectively. Interestingly, the biomass modified with 1-layer PEI exhibited 4.2 times higher biosorption capacity than the untreated control. When the 1-layer-modified biomass was subjected to optimization of the various conditions by response surface methodology, the theoretical maximum adsorption capacity could reach up to 727.3 mg/g. All findings demonstrated that PEI-modified S. haliotis was effective for enhancing gold biorecovery. Copyright © 2016 Elsevier Ltd. All rights reserved.
Logan, B.L.; McDonald, R.R.; Nelson, J.M.; Kinzel, P.J.; Barton, G.J.
2011-01-01
River channel construction projects aimed at restoring or improving degraded waterways have become common but have been variously successful. In this report a methodology is proposed to evaluate channel designs before channels are built by using multidimensional modeling and analysis. This approach allows detailed analysis of water-surface profiles, sediment transport, and aquatic habitat that may result if the design is implemented. The method presented here addresses the need to model a range of potential stream-discharge and channel-roughness conditions to best assess the function of the design channel for a suite of possible conditions. This methodology is demonstrated by using a preliminary channel-restoration design proposed for a part of the Kootenai River in northern Idaho designated as critical habitat for the endangered white sturgeon (Acipenser transmontanus) and evaluating the design on the basis of simulations with the Flow and Sediment Transport with Morphologic Evolution of Channels (FaSTMECH) model. This evaluation indicated substantial problems with the preliminary design because boundary conditions used in the design were inconsistent with best estimates of future conditions. As a result, simulated water-surface levels did not meet target levels that corresponded to the designed bankfull surfaces; therefore, the flood plain would not function as intended. Sediment-transport analyses indicated that both the current channel of the Kootenai River and the design channel are largely unable to move the bed material through the reach at bankfull discharge. Therefore, sediment delivered to the design channel would likely be deposited within the reach instead of passing through it as planned. Consequently, the design channel geometry would adjust through time. Despite these issues, the design channel would provide more aquatic habitat suitable for spawning white sturgeon (Acipenser transmontanus) at lower discharges than is currently available in the Kootenai River. The evaluation methodology identified potential problems with the design channel that can be addressed through design modifications to better meet project objectives before channel construction.
Amaral, Larissa S; Azevedo, Eduardo B; Perussi, Janice R
2018-06-01
Antimicrobial photodynamic inactivation (a-PDI) is based on the oxidative destruction of biological molecules by reactive oxygen species generated by the photo-excitation of a photosensitive molecule. When a-PDI is combined with mathematical models, the optimal conditions for maximum inactivation can be found. The search is usually made using a univariate approach, which demands a large number of experiments and is time and money consuming. This paper presents the use of the response surface methodology for improving the search for the best conditions to reduce E. coli survival levels by a-PDI using methylene blue (MB) and toluidine blue (TB) as photosensitizers and white light. The goal was achieved by analyzing the effects and interactions of the three main parameters involved in the process: incubation time (IT), photosensitizer concentration (C_PS), and light dose (LD). The optimization procedure began with a full 2³ factorial design, followed by a central composite one, in which the optimal conditions were estimated. For MB, C_PS was the most important parameter followed by LD and IT, whereas for TB the main parameter was LD followed by C_PS and IT. Using the estimated optimal conditions for inactivation, MB was able to inactivate 99.999999% of E. coli CFU mL⁻¹ with an IT of 28 min, an LD of 31 J cm⁻², and a C_PS of 32 μmol L⁻¹, while TB required 18 min, 39 J cm⁻², and 37 μmol L⁻¹. The feasibility of using the response surface methodology with a-PDI was demonstrated, enabling enhanced photoinactivation efficiency and fast results with a minimal number of experiments. Copyright © 2018 Elsevier B.V. All rights reserved.
Navarrete-Bolaños, J L; Téllez-Martínez, M G; Miranda-López, R; Jiménez-Islas, H
2017-07-03
For any fermentation process, the production cost depends on several factors, such as the genetics of the microorganism, the process conditions, and the culture medium composition. In this work, a guideline for the design of cost-efficient culture media using a sequential approach based on response surface methodology is described. The procedure was applied to analyze and optimize a registered-trademark culture medium and a base culture medium obtained as a result of a screening analysis of different culture media reported in the literature for growing the same strain. During the experiments, the procedure quantitatively identified an appropriate array of micronutrients to obtain a significant yield and found a minimum number of culture medium ingredients without limiting the process efficiency. The resultant culture medium showed an efficiency that compares favorably with the registered-trademark medium at a 95% lower cost, and reduced the number of ingredients in the base culture medium by 60% without limiting the process efficiency. These results demonstrated that, aside from satisfying the qualitative requirements, an optimum quantity of each constituent is needed to obtain a cost-effective culture medium. Studying the process variables for the optimized culture medium and scaling up production at the optimal values are desirable next steps.
Abidi, Mustufa Haider; Al-Ahmari, Abdulrahman; Ahmad, Ali
2018-01-01
Advanced graphics capabilities have enabled the use of virtual reality as an efficient design technique. The integration of virtual reality in the design phase still faces impediments because of issues linked to the integration of CAD and virtual reality software. A set of empirical tests using the selected conversion parameters was found to yield properly represented virtual reality models. The reduced model yields an R-sq (pred) value of 72.71% and an R-sq (adjusted) value of 86.64%, indicating that 86.64% of the response variability can be explained by the model. The R-sq (pred) of 67.45% is not very high, indicating that the model should be further reduced by eliminating insignificant terms. The further reduced model yields an R-sq (pred) value of 73.32% and an R-sq (adjusted) value of 79.49%, indicating that 79.49% of the response variability can be explained by the model. Using the optimization software MODE Frontier (Optimization, MOGA-II, 2014), four types of response surfaces for the three considered response variables were tested against the DOE data. The parameter values obtained using the proposed experimental design methodology result in better graphics quality and the other necessary design attributes.
Dobrovolsky, Vasily N; Revollo, Javier; Petibone, Dayton M; Heflich, Robert H
2017-01-01
The Pig-a assay is being developed as an in vivo gene mutation assay for regulatory safety assessments. The assay is based on detecting mutation in the endogenous Pig-a gene of treated rats by using flow cytometry to measure changes in cell surface markers of peripheral blood cells. Here we present a methodology for demonstrating that phenotypically mutant rat T-cells identified by flow cytometry contain mutations in the Pig-a gene, an important step for validating the assay. In our approach, the mutant-phenotype T-cells are sorted into individual wells of 96-well plates and expanded into clones. Subsequent sequencing of genomic DNA from the expanded clones confirms that the Pig-a assay detects exactly what it claims to detect: cells with mutations in the endogenous Pig-a gene. In addition, determining the spectra of Pig-a mutations provides information for better understanding the mutational mechanism of compounds of interest. Our methodology of combining phenotypic antibody labeling, magnetic enrichment, sorting, and single-cell clonal expansion can be used in genotoxicity/mutagenicity studies and in other general immunotoxicology research requiring identification, isolation, and expansion of extremely rare subpopulations of T-cells.
NASA Astrophysics Data System (ADS)
Ghaedi, M.; Khafri, H. Zare; Asfaram, A.; Goudarzi, A.
2016-01-01
The adsorption of Janus Green B (JGB) onto homemade ZnO/Zn(OH)2 nanoparticles loaded on activated carbon (AC), characterized by FESEM and XRD analysis, is reported. A combination of response surface methodology (RSM) and central composite design (CCD) was employed to model and optimize the variables using STATISTICA 10.0 software. The influence of the parameters pH (2.0-8.0), adsorbent mass (0.004-0.012 g), sonication time (4-8 min) and JGB concentration (3-21 mg L⁻¹) on the JGB removal percentage was investigated, and their main and interaction contributions were examined. It was revealed that 21 mg L⁻¹ JGB and 0.012 g ZnO/Zn(OH)2-NP-AC at pH 7.0 and 7 min sonication time permit a removal percentage of more than 99%. Finally, a good agreement between experimental and predicted values after 7 min was achieved using a pseudo-second-order rate equation. The Langmuir isotherm is appropriate for correlation of the equilibrium data. A small amount of adsorbent (0.008-0.015 g) is applicable for the successful removal of JGB (RE > 99%) in a short time (7 min) with high adsorption capacity (81.3-98.03 mg g⁻¹).
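The pseudo-second-order and Langmuir fits mentioned above can be sketched as follows; all concentration, uptake, and time values are hypothetical placeholders rather than the reported JGB measurements.

```python
# Minimal sketch of fitting the Langmuir isotherm and pseudo-second-order
# kinetics (illustrative only; all data points are hypothetical).
import numpy as np
from scipy.optimize import curve_fit

def langmuir(Ce, qmax, KL):
    """Equilibrium uptake qe as a function of equilibrium concentration Ce."""
    return qmax * KL * Ce / (1.0 + KL * Ce)

def pso(t, qe, k2):
    """Pseudo-second-order kinetics: q(t) = k2*qe^2*t / (1 + k2*qe*t)."""
    return k2 * qe**2 * t / (1.0 + k2 * qe * t)

# Hypothetical equilibrium data (Ce in mg/L, qe in mg/g)
Ce = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 15.0])
qe = np.array([35.0, 55.0, 72.0, 85.0, 93.0, 97.0])
(qmax, KL), _ = curve_fit(langmuir, Ce, qe, p0=[100.0, 1.0])

# Hypothetical kinetic data (t in min, q in mg/g)
t = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 7.0])
qt = np.array([40.0, 62.0, 74.0, 81.0, 85.0, 90.0])
(qe_fit, k2), _ = curve_fit(pso, t, qt, p0=[95.0, 0.01])

print(f"Langmuir: qmax={qmax:.1f} mg/g, KL={KL:.2f} L/mg")
print(f"PSO: qe={qe_fit:.1f} mg/g, k2={k2:.4f} g/(mg*min)")
```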
Asati, Ankita; Satyanarayana, G N V; Panchal, Smita; Thakur, Ravindra Singh; Ansari, Nasreen G; Patel, Devendra K
2017-08-04
A sensitive, rapid and efficient ionic liquid-based vortex-assisted liquid-liquid microextraction (IL-VALLME) with liquid chromatography-tandem mass spectrometry (LC-MS/MS) method is proposed for the determination of bisphenols in thermal paper. Extraction factors were systematically optimized by response surface methodology. Experimental factors showing significant effects on the analytical responses were evaluated using design of experiments. The limits of detection for bisphenol A (BPA) and bisphenol S (BPS) in thermal paper were 1.25 and 0.93 μg kg⁻¹, respectively. The dynamic linearity range for BPA was between 4 and 100 μg kg⁻¹ and the coefficient of determination (R²) was 0.996. The values of the same parameters were 3-100 μg kg⁻¹ and 0.998 for BPS. The extraction recoveries of BPA and BPS in thermal paper were 101% and 99%. Percent relative standard deviations (% RSD) for matrix effect and matrix-match effects were not more than 10% for both bisphenols. The proposed method uses a statistical approach for the analysis of bisphenols in environmental samples, and is easy, rapid and efficient, requiring minimal organic solvent. Copyright © 2017 Elsevier B.V. All rights reserved.
Fully printable, strain-engineered electronic wrap for customizable soft electronics.
Byun, Junghwan; Lee, Byeongmoon; Oh, Eunho; Kim, Hyunjong; Kim, Sangwoo; Lee, Seunghwan; Hong, Yongtaek
2017-03-24
Rapid growth of stretchable electronics stimulates broad uses in multidisciplinary fields as well as industrial applications. However, existing technologies are unsuitable for implementing versatile applications involving adaptable system design and functions in a cost/time-effective way because of vacuum-conditioned, lithographically predefined processes. Here, we present a methodology for a fully printable, strain-engineered electronic wrap as a universal strategy which makes it more feasible to implement various stretchable electronic systems with customizable layouts and functions. The key aspects involve inkjet-printed rigid island (PRI)-based stretchable platform technology and a corresponding printing-based automated electronic functionalization methodology, the combination of which provides fully printed, customized layouts of stretchable electronic systems with a simplified process. Specifically, the well-controlled contact-line pinning effect of the printed polymer solution enables the formation of PRIs with tunable thickness, and surface strain analysis on those PRIs leads to optimized stability and device-to-island fill factor of the strain-engineered electronic wraps. Moreover, core techniques of image-based automated pinpointing, surface-mountable-device-based electronic functionalizing, and one-step interconnection networking of PRIs enable customized circuit design and adaptable functionalities. To exhibit the universality of our approach, multiple types of practical applications ranging from self-computable digital logics to display and sensor systems are demonstrated on skin in a customized form.
Roy, Sudipta; Halder, Suman Kumar; Banerjee, Debdulal
2016-01-01
Streptomyces thermoviolaceus NT1, an endophytic isolate, was studied for optimization of granaticinic acid production, an antimicrobial metabolite active even against drug-resistant bacteria. Different media, optimum glucose concentration, initial medium pH, incubation temperature, incubation period, and inoculum size were among the parameters optimized in the one-variable-at-a-time (OVAT) approach, where glucose concentration, pH, and temperature were found to play a critical role in antibiotic production by this strain. Finally, the Box–Behnken experimental design (BBD) was employed with the three key factors (selected after the OVAT studies) for response surface methodology (RSM) analysis of this optimization study. RSM analysis revealed a multifactorial combination of glucose 0.38%, pH 7.02, and temperature 36.53 °C as the optimum conditions for maximum antimicrobial yield. Experimental verification of the model analysis led to 3.30-fold enhanced granaticinic acid production (61.35 mg/L compared to 18.64 mg/L produced under un-optimized conditions) in ISP2 medium with 5% inoculum and a suitable incubation period of 10 days. The conjugated optimization study for maximum antibiotic production from Streptomyces thermoviolaceus NT1 thus resulted in significantly higher yield, which might be exploited in industrial applications. PMID:28952581
Mo, Yu; Zhao, Lei; Wang, Zhonghui; Chen, Chia-Lung; Tan, Giin-Yu Amy; Wang, Jing-Yuan
2014-04-01
Response surface methodology coupled with Box-Behnken design (RSM-BBD) was applied to enhance styrene recovery from waste polystyrene (WPS) through pyrolysis. The relationship between styrene yield and three selected operating parameters (i.e., temperature, heating rate, and carrier gas flow rate) was investigated. A second-order polynomial equation was successfully built to describe the process and predict the styrene yield under the study conditions. The factors identified as statistically significant to styrene production were: temperature, with a quadratic effect; heating rate, with a linear effect; carrier gas flow rate, with a quadratic effect; the interaction between temperature and carrier gas flow rate; and the interaction between heating rate and carrier gas flow rate. The optimum conditions for the current system were determined to be a temperature range of 470-505 °C, a heating rate of 40 °C/min, and a carrier gas flow rate range of 115-140 mL/min. Under such conditions, 64.52% of the WPS was recovered as styrene, which was 12% more than the highest reported yield for reactors of similar size. It is concluded that RSM-BBD is an effective approach for yield optimization of styrene recovery from WPS pyrolysis. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Erduran, Sibel; Simon, Shirley; Osborne, Jonathan
2004-11-01
This paper reports some methodological approaches to the analysis of argumentation discourse developed as part of the two-and-a-half-year project titled "Enhancing the Quality of Argument in School Science", supported by the Economic and Social Research Council in the United Kingdom. In this project researchers collaborated with middle-school science teachers to develop models of instructional activities in an effort to make argumentation a component of instruction. We begin the paper with a brief theoretical justification for why we consider argumentation to be of significance to science education. We then contextualize the use of Toulmin's Argument Pattern in the study of argumentation discourse and provide a justification for the methodological outcomes our approach generates. We illustrate how our work refines and develops research methodologies in argumentation analysis. In particular, we present two methodological approaches to the analysis of argumentation in whole-class as well as small-group student discussions. For each approach, we illustrate our coding scheme and some results, as well as how our methodological approach has enabled our inquiry into the quality of argumentation in the classroom. We conclude with some implications for future research in argumentation in science education.
Methodology for Estimating Total Automotive Manufacturing Costs
DOT National Transportation Integrated Search
1983-04-01
A number of methodologies for estimating manufacturing costs have been developed. This report discusses the different approaches and shows that an approach to estimating manufacturing costs in the automobile industry based on surrogate plants is pref...
External Validity in the Study of Human Development: Theoretical and Methodological Issues
ERIC Educational Resources Information Center
Hultsch, David F.; Hickey, Tom
1978-01-01
An examination of the concept of external validity from two theoretical perspectives: a traditional mechanistic approach and a dialectical organismic approach. Examines the theoretical and methodological implications of these perspectives. (BD)
Classification of boreal forest by satellite and inventory data using neural network approach
NASA Astrophysics Data System (ADS)
Romanov, A. A.
2012-12-01
The main objective of this research was to develop a methodology for boreal (Siberian taiga) land cover classification at a high accuracy level. The study area covers several parts of Central Siberia along the Yenisei River (60-62 degrees north latitude): the right bank includes mixed forest and dark taiga, the left bank pine forests; these were taken as highly heterogeneous but statistically comparable surfaces in terms of spectral characteristics. Two main types of data were used: time series of medium-spatial-resolution satellite images (Landsat 5, 7 and SPOT 4) and inventory datasets from field campaigns (used for preparation of training samples). The collection of field datasets included a short botanical description (type/species of vegetation, density, compactness of the crowns, individual height and max/min diameters representative of each type, surface altitude of the plot); the geometric extent of each training sample unit corresponded to the spatial resolution of the satellite images and was geo-referenced (datasets were prepared both for preliminary processing and for verification). The network of test plots was planned as irregular and determined by a landscape-oriented approach. The main focus of the thematic data processing was the use of neural networks (including fuzzy logic); therefore, the results of the field studies were converted into an input parameter describing the type/species of vegetation cover of each unit and its degree of variability. The proposed approach processes the time series separately for each image, mainly for verification: acquisition parameters (time, albedo) are taken into consideration and used to assess the quality of the mapping. The input variables for the networks were the sensor bands, surface altitude, solar angles and land surface temperature (for a few experiments); attention was also given to forming the class formula on the basis of statistical pre-processing of the field research results (prevalence type). In addition, several statistical supervised classification methods were used (minimum distance, maximum likelihood, Mahalanobis distance). The study produced various types of neural classifiers suitable for the mapping, and even for highly heterogeneous areas the neural network approach showed better precision despite the validity of the Gaussian-distribution assumption underlying the statistical methods (Table: comparison of classification accuracy). The experimentally chosen optimum network structure consists of three layers of ten neurons each, although such a configuration requires larger computational resources than the statistical methods presented above; the number of iterations in the network learning process must be increased to minimize RMS error. It should also be emphasized that a key issue in estimating the accuracy of the classification results is the incompleteness of the training sets; this is especially true for summer imagery of mixed forest. Nevertheless, the proposed methodology appears suitable also for measuring the local dynamics of the boreal land surface by vegetation type.
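A minimal sketch of the three-layer, ten-neuron-per-layer classifier mentioned above is given below; the feature vectors are synthetic stand-ins for the band reflectances, altitude, solar angles, and land surface temperature used in the study.

```python
# Minimal sketch of a three-layer, ten-neuron-per-layer network for land cover
# classification (illustrative only; features and class centroids are synthetic).
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_per_class, n_features = 200, 9
classes = ["dark taiga", "mixed forest", "pine forest"]

# Synthetic training samples: each class drawn around its own spectral centroid
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(n_per_class, n_features))
               for c in (0.0, 1.0, 2.0)])
y = np.repeat(classes, n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(10, 10, 10), max_iter=2000,
                    random_state=0).fit(X_tr, y_tr)
print("overall accuracy on held-out plots:", clf.score(X_te, y_te))
```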
Tomkins, Matthew Robert; Liao, David Shiqi; Docoslis, Aristides
2015-01-08
A detection method that combines electric field-assisted virus capture on antibody-decorated surfaces with the "fingerprinting" capabilities of micro-Raman spectroscopy is demonstrated for the case of M13 virus in water. The proof-of-principle surface mapping of model bioparticles (protein coated polystyrene spheres) captured by an AC electric field between planar microelectrodes is presented with a methodology for analyzing the resulting spectra by comparing relative peak intensities. The same principle is applied to dielectrophoretically captured M13 phage particles whose presence is indirectly confirmed with micro-Raman spectroscopy using NeutrAvidin-Cy3 as a labeling molecule. It is concluded that the combination of electrokinetically driven virus sampling and micro-Raman based signal transduction provides a promising approach for time-efficient and in situ detection of viruses.
Computed myography: three-dimensional reconstruction of motor functions from surface EMG data
NASA Astrophysics Data System (ADS)
van den Doel, Kees; Ascher, Uri M.; Pai, Dinesh K.
2008-12-01
We describe a methodology called computed myography to qualitatively and quantitatively determine the activation level of individual muscles by voltage measurements from an array of voltage sensors on the skin surface. A finite element model for electrostatics simulation is constructed from morphometric data. For the inverse problem, we utilize a generalized Tikhonov regularization. This imposes smoothness on the reconstructed sources inside the muscles and suppresses sources outside the muscles using a penalty term. Results from experiments with simulated and human data are presented for activation reconstructions of three muscles in the upper arm (biceps brachii, brachialis and triceps). This approach potentially offers a new clinical tool to sensitively assess muscle function in patients suffering from neurological disorders (e.g., spinal cord injury), and could more accurately guide advances in the evaluation of specific rehabilitation training regimens.
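The generalized Tikhonov step can be sketched as a regularized least-squares solve; in the illustration below the lead-field matrix, smoothing operator, and data are synthetic placeholders, not the finite element model of the paper.

```python
# Minimal sketch of a generalized Tikhonov reconstruction (illustrative only;
# the matrix A, smoothing operator L, and data b are synthetic placeholders).
import numpy as np

def tikhonov(A, b, L, lam):
    """Solve min ||A x - b||^2 + lam * ||L x||^2 via the normal equations."""
    return np.linalg.solve(A.T @ A + lam * (L.T @ L), A.T @ b)

rng = np.random.default_rng(3)
n_sensors, n_sources = 32, 120
A = rng.normal(size=(n_sensors, n_sources))   # surrogate forward (lead-field) matrix

# First-difference operator as a simple smoothness penalty on the sources
L = np.eye(n_sources) - np.eye(n_sources, k=1)

x_true = np.zeros(n_sources)
x_true[40:60] = 1.0                            # one "active muscle" region
b = A @ x_true + rng.normal(scale=0.05, size=n_sensors)

x_hat = tikhonov(A, b, L, lam=1.0)
print("mean reconstructed activation in active region:", x_hat[40:60].mean().round(3))
```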
Passive bottom reflection-loss estimation using ship noise and a vertical line array.
Muzi, Lanfranco; Siderius, Martin; Verlinden, Christopher M
2017-06-01
An existing technique for passive bottom-loss estimation from natural marine surface noise (generated by waves and wind) is adapted to use noise generated by ships. The original approach, based on beamforming of the noise field recorded by a vertical line array of hydrophones, is retained; however, additional processing is needed in order for the field generated by a passing ship to show features that are similar to those of the natural surface-noise field. A necessary requisite is that the ship position, relative to the array, varies over as wide a range of steering angles as possible, ideally passing directly over the array to ensure coverage of the steepest angles. The methodology is illustrated through simulation and applied to data from a field experiment conducted offshore of San Diego, CA in 2009.
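The core beamforming operation on a vertical line array can be sketched as follows; the array geometry, sound speed, and simulated plane-wave snapshot are hypothetical, and the subsequent step of comparing upward- and downward-steered beam power to obtain the bottom-loss estimate is not shown.

```python
# Minimal sketch of conventional (delay-and-sum) beamforming on a vertical line
# array (illustrative only; geometry, frequency, and signals are hypothetical).
import numpy as np

c, f = 1500.0, 500.0                      # sound speed (m/s), frequency (Hz)
n_el, spacing = 16, 1.0                   # number of elements and spacing (m)
z = np.arange(n_el) * spacing             # element depths along the array

def steering_vector(theta_deg):
    """Narrowband steering vector for a plane wave at angle theta from broadside."""
    k = 2.0 * np.pi * f / c
    return np.exp(1j * k * z * np.sin(np.radians(theta_deg)))

# Simulate one snapshot: a plane wave from -20 degrees plus sensor noise
rng = np.random.default_rng(7)
x = steering_vector(-20.0) + 0.1 * (rng.normal(size=n_el) + 1j * rng.normal(size=n_el))

angles = np.linspace(-80, 80, 161)
beam_power = np.array([np.abs(np.vdot(steering_vector(a), x)) ** 2 / n_el ** 2
                       for a in angles])
print("peak steering angle (deg):", angles[np.argmax(beam_power)])
```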
Scaloni, A; Ferranti, P; De Simone, G; Mamone, G; Sannolo, N; Malorni, A
1999-06-11
An aspecific methylation reaction in combination with MS procedures has been employed for the characterization of the nucleophilic residues present on the molecular surface of the human 2,3-diphosphoglycerate/deoxy-hemoglobin complex. In particular, direct molecular weight determinations by ESMS allowed the reaction conditions to be controlled, limiting the number of methyl groups introduced into the modified globin chains. A combined LC-ESMS/Edman degradation approach for the analysis of the tryptic peptide mixtures yielded the exact identification of the methylation sites together with a quantitative estimation of their degree of modification. The reactivities observed were directly correlated with the pKa and the relative surface accessibility of the nucleophilic residues, calculated from the X-ray crystallographic structure of the protein. The results described here indicate that this methodology can be efficiently used in aspecific modification experiments directed at the molecular characterization of the surface topology of proteins and protein complexes.
Profiling charge complementarity and selectivity for binding at the protein surface.
Sulea, Traian; Purisima, Enrico O
2003-05-01
A novel analysis and representation of the protein surface in terms of electrostatic binding complementarity and selectivity is presented. The charge optimization methodology is applied in a probe-based approach that simulates the binding process to the target protein. The molecular surface is color coded according to calculated optimal charge or according to charge selectivity, i.e., the binding cost of deviating from the optimal charge. The optimal charge profile depends on both the protein shape and charge distribution whereas the charge selectivity profile depends only on protein shape. High selectivity is concentrated in well-shaped concave pockets, whereas solvent-exposed convex regions are not charge selective. This suggests the synergy of charge and shape selectivity hot spots toward molecular selection and recognition, as well as the asymmetry of charge selectivity at the binding interface of biomolecular systems. The charge complementarity and selectivity profiles map relevant electrostatic properties in a readily interpretable way and encode information that is quite different from that visualized in the standard electrostatic potential map of unbound proteins.
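Because the electrostatic binding energy is quadratic in a probe charge, the optimal charge and the cost of deviating from it follow directly from the quadratic coefficients; the sketch below is illustrative only, with hypothetical coefficients rather than output of a Poisson-Boltzmann solver.

```python
# Minimal sketch of the probe-charge optimization idea (illustrative only).
# For a single probe charge q the electrostatic binding energy is quadratic,
#   dG(q) = a*q**2 + b*q + c,
# so the optimal charge is q* = -b/(2a) and the curvature a sets the
# "selectivity" (the cost of deviating from q*).

def optimal_charge(a, b):
    return -b / (2.0 * a)

def deviation_cost(a, q, q_opt):
    return a * (q - q_opt) ** 2

# Hypothetical coefficients for two surface patches: a concave pocket
# (high curvature, selective) and a convex solvent-exposed patch (low curvature).
patches = {"concave pocket": (8.0, -4.0), "convex patch": (1.5, -0.6)}
for name, (a, b) in patches.items():
    q_opt = optimal_charge(a, b)
    print(f"{name}: q* = {q_opt:+.2f} e, "
          f"cost of using q=0: {deviation_cost(a, 0.0, q_opt):.2f} kcal/mol")
```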
NASA Astrophysics Data System (ADS)
Ayzenshtadt, A. M.; Frolova, M. A.; Makhova, T. A.; Danilov, V. E.; Gupta, Piyush K.; Verma, Rama S.
2018-01-01
Mineral samples of mixed-genesis rocks in a finely dispersed state were obtained and studied, namely a sand deposit (Kholmogory district) and basalt (Myandukha deposit, Plesetsk district) in the Arkhangelsk region. The paper provides the chemical composition data used to calculate the specific mass atomization energy of the rocks. The energy parameters of the micro- and nano-systems of the rock samples, free surface energy and surface activity, were calculated. For toxicological evaluation of the materials obtained, next-generation sequencing (NGS) was used to perform a metagenomic analysis, which allowed the species diversity of microorganisms in the samples under study to be determined. It was shown that the sequencing method and metagenomic analysis are applicable and provide good reproducibility for the analysis of the toxicological properties of the selected rock samples. A correlation between the surface activity of the finely dispersed rock systems and the species diversity of microorganisms cultivated on the raw material was observed.
Brosseau, Christa L; Gambardella, Alessa; Casadio, Francesca; Grzywacz, Cecily M; Wouters, Jan; Van Duyne, Richard P
2009-04-15
Tailored ad-hoc methods must be developed for successful identification of minute amounts of natural dyes on works of art using Surface-Enhanced Raman Spectroscopy (SERS). This article details two of these successful approaches using silver film over nanosphere (AgFON) substrates and silica gel coupled with citrate-reduced Ag colloids. The latter substrate functions as the test system for the coupling of thin-layer chromatography and SERS (TLC-SERS), which has been used in the current research to separate and characterize a mixture of several artists' dyes. The poor limit of detection of TLC is overcome by coupling with SERS, and dyes which co-elute to nearly the same spot can be distinguished from each other. In addition, in situ extractionless non-hydrolysis SERS was used to analyze dyed reference fibers, as well as historical textile fibers. Colorants such as alizarin, purpurin, carminic acid, lac dye, crocin, and Cape jasmine were thus successfully identified.
NASA Astrophysics Data System (ADS)
Boubakir, A.; Boudjema, F.; Boubakir, C.
2008-06-01
This paper proposes a hybrid control approach based on the concept of combining fuzzy logic and the methodology of sliding mode control (SMC). In the present work, a first-order nonlinear sliding surface is presented, on which the developed control law is based. Mathematical proof of the stability and convergence of the system is presented. In order to reduce the chattering in sliding mode control, a fixed boundary layer around the switching surface is used. Within the boundary layer, since fuzzy logic control is applied, the chattering phenomenon, which is inherent in sliding mode control, is avoided by smoothing the switching signal. Outside the boundary layer, sliding mode control is applied to drive the system states into the boundary layer. Experimental studies carried out on a coupled-tanks system indicate that the proposed fuzzy sliding mode control (FSMC) is a good candidate for control applications.
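A minimal sketch of a sliding-mode law with a boundary layer (a saturation function in place of the sign function) is given below; the plant is a generic first-order system rather than the coupled-tanks rig, and the fuzzy inference inside the boundary layer is not reproduced.

```python
# Minimal sketch of sliding-mode control with a boundary layer (illustrative
# only; the plant is a generic stable first-order system, not the real rig).
import numpy as np

def sat(s, phi):
    """Saturation replaces sign() inside the boundary layer of width phi."""
    return np.clip(s / phi, -1.0, 1.0)

def simulate(x0=0.0, x_ref=1.0, k=2.0, lam=3.0, phi=0.05, dt=0.001, T=5.0):
    x, xs = x0, []
    for _ in range(int(T / dt)):
        e = x - x_ref
        s = lam * e                        # first-order sliding variable
        u = -k * sat(s, phi)               # smoothed switching control
        x = x + dt * (-0.5 * x + u)        # simple plant: dx/dt = -0.5 x + u
        xs.append(x)
    return np.array(xs)

traj = simulate()
print("final state:", traj[-1].round(3), "(reference = 1.0)")
```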
Vasanth, Muthuraman; Muralidhar, Moturi; Saraswathy, Ramamoorthy; Nagavel, Arunachalam; Dayal, Jagabattula Syama; Jayanthi, Marappan; Lalitha, Natarajan; Kumararaja, Periyamuthu; Vijayan, Koyadan Kizhakkedath
2016-12-01
Global warming/climate change is the greatest environmental threat of our time. The rapidly developing aquaculture sector is an anthropogenic activity whose contribution to global warming is little understood, and estimation of greenhouse gas (GHG) emissions from aquaculture ponds is a key practice in predicting the impact of aquaculture on global warming. A comprehensive methodology was developed for sampling and simultaneous analysis of the GHGs carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O) from aquaculture ponds. The GHG fluxes were collected using a cylindrical acrylic chamber, an air pump, and Tedlar bags. A cylindrical acrylic floating chamber was fabricated to collect the GHGs emanating from the surface of the aquaculture ponds. The sampling methodology was standardized and in-house method validation was established by achieving linearity, accuracy, precision, and specificity. GHG samples were found to be stable for 3 days when stored at 10 ± 2 °C. The developed methodology was used to quantify GHGs in Pacific white shrimp Penaeus vannamei and black tiger shrimp Penaeus monodon culture ponds for a period of 4 months. The rate of emission of carbon dioxide was found to be much greater than that of the other two GHGs. Average GHG emissions in g ha⁻¹ day⁻¹ during the culture were comparatively high in the P. vannamei culture ponds.
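The conversion of chamber headspace concentrations into a surface flux can be sketched as follows; chamber dimensions, temperature, and the CO2 time series are assumed values, and this is not the validated procedure of the study.

```python
# Minimal sketch of converting floating-chamber headspace concentrations into a
# flux (illustrative only; all dimensions and concentrations are hypothetical).
import numpy as np

# Hypothetical CO2 concentrations (ppm) sampled from the chamber headspace
t_min = np.array([0.0, 10.0, 20.0, 30.0])
co2_ppm = np.array([410.0, 432.0, 455.0, 476.0])

slope_ppm_per_h = np.polyfit(t_min / 60.0, co2_ppm, 1)[0]   # linear fit of C vs t

# Chamber and gas properties (assumed values)
V, A = 0.015, 0.07                  # headspace volume (m^3), water surface area (m^2)
M = 44.01                           # molar mass of CO2 (g/mol)
P, R, T = 101325.0, 8.314, 273.15 + 30.0   # Pa, J/(mol K), K

# ppm -> mole fraction -> mass concentration, then scale by V/A
flux_g_m2_h = slope_ppm_per_h * 1e-6 * (P * M / (R * T)) * V / A
print(f"CO2 flux ~ {flux_g_m2_h * 1000:.1f} mg m^-2 h^-1")
```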
NASA Astrophysics Data System (ADS)
Ramírez-Cuesta, J. M.; Cruz-Blanco, M.; Santos, C.; Lorite, I. J.
2017-03-01
Reference evapotranspiration (ETo) is a key component in efficient water management, especially in arid and semi-arid environments. However, accurate ETo assessment at the regional scale is complicated by the limited number of weather stations and the strict requirements in terms of their location and surrounding physical conditions for the collection of valid weather data. In an attempt to overcome this limitation, new approaches based on the use of remote sensing techniques and weather forecast tools have been proposed. Use of the Land Surface Analysis Satellite Application Facility (LSA SAF) tool and Geographic Information Systems (GIS) has allowed the design and development of innovative approaches for ETo assessment, which are especially useful for areas lacking available weather data from weather stations. Thus, by identifying the best-performing interpolation approaches (such as Thin Plate Splines, TPS) and by developing new approaches (such as the use of data from the most similar weather station, TS, or spatially distributed correction factors, CITS), errors as low as 1.1% were achieved for ETo assessment. Spatial and temporal analyses reveal that the generated errors were smaller during spring and summer as well as in topographically homogeneous areas. The proposed approaches not only enabled accurate calculation of seasonal and daily ETo values, but also contributed to the development of a useful methodology for evaluating the optimum number of weather stations to be integrated into a weather station network and the appropriateness of their locations. In addition to ETo, other variables included in weather forecast datasets (such as temperature or rainfall) could be evaluated using the same innovative methodology proposed in this study.
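Thin Plate Spline interpolation of station values onto a grid can be sketched with a standard radial-basis-function routine; the station coordinates and ETo values below are hypothetical, and this is not the LSA SAF/CITS workflow itself.

```python
# Minimal sketch of thin-plate-spline interpolation of station ETo onto a grid
# (illustrative only; station coordinates and ETo values are hypothetical).
import numpy as np
from scipy.interpolate import Rbf

# Hypothetical weather stations: x/y in km, daily ETo in mm
x = np.array([10.0, 35.0, 60.0, 80.0, 25.0, 70.0])
y = np.array([15.0, 55.0, 20.0, 70.0, 80.0, 45.0])
eto = np.array([4.8, 5.3, 6.1, 5.0, 4.6, 5.7])

tps = Rbf(x, y, eto, function='thin_plate')   # thin-plate-spline interpolator

# Evaluate on a regular grid covering the region
xi, yi = np.meshgrid(np.linspace(0, 90, 10), np.linspace(0, 90, 10))
eto_grid = tps(xi, yi)
print("interpolated ETo range: %.2f to %.2f mm/day" % (eto_grid.min(), eto_grid.max()))
```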
Approach to Teaching Research Methodology for Information Technology
ERIC Educational Resources Information Center
Steenkamp, Annette Lerine; McCord, Samual Alan
2007-01-01
The paper reports on an approach to teaching a course in information technology research methodology in a doctoral program, the Doctor of Management in Information Technology (DMIT), in which research, with focus on finding innovative solutions to problems found in practice, comprises a significant part of the degree. The approach makes a…
Multiple Cultures of Doing Geography Facilitate Global Studies
ERIC Educational Resources Information Center
Ahamer, Gilbert
2013-01-01
Purpose: This article aims to explain why geography is a prime discipline for analysing globalisation and a multicultural view of Global Studies. The generic approach of human geography to first select an appropriate methodology is taken as a key approach. Design/methodology/approach: Concepts from aggregate disciplines such as history, economics,…
Using Q Methodology in the Literature Review Process: A Mixed Research Approach
ERIC Educational Resources Information Center
Onwuegbuzie, Anthony J.; Frels, Rebecca K.
2015-01-01
Because of the mixed research-based nature of literature reviews, it is surprising, then, that insufficient information has been provided as to how reviewers can incorporate mixed research approaches into their literature reviews. Thus, in this article, we provide a mixed methods research approach--Q methodology--for analyzing information…
Identifying Behavioral Barriers to Campus Sustainability: A Multi-Method Approach
ERIC Educational Resources Information Center
Horhota, Michelle; Asman, Jenni; Stratton, Jeanine P.; Halfacre, Angela C.
2014-01-01
Purpose: The purpose of this paper is to assess the behavioral barriers to sustainable action in a campus community. Design/methodology/approach: This paper reports three different methodological approaches to the assessment of behavioral barriers to sustainable actions on a college campus. Focus groups and surveys were used to assess campus…
Duggleby, Wendy; Williams, Allison
2016-01-01
The purpose of this article is to discuss methodological and epistemological considerations involved in using qualitative inquiry to develop interventions. These considerations included (a) using diverse methodological approaches and (b) epistemological considerations such as generalization, de-contextualization, and subjective reality. Diverse methodological approaches have the potential to inform different stages of intervention development. Using the development of a psychosocial hope intervention for advanced cancer patients as an example, the authors utilized a thematic study to assess current theories/frameworks and interventions. However, to understand the processes that the intervention needed to target to affect change, grounded theory was used. Epistemological considerations provided a framework to understand and, further, critique the intervention. Using diverse qualitative methodological approaches and examining epistemological considerations were useful in developing an intervention that appears to foster hope in patients with advanced cancer. © The Author(s) 2015.
Chitosan Nanoparticles Prepared by Ionotropic Gelation: An Overview of Recent Advances.
Desai, Kashappa Goud
2016-01-01
The objective of this review is to summarize recent advances in chitosan nanoparticles prepared by ionotropic gelation. Significant progress has occurred in this area since the method was first reported. The gelation technique has been improved through a number of creative methodological modifications. Ionotropic gelation via electrospraying and spinning disc processing produces nanoparticles with a more uniform size distribution. Large-scale manufacturing of the nanoparticles can be achieved with the latter approach. Hydrophobic and hydrophilic drugs can be simultaneously encapsulated with high efficiency by emulsification followed by ionic gelation. The turbulent mixing approach facilitates nanoparticle formation at a relatively high polymer concentration (5 mg/mL). The technique can be easily tuned to achieve the desired polymer/surface modifications (e.g., blending, coating, and surface conjugation). Using factorial-design-based approaches, optimal conditions for nanoparticle formation can be determined with a minimum number of experiments. New insights have been gained into the mechanism of chitosan-tripolyphosphate nanoparticle formation. Chitosan nanoparticles prepared by ionotropic gelation tend to aggregate/agglomerate in unfavorable environments. Factors influencing this phenomenon and strategies that can be adopted to minimize the instability are discussed. Ionically cross-linked nanoparticles based on native chitosan and modified chitosan have shown excellent efficacy for controlled and targeted drug-delivery applications.
Spanish methodological approach for biosphere assessment of radioactive waste disposal.
Agüero, A; Pinedo, P; Cancio, D; Simón, I; Moraleda, M; Pérez-Sánchez, D; Trueba, C
2007-10-01
The development of radioactive waste disposal facilities requires implementation of measures that will afford protection of human health and the environment over a specific temporal frame that depends on the characteristics of the wastes. The repository design is based on a multi-barrier system: (i) the near-field or engineered barrier, (ii) far-field or geological barrier and (iii) the biosphere system. Here, the focus is on the analysis of this last system, the biosphere. A description is provided of conceptual developments, methodological aspects and software tools used to develop the Biosphere Assessment Methodology in the context of high-level waste (HLW) disposal facilities in Spain. This methodology is based on the BIOMASS "Reference Biospheres Methodology" and provides a logical and systematic approach with supplementary documentation that helps to support the decisions necessary for model development. It follows a five-stage approach, such that a coherent biosphere system description and the corresponding conceptual, mathematical and numerical models can be built. A discussion on the improvements implemented through application of the methodology to case studies in international and national projects is included. Some facets of this methodological approach still require further consideration, principally an enhanced integration of climatology, geography and ecology into models considering evolution of the environment, some aspects of the interface between the geosphere and biosphere, and an accurate quantification of environmental change processes and rates.
A novel porous Ffowcs-Williams and Hawkings acoustic methodology for complex geometries
NASA Astrophysics Data System (ADS)
Nitzkorski, Zane Lloyd
Predictive noise calculations from high Reynolds number flows in complex engineering geometry are becoming a possibility with the high performance computing resources that have become available in recent years. Increasing the applicability and the reliability of solution methodologies have been two key challenges toward this goal. This dissertation develops a porous Ffowcs-Williams and Hawkings (FW-H) methodology that uses a novel endcap treatment and can be applied to unstructured grids. The use of unstructured grids allows complex geometry to be represented, while the porous formulation eliminates difficulties with the choice of acoustic Green's function. Specifically, this dissertation (1) proposes and examines a novel endcap procedure to account for spurious noise, (2) uses the proposed methodology to investigate noise production from a range of subcritical Reynolds number circular cylinders, and (3) investigates a trailing edge geometry for noise production and to illustrate the generality of the Green's function. Porous acoustic analogies need an endcap scheme in order to prevent spurious noise due to truncation errors. A dynamic endcap methodology is proposed to account for spurious contributions to the far-field sound within the context of the FW-H acoustic analogy. The quadrupole source terms are correlated over multiple planes to obtain a convection velocity, which is then used to determine a corrective convective flux at the FW-H porous surface. The proposed approach is first demonstrated for a convecting potential vortex; the correlation is investigated by examining the vortex as it passes through multiple exit planes. The approach is then evaluated by computing the sound emitted by flow over a circular cylinder at a Reynolds number of 150 and comparing against other endcap methods, such as that of Shur et al. [1]. Insensitivity to exit-plane location and spacing, and the effect of the dynamic convection velocity, are quantified. Subcritical Reynolds number circular cylinder flows are investigated at Re = 3900, 10000 and 89000 in order to evaluate the method and investigate the physical sources of noise production. The Re = 3900 case was chosen because its flow field is highly validated and it serves as a basis of comparison. The Re = 10000 cylinder is used to validate the noise production at turbulent Reynolds numbers against other simulations. Finally, the Re = 89000 simulations are compared to experiment, serving as a rigorous test of the method's predictive ability. The proposed approach demonstrates better performance than other commonly used approaches, with the added benefits of computational efficiency and the ability to query independent volumes. This makes it possible to determine how much noise production is directly associated with volumetric noise contributions. These capabilities allow for a thorough investigation of the sources of noise production and a means to evaluate proposed theories. A physical description of the source of sound for subcritical Reynolds number cylinders is established. A 45° beveled trailing edge configuration is investigated due to its relevance to hydrofoil and propeller noise. This configuration also allows for evaluation of the assumption associated with the free-space Green's function, since the half-plane Green's function can be used to represent the solution to the wave equation for this geometry. Similar results for the directivity and amplitudes of the two formulations confirm the flexibility of the porous surface implementation.
Good agreement with experiment is obtained. The effect of boundary layer thickness is investigated. The noise produced in the upper half plane is significantly decreased for the thinner boundary layer while the noise production in the lower half plane is only slightly decreased.
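As a rough illustration of the dynamic endcap idea summarized above (not the dissertation's code), the sketch below estimates a convection velocity by cross-correlating a quadrupole-source signal sampled at two exit planes and then forms a corrective convective flux; the signals, plane spacing and the simple flux expression are all assumptions for demonstration.

```python
# Illustrative sketch: convection velocity from the time lag that maximizes
# the cross-correlation between two exit-plane signals, then a corrective
# convective flux term at the porous FW-H surface.
import numpy as np

def convection_velocity(q1, q2, dt, plane_spacing):
    """Convection velocity from the lag maximizing the cross-correlation
    between q1 (upstream plane) and q2 (downstream plane)."""
    q1 = q1 - q1.mean()
    q2 = q2 - q2.mean()
    corr = np.correlate(q2, q1, mode="full")
    lag = np.argmax(corr) - (len(q1) - 1)   # samples by which q2 trails q1
    tau = max(lag, 1) * dt                  # guard against zero/negative lag
    return plane_spacing / tau

def corrective_flux(q_exit, u_conv, normal_velocity):
    """Hypothetical convective correction applied at the exit plane so that
    sources leaving the surface do not radiate spurious noise."""
    return q_exit * (u_conv - normal_velocity)

# usage with synthetic data: a disturbance convecting at ~10 m/s
dt, spacing = 1e-3, 0.05
t = np.arange(0.0, 1.0, dt)
q1 = np.exp(-((t - 0.30) / 0.02) ** 2)
q2 = np.exp(-((t - 0.30 - spacing / 10.0) / 0.02) ** 2)
print(convection_velocity(q1, q2, dt, spacing))   # ~10 m/s
```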
NASA Astrophysics Data System (ADS)
Barrios, José Miguel; Ghilain, Nicolas; Arboleda, Alirio; Gellens-Meulenberghs, Françoise
2014-05-01
Evapotranspiration (ET) is the water flux going from the surface into the atmosphere as a result of soil and surface water evaporation and plant transpiration. It constitutes a key component of the water cycle, and its quantification is of crucial importance for a number of applications such as water management, climate modelling, and agricultural monitoring and planning. Estimating ET is not an easy task, especially if large areas are envisaged and various spatio-temporal patterns of ET are present as a result of heterogeneity in land cover, land use and climatic conditions. In this respect, spaceborne remote sensing (RS) provides the only alternative for continuously measuring surface parameters related to ET over large areas. The Royal Meteorological Institute (RMI) of Belgium, in the framework of EUMETSAT's "Land Surface Analysis-Satellite Application Facility" (LSA-SAF), has developed a model for the estimation of ET. The model is forced by RS data, numerical weather predictions and land cover information. The RS forcing is derived from measurements by the Spinning Enhanced Visible and Infrared Imager (SEVIRI) onboard the Meteosat Second Generation (MSG) satellite. This ET model is operational and delivers ET estimations over the whole field of view of the MSG satellite (Europe, Africa and eastern South America) (http://landsaf.meteo.pt) every 30 minutes. The spatial resolution of MSG is 3 x 3 km at the subsatellite point and about 4 x 5 km over continental Europe. The spatial resolution of this product may constrain its full exploitation, as the interest of potential users (farmers and natural resources scientists) may lie in smaller spatial units. This study aimed at testing methodological alternatives for combining RS imagery (from geostationary and polar-orbiting satellites) for the estimation of ET such that the spatial resolution of the final product is improved. In particular, the study consisted of the implementation of two approaches for combining the current ET estimations with RS data containing information on vegetation parameters captured by polar-orbiting spaceborne sensors. The first approach consisted of forcing the operational ET algorithm with RS measurements obtained from a moderate spatial resolution sensor; the variables with improved spatial resolution were leaf area index and albedo, while the other variables of the model remained unchanged with respect to the operational version. In the second approach, a two-phase procedure was implemented: first, a preliminary approximation of ET was obtained as a function of solar radiation, air temperature and a vegetation index; this value was then statistically adjusted on the basis of the ET estimations of the operational algorithm. The results of the different approaches were tested against eddy covariance ET derived from measurements at Fluxnet towers spread across Europe and representing different landscape characteristics. The analysis allowed the identification of the pros and cons of the tested methodological approaches as well as their performance under different land cover arrangements.
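A minimal sketch of the second, two-phase approach described above, with entirely synthetic inputs: a preliminary fine-scale ET proxy is formed from solar radiation, air temperature and a vegetation index, then rescaled so that its aggregate matches the coarse operational estimate. The proxy formula and all numbers are illustrative assumptions, not the authors' algorithm.

```python
# Hedged sketch of a two-phase downscaling: build a fine-scale ET proxy, then
# adjust it statistically so its coarse-pixel mean matches the operational ET.
import numpy as np

rng = np.random.default_rng(0)
n_fine = 16                            # fine pixels inside one coarse pixel
Rs   = rng.uniform(150, 300, n_fine)   # solar radiation, W m-2 (synthetic)
Ta   = rng.uniform(288, 303, n_fine)   # air temperature, K (synthetic)
NDVI = rng.uniform(0.2, 0.8, n_fine)   # vegetation index (synthetic)

# Phase 1: preliminary ET proxy (simple empirical combination, illustrative)
et_proxy = 1e-3 * Rs * NDVI * (Ta / 300.0)

# Phase 2: statistical adjustment against the coarse operational estimate
et_coarse = 2.8                         # mm/day from the operational algorithm
scale = et_coarse / et_proxy.mean()     # preserve the coarse-pixel mean
et_fine = scale * et_proxy              # downscaled ET field, mm/day
print(et_fine.round(2), et_fine.mean().round(2))
```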
Indirect estimation of emission factors for phosphate surface mining using air dispersion modeling.
Tartakovsky, Dmitry; Stern, Eli; Broday, David M
2016-06-15
To date, phosphate surface mining suffers from a lack of reliable emission factors. Given the complete absence of data from which to derive emission factors directly, we developed a methodology for estimating them indirectly by studying a range of possible emission factors for surface phosphate mining operations and comparing AERMOD-calculated concentrations to concentrations measured around the mine. We applied this approach to the Khneifiss phosphate mine, Syria, and the Al-Hassa and Al-Abyad phosphate mines, Jordan. The work accounts for numerous model unknowns and parameter uncertainties by applying prudent assumptions concerning the parameter values. Our results suggest that the net mining operations (bulldozing, grading and dragline) contribute rather little to ambient TSP concentrations in comparison with phosphate processing and transport. Based on our results, the common practice of deriving the emission rates for phosphate mining operations from the US EPA emission factors for surface coal mining or from the default emission factor of the EEA seems to be reasonable. Yet, since multiple factors affect dispersion from surface phosphate mines, a range of emission factors, rather than a single value, was found to satisfy the model performance criteria. Copyright © 2016 Elsevier B.V. All rights reserved.
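The indirect-estimation idea can be sketched as follows: because concentrations predicted by a steady-state dispersion model such as AERMOD scale linearly with the emission rate for a fixed source configuration, a single unit-emission run can be rescaled to test a range of candidate emission factors against monitored levels. The receptor values, candidate range and fractional-bias acceptance criterion below are hypothetical.

```python
# Illustrative sketch: rescale a unit-emission dispersion run over a range of
# candidate emission factors and keep those meeting a performance criterion.
import numpy as np

c_unit = np.array([0.8, 1.5, 2.3, 0.6])        # ug/m3 per unit emission factor
c_obs  = np.array([32.0, 61.0, 95.0, 23.0])    # measured TSP, ug/m3 (hypothetical)
candidates = np.linspace(10, 100, 91)          # candidate emission factors

def fractional_bias(obs, mod):
    return 2.0 * (obs.mean() - mod.mean()) / (obs.mean() + mod.mean())

acceptable = [ef for ef in candidates
              if abs(fractional_bias(c_obs, ef * c_unit)) <= 0.5]
print(acceptable[0], acceptable[-1])   # range of factors meeting |FB| <= 0.5
```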
Developing a more useful surface quality metric for laser optics
NASA Astrophysics Data System (ADS)
Turchette, Quentin; Turner, Trey
2011-02-01
Light scatter due to surface defects on laser resonator optics produces losses which lower system efficiency and output power. The traditional methodology for surface quality inspection involves visual comparison of a component to scratch and dig (SAD) standards under controlled lighting and viewing conditions. Unfortunately, this process is subjective and operator dependent. Also, there is no clear correlation between inspection results and the actual performance impact of the optic in a laser resonator. As a result, laser manufacturers often overspecify surface quality in order to ensure that optics will not degrade laser performance due to scatter. This can drive up component costs and lengthen lead times. Alternatively, an objective test system for measuring optical scatter from defects can be constructed with a microscope, calibrated lighting, a CCD detector and image processing software. This approach is quantitative, highly repeatable and totally operator independent. Furthermore, it is flexible, allowing the user to set threshold levels as to what will or will not constitute a defect. This paper details how this automated, quantitative type of surface quality measurement can be constructed, and shows how its results correlate against conventional loss measurement techniques such as cavity ringdown times.
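A minimal sketch of such an objective metric, assuming a synthetic dark-field image and a user-set threshold rule: defects are segmented by thresholding and counted with connected-component labeling.

```python
# Hedged sketch of an operator-independent surface-quality measurement:
# threshold a microscope image, label connected defect regions, and report
# defect count and area fraction. The image and threshold rule are synthetic.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(1)
img = rng.normal(10.0, 1.0, (256, 256))       # background scatter signal
img[100:104, 50:120] += 40.0                  # synthetic "scratch"
img[200:206, 200:206] += 60.0                 # synthetic "dig"

threshold = img.mean() + 5.0 * img.std()      # user-settable rule
defect_mask = img > threshold
labels, n_defects = ndimage.label(defect_mask)
sizes = ndimage.sum(defect_mask, labels, index=range(1, n_defects + 1))

print(n_defects, sizes, defect_mask.mean())   # count, areas (px), area fraction
```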
Estimation of Regional Net CO2 Exchange over the Southern Great Plains
NASA Astrophysics Data System (ADS)
Biraud, S. C.; Riley, W. J.; Fischer, M. L.; Torn, M. S.; Cooley, H. S.
2004-12-01
Estimating spatially distributed ecosystem CO2 exchange is an important component of the North American Carbon Program. We describe here a methodology to estimate Net Ecosystem Exchange (NEE) over the Southern Great Plains, using: (1) data from the Department of Energy's Atmospheric Radiation Measurement (ARM) sites in Oklahoma and Kansas; (2) meteorological forcing data from the Mesonet facilities; (3) soil and vegetation types from 1 km resolution USGS databases; (4) vegetation status (e.g., LAI) from 1 km satellite measurements of surface reflectance (MODIS); (5) a tested land-surface model; and (6) a coupled land-surface and meteorological model (MM5/ISOLSM). This framework allows us to simulate regional surface fluxes, in addition to atmospheric boundary layer (ABL) and free-troposphere concentrations of CO2, at a continental scale with fine-scale nested grids centered on the ARM central facility. We use the offline land-surface and coupled models to estimate regional NEE, and compare predictions to measurements from the 9 Extended Facility sites with eddy correlation measurements. Site-level comparisons to portable eddy correlation (ECOR) measurements in several crop types are also presented. Our approach also allows us to extend bottom-up estimates to periods and areas where meteorological forcing data are unavailable.
Modeling for free surface flow with phase change and its application to fusion technology
NASA Astrophysics Data System (ADS)
Luo, Xiaoyong
The development of predictive capabilities for free surface flow with phase change is essential to evaluate liquid wall protection schemes for various fusion chambers. With inertial fusion energy (IFE) concepts such as HYLIFE-II, rapid condensation onto cold liquid surfaces is required when using liquid curtains to protect reactor walls from blasts and intense neutron radiation. With magnetic fusion energy (MFE) concepts, droplets are injected onto the free surface of the liquid to minimize evaporation by minimizing the surface temperature. This dissertation presents a numerical methodology for free surface flow with phase change to help resolve feasibility issues encountered in the aforementioned fusion engineering fields, especially spray droplet condensation efficiency in IFE and droplet heat transfer enhancement on free-surface liquid divertors in MFE. The numerical methodology is developed within the framework of incompressible flow with a phase change model. A new second-order projection method is presented in conjunction with approximate-factorization (AF) techniques for the incompressible Navier-Stokes equations. A sub-cell concept is introduced, and the Ghost Fluid Method is extended in a modified mass transfer model to accurately calculate the mass transfer across the interface. The Crank-Nicolson method is used for the diffusion term to eliminate the numerical viscous stability restriction. The third-order ENO scheme is used for the convective term to guarantee the accuracy of the method. The level set method is used to accurately capture the free surface of the flow and the deformation of the droplets. This numerical investigation identifies the physics characterizing transient heat and mass transfer of the droplet and the free surface flow. The results show that the numerical methodology is quite successful in modeling the free surface with phase change even when severe deformations such as breaking and merging occur. The versatility of the numerical methodology shows that it can easily handle complex physical conditions that occur in fusion science and engineering.
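As a small illustration of the Crank-Nicolson treatment of the diffusion term mentioned above (a generic 1D heat equation with fixed ends, not the dissertation's solver), the scheme is unconditionally stable, which is what removes the viscous time-step restriction.

```python
# Minimal Crank-Nicolson step for dT/dt = alpha * d2T/dx2 (Dirichlet ends).
import numpy as np

def crank_nicolson_step(T, alpha, dx, dt):
    """Advance the temperature field T by one time step."""
    n = len(T)
    r = alpha * dt / (2.0 * dx**2)
    A = np.eye(n) * (1 + 2 * r)        # implicit half
    B = np.eye(n) * (1 - 2 * r)        # explicit half
    for i in range(n - 1):
        A[i, i + 1] = A[i + 1, i] = -r
        B[i, i + 1] = B[i + 1, i] = r
    # simple Dirichlet boundaries: keep the end values fixed
    A[0, :], A[-1, :] = 0, 0
    A[0, 0] = A[-1, -1] = 1.0
    B[0, :], B[-1, :] = 0, 0
    B[0, 0] = B[-1, -1] = 1.0
    return np.linalg.solve(A, B @ T)

x = np.linspace(0.0, 1.0, 51)
T = np.sin(np.pi * x)                  # decays as exp(-pi^2 * alpha * t)
for _ in range(100):
    T = crank_nicolson_step(T, alpha=1.0, dx=x[1] - x[0], dt=1e-3)
print(T.max())                         # ~exp(-pi^2 * 0.1) ≈ 0.373
```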
Improving the Method of Roof Fall Susceptibility Assessment based on Fuzzy Approach
NASA Astrophysics Data System (ADS)
Ghasemi, Ebrahim; Ataei, Mohammad; Shahriar, Kourosh
2017-03-01
Retreat mining is always accompanied by a large number of accidents, most of which are due to roof fall. Therefore, the development of methodologies to evaluate roof fall susceptibility (RFS) is essential. Ghasemi et al. (2012) proposed a systematic methodology to assess roof fall risk during retreat mining based on a classic risk assessment approach. The main shortcoming of this method is that it ignores the subjective uncertainties arising from linguistic input values for some factors, low resolution, fixed weighting, sharp class boundaries, etc. To remove this shortcoming and improve the method, a novel methodology is presented in this paper to assess RFS using a fuzzy approach. The fuzzy approach provides an effective tool to handle subjective uncertainties. Furthermore, the fuzzy analytic hierarchy process (AHP) is used to structure and prioritize the various risk factors and sub-factors during development of the method. The methodology is applied to identify the susceptibility of roof fall occurrence in the main panel of Tabas Central Mine (TCM), Iran. The results indicate that the methodology is effective and efficient in assessing RFS.
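A hedged illustration (not the paper's exact model) of how a fuzzy approach handles linguistic inputs: triangular membership functions fuzzify each normalized risk factor, weights of the kind produced by fuzzy AHP aggregate them, and centroid defuzzification returns a crisp susceptibility score. The factor names, values and weights below are hypothetical.

```python
# Sketch of a fuzzy susceptibility score from triangular memberships,
# weighted aggregation, and centroid defuzzification.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzify(x):
    """Memberships in (low, medium, high) on a normalized 0-1 factor scale."""
    return np.array([tri(x, -0.5, 0.0, 0.5), tri(x, 0.0, 0.5, 1.0),
                     tri(x, 0.5, 1.0, 1.5)])

factors = {"roof_quality": 0.7, "depth_of_cover": 0.4, "panel_width": 0.8}
weights = {"roof_quality": 0.5, "depth_of_cover": 0.2, "panel_width": 0.3}

# weighted aggregation of the fuzzy sets, then centroid defuzzification
agg = sum(weights[k] * fuzzify(v) for k, v in factors.items())
class_centres = np.array([0.0, 0.5, 1.0])         # low, medium, high
rfs = float(agg @ class_centres / agg.sum())
print(round(rfs, 3))                               # crisp susceptibility in [0, 1]
```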
Farias, Lisette; Laliberte Rudman, Debbie; Pollard, Nick; Schiller, Sandra; Serrata Malfitano, Ana Paula; Thomas, Kerry; van Bruggen, Hanneke
2018-05-03
Calls for embracing the potential and responsibility of occupational therapy to address socio-political conditions that perpetuate occupational injustices have materialized in the literature. However, to reach beyond the traditional frameworks informing practice, this social agenda requires the incorporation of diverse epistemological and methodological approaches to support action commensurate with social transformative goals. Our intent is to present a methodological approach that can help extend the ways of thinking or frameworks used in occupational therapy and science to support the ongoing development of practices with and for individuals and collectives affected by marginalizing conditions. We describe the epistemological and theoretical underpinnings of a methodological approach drawing on Freire's and Bakhtin's work. Integrating our shared experience of taking part in an example study, we discuss the unique advantages of co-generating data using two methods aligned with this approach: dialogical interviews and critical reflexivity. Key considerations when employing this approach are presented, based on its proposed epistemological and theoretical stance and our shared experiences engaging in it. A critical dialogical approach offers one way forward in expanding occupational therapy and science scholarship by promoting collaborative knowledge generation and examination of taken-for-granted understandings that shape individuals' assumptions and actions.
Rita C.L.B. Rodrigues; William R. Kenealy; Diane Dietrich; Thomas W. Jeffries
2012-01-01
Response surface methodology (RSM), based on a 2² full factorial design, evaluated the moisture effects in recovering xylose by diethyloxalate (DEO) hydrolysis. Experiments were carried out in laboratory reactors (10 mL glass ampoules) containing corn stover (0.5 g) properly ground. The ampoules were kept at 160 °C for 90 min. Both DEO...
NASA Astrophysics Data System (ADS)
Hansen, A. L.; Donnelly, C.; Refsgaard, J. C.; Karlsson, I. B.
2018-01-01
This paper describes a modeling approach proposed to simulate the impact of local-scale, spatially targeted N-mitigation measures for the Baltic Sea Basin. Spatially targeted N-regulations aim at exploiting the considerable spatial differences in the natural N-reduction taking place in groundwater and surface water. While such measures can be simulated using local-scale, physically based catchment models, the use of such detailed models for the 1.8 million km² Baltic Sea basin is not feasible due to constraints on input data and computing power. Large-scale models that are able to simulate the Baltic Sea basin, on the other hand, do not have adequate spatial resolution to simulate some of the field-scale measures. Our methodology combines knowledge and results from two local-scale, physically based MIKE SHE catchment models, the large-scale and more conceptual E-HYPE model, and auxiliary data in order to enable E-HYPE to simulate how spatially targeted regulation of agricultural practices may affect N-loads to the Baltic Sea. We conclude that the use of E-HYPE with this upscaling methodology enables simulation of the impact on N-loads of applying a spatially targeted regulation at the Baltic Sea basin scale to the correct order of magnitude. The E-HYPE model together with the upscaling methodology therefore provides a sound basis for large-scale policy analysis; however, we do not expect it to be sufficiently accurate to be useful for the detailed design of local-scale measures.
Methods for the guideline-based development of quality indicators--a systematic review
2012-01-01
Background Quality indicators (QIs) are used in many healthcare settings to measure, compare, and improve quality of care. For the efficient development of high-quality QIs, rigorous, approved, and evidence-based development methods are needed. Clinical practice guidelines are a suitable source to derive QIs from, but no gold standard for guideline-based QI development exists. This review aims to identify, describe, and compare methodological approaches to guideline-based QI development. Methods We systematically searched medical literature databases (Medline, EMBASE, and CINAHL) and grey literature. Two researchers selected publications reporting methodological approaches to guideline-based QI development. In order to describe and compare methodological approaches used in these publications, we extracted detailed information on common steps of guideline-based QI development (topic selection, guideline selection, extraction of recommendations, QI selection, practice test, and implementation) to predesigned extraction tables. Results From 8,697 hits in the database search and several grey literature documents, we selected 48 relevant references. The studies were of heterogeneous type and quality. We found no randomized controlled trial or other studies comparing the ability of different methodological approaches to guideline-based development to generate high-quality QIs. The relevant publications featured a wide variety of methodological approaches to guideline-based QI development, especially regarding guideline selection and extraction of recommendations. Only a few studies reported patient involvement. Conclusions Further research is needed to determine which elements of the methodological approaches identified, described, and compared in this review are best suited to constitute a gold standard for guideline-based QI development. For this research, we provide a comprehensive groundwork. PMID:22436067
Zeng, Kaizhu; Li, Qian; Wang, Jing; Yin, Guowei; Zhang, Yajun; Xiao, Chaoni; Fan, Taiping; Zheng, Xiaohui
2017-01-01
Protein immobilization techniques play an important role in the development of assays for disease diagnosis and drug discovery. However, many of these approaches are not applicable to transmembrane proteins. G protein-coupled receptors (GPCRs) are the largest protein superfamily encoded by the human genome and are targeted by a quarter of all prescription drugs. GPCRs are highly dynamic and sensitive to changes in the ambient environment, and current immobilization methodologies are not suitable for GPCRs. We used haloalkane dehalogenase (Halo) as an immobilization tag fused to the β2-adrenoceptor (β2-AR), angiotensin II type 1 (AT1) and angiotensin II type 2 (AT2) receptors. The engineered Halo-tag covalently binds to a specific substrate chloroalkane through Asp 106 in the catalytic pocket. The Halo-tagged GPCRs were expressed in Escherichia coli at a suitable yield. Accordingly, we loaded cell lysate containing Halo-tagged GPCRs onto a macroporous silica gel coated with chloroalkane. Morphological characterization indicated a homogeneous monolayer of immobilized Halo-tagged GPCRs on the silica gel surface. The immobilized receptors proved to be surrounded by specific bound phospholipids including PG C18:1/C18:1. We observed a radio-ligand binding ability and ligand-induced conformational changes in the immobilized GPCRs, suggesting the preservation of bioactivity. This method is a one-step approach for the specific immobilization of GPCRs from cell lysates and validates that immobilized receptors retain canonical ligand binding capacity. Our immobilization strategy circumvents labor-intensive purification procedures and minimizes loss of activity. The immobilized receptors can be applied to high-throughput drug and interaction partner screening for GPCRs. PMID:29629116
Selection of Sustainable Technology for VOC Abatement in an Industry: An Integrated AHP-QFD Approach
NASA Astrophysics Data System (ADS)
Gupta, Alok Kumar; Modi, Bharat A.
2018-04-01
Volatile organic compounds (VOCs) are ubiquitous atmospheric pollutants. They drive photochemical reactions in the atmosphere, leading to serious harmful effects on human health and the environment. VOCs are produced from both natural and man-made sources and may have good commercial value if they can be utilized as an alternative fuel. According to US EPA data, 15% of total VOC emissions are generated by the surface coating industry, but VOC concentration and exhaust air volume vary to a great extent and depend on the processes used by each industry. Various technologies are available for the abatement of VOCs: physical, chemical and biological technologies can remove VOCs by either recovery or destruction, each with its own advantages and limitations. With growing environmental awareness, and considering the resource limitations of medium and small scale industries, there is a clear need for a tool for selecting an appropriate, techno-economically viable solution for the removal of VOCs from industrial process exhaust. The aim of the present study is to provide management with a tool to determine the overall effect of implementing a VOC abatement technology on business performance and VOC emissions. The primary purpose of this work is to outline a methodology for rating various VOC abatement technologies with respect to the constraints of meeting current and foreseeable future regulatory requirements, operational flexibility and overall economic parameters, with due consideration of energy conservation. In this paper an integrated approach is proposed to select the most appropriate abatement technology strategically: the analytic hierarchy process (AHP) and quality function deployment (QFD) have been integrated for techno-commercial evaluation. A case study on the selection of a VOC abatement technology for a leading aluminium foil surface coating, lamination and printing facility using this methodology is presented.
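The AHP step of such an integrated evaluation can be sketched as follows, with a hypothetical pairwise comparison of three candidate technologies: priorities are the principal eigenvector of the comparison matrix, and the consistency index checks the judgements.

```python
# Sketch of AHP priority weights and consistency check for a hypothetical
# comparison of three VOC abatement technologies.
import numpy as np

A = np.array([[1.0,  3.0, 5.0],       # e.g. thermal oxidation vs others
              [1/3., 1.0, 2.0],       # e.g. adsorption/recovery
              [1/5., 1/2., 1.0]])     # e.g. biofiltration

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                          # priority weights, sum to 1

n = A.shape[0]
lam_max = eigvals.real[k]
CI = (lam_max - n) / (n - 1)          # consistency index
CR = CI / 0.58                        # Saaty random index RI = 0.58 for n = 3
print(w.round(3), round(CR, 3))       # judgements acceptable if CR < 0.10
```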
Kumar, Mukesh; Singh, Amrinder; Beniwal, Vikas; Salar, Raj Kumar
2016-12-01
Tannase (tannin acyl hydrolase, EC 3.1.1.20) is an inducible, largely extracellular enzyme that catalyses the hydrolysis of ester and depside bonds present in various substrates. Large-scale industrial application of this enzyme is very limited owing to its high production costs. In the present study, cost-effective production of tannase by Klebsiella pneumoniae KP715242 was studied under submerged fermentation using different tannin-rich agro-residues, namely Indian gooseberry leaves (Phyllanthus emblica), black plum leaves (Syzygium cumini), eucalyptus leaves (Eucalyptus glogus) and babul leaves (Acacia nilotica). Among all the agro-residues, Indian gooseberry leaves were found to be the best substrate for tannase production under submerged fermentation. A sequential optimization approach using Taguchi orthogonal array screening and response surface methodology was adopted to optimize the fermentation variables in order to enhance enzyme production. Eleven medium components were screened first by a Taguchi orthogonal array design to identify the factors contributing most to enzyme production. The four most significant variables affecting tannase production were found to be pH (23.62 %), tannin extract (20.70 %), temperature (20.33 %) and incubation time (14.99 %). These factors were further optimized with a central composite design using response surface methodology. Maximum tannase production was observed at pH 5.52, 39.72 °C, 91.82 h of incubation and 2.17 % tannin content. The enzyme activity was enhanced 1.26-fold under these optimized conditions. The present study emphasizes the use of agro-residues as potential substrates with the aim of lowering the input costs of tannase production so that the enzyme can be used efficiently for commercial purposes.
Water Quality and Quantity Modeling for Hydrologic and Policy Decision Making
NASA Astrophysics Data System (ADS)
Rubiano, J.; Giron, E.; Quintero, M.; O'Brien, R.
2004-12-01
This paper presents the results of a research project that elucidates the excesses of nitrogen and phosphorus using a spatial-temporal modeling approach. The project integrates biophysical and socio-economic knowledge to offer sound solutions to multiple stakeholders within a watershed context. The aim is to promote rural development and resolve environmental conflicts by focusing on the internalization of externalities derived from watershed management, triggering the transfer of funding from urban to rural populations and making the city invest in environmental goods or services offered by rural environments. The integrated modeling is focused on identifying causal relationships between land use and management on the one hand, and water quantity/quality and sedimentation downstream on the other. Estimation of the amount of contaminated sediment transported in the study area and its impact is also studied. The soil runoff information within the study area is obtained by considering the characteristics of erosion using the MUSLE model as a sub-model of the SWAT model. Using regression analysis, mathematical relationships between rainfall and surface runoff, and between land use or management practices and the measured nitrate and phosphate loads, are established. The methodology first integrates most of the key spatial information available for the site to facilitate envisioning different land use scenarios and their impacts upon water resources. Subsequently, selected alternative scenarios regarding the identified externalities are analyzed using optimization models. Opportunities for and constraints to promoting co-operation among users are exposed with the aid of economic games in which more sustainable land use or management alternatives are suggested. Strategic alliances and collective action are promoted in order to implement those alternatives that are environmentally sound and economically feasible. Such options are supported by co-funding schemes designed with the private and public stakeholders having a role in the study area. The significance of this research is clearly depicted by the results of the different models applied here for the assessment of water quality parameters and for modeling upper-catchment terrain surface change in the study area. Application of the methodology is presented for the Fuquene Lake Basin in Cundinamarca, Colombia. Additional research needs and limitations of the methodology are highlighted.
NASA Technical Reports Server (NTRS)
Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.
1992-01-01
An improved methodology for quantitatively evaluating the failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of the statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.
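A minimal sketch of the probabilistic idea behind PFA, using a generic stress-versus-strength limit state rather than any specific NASA failure model: parameter and model-accuracy uncertainties are sampled by Monte Carlo to estimate a failure probability for one failure mode. All distributions below are assumptions for illustration.

```python
# Hedged sketch: propagate parameter and modeling-accuracy uncertainty through
# a simple limit-state model to estimate a failure probability.
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
strength = rng.lognormal(mean=np.log(900.0), sigma=0.08, size=n)   # MPa
stress   = rng.lognormal(mean=np.log(600.0), sigma=0.15, size=n)   # MPa
model_error = rng.normal(1.0, 0.05, size=n)    # analysis-accuracy factor

margin = strength - model_error * stress
p_fail = (margin <= 0).mean()
print(p_fail)        # prior point estimate; test/flight data would update it
```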
Measures of outdoor play and independent mobility in children and youth: A methodological review.
Bates, Bree; Stone, Michelle R
2015-09-01
Declines in children's outdoor play have been documented globally, which are partly due to heightened restrictions around children's independent mobility. Literature on outdoor play and children's independent mobility is increasing, yet no paper has summarized the various methodological approaches used. A methodological review could highlight most commonly used measures and comprehensive research designs that could result in more standardized methodological approaches. Methodological review. A standardized protocol guided a methodological review of published research on measures of outdoor play and children's independent mobility in children and youth (0-18 years). Online searches of 8 electronic databases were conducted and studies included if they contained a subjective/objective measure of outdoor play or children's independent mobility. References of included articles were scanned to identify additional articles. Twenty-four studies were included on outdoor play, and twenty-three on children's independent mobility. Study designs were diverse. Common objective measures included accelerometry, global positioning systems and direct observation; questionnaires, surveys and interviews were common subjective measures. Focus groups, activity logs, monitoring sheets, travel/activity diaries, behavioral maps and guided tours were also utilized. Questionnaires were used most frequently, yet few studies used the same questionnaire. Five studies employed comprehensive, mixed-methods designs. Outdoor play and children's independent mobility have been measured using a wide variety of techniques, with only a few studies using similar methodologies. A standardized methodological approach does not exist. Future researchers should consider including both objective measures (accelerometry and global positioning systems) and subjective measures (questionnaires, activity logs, interviews), as more comprehensive designs will enhance understanding of each multidimensional construct. Creating a standardized methodological approach would improve study comparisons. Copyright © 2014 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Lowe, Benjamin M.; Skylaris, Chris-Kriton; Green, Nicolas G.; Shibuta, Yasushi; Sakata, Toshiya
2018-04-01
Continuum-based methods are important for calculating electrostatic properties of interfacial systems, such as the electric field and surface potential, but are incapable of providing sufficient insight into a range of fundamentally and technologically important phenomena which occur at atomistic length-scales. In this work a molecular dynamics methodology is presented for interfacial electric field and potential calculations. The silica–water interface was chosen as an example system, which is highly relevant for understanding the response of field-effect transistor sensors (FET sensors). Detailed validation work is presented, followed by the simulated surface charge/surface potential relationship. This showed good agreement with experiment at low surface charge density, but at high surface charge density the results highlighted challenges presented by an atomistic definition of the surface potential. This methodology will be used to investigate the effect of surface morphology and biomolecule addition, both factors which are challenging to treat using conventional continuum models.
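One common post-processing step in such work is to turn a laterally averaged charge-density profile from the trajectory into field and potential profiles by integrating Poisson's equation twice in 1D; the sketch below does this with a synthetic Gaussian charge profile (the profile, permittivity and geometry are assumptions, not the paper's data).

```python
# Minimal sketch: rho(z) -> E(z) -> phi(z) by double integration of
# Poisson's equation in 1D with a uniform relative permittivity.
import numpy as np

eps0 = 8.8541878128e-12            # F/m
eps_r = 1.0                        # explicit-solvent MD: charges are explicit
z = np.linspace(0.0, 5e-9, 1000)                       # m
rho = -0.1e9 * np.exp(-((z - 1.0e-9) / 2e-10) ** 2) \
      + 0.1e9 * np.exp(-((z - 1.5e-9) / 2e-10) ** 2)   # C/m^3, synthetic

def cumtrapz(y, x):
    """Cumulative trapezoidal integral, same length as y, starting at 0."""
    out = np.zeros_like(y)
    out[1:] = np.cumsum(0.5 * (y[1:] + y[:-1]) * np.diff(x))
    return out

E = cumtrapz(rho, z) / (eps0 * eps_r)    # V/m, with E(0) = 0
phi = -cumtrapz(E, z)                    # V,   with phi(0) = 0
print(phi[-1])                           # potential drop across the slab
```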
Optimum surface roughness prediction for titanium alloy by adopting response surface methodology
NASA Astrophysics Data System (ADS)
Yang, Aimin; Han, Yang; Pan, Yuhang; Xing, Hongwei; Li, Jinze
Titanium alloy has been widely applied in industrial engineering products due to its advantages of great corrosion resistance and high specific strength. This paper investigates the processing parameters for finish turning of titanium alloy TC11. First, a three-factor central composite design of experiments, considering the cutting speed, feed rate and depth of cut, is conducted on titanium alloy TC11 and the corresponding surface roughness values are obtained. A mathematical model is then constructed by response surface methodology to fit the relationship between the process parameters and the surface roughness, and its prediction accuracy is verified by one-way ANOVA. Finally, contour lines of the surface roughness under different combinations of process parameters are obtained and used to predict the optimum surface roughness. Verification experiments demonstrate that the material removal rate (MRR) at the obtained optimum can be significantly improved without sacrificing surface roughness.
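A hedged sketch of the RSM step with synthetic placeholder data (not the paper's measurements): a full quadratic model in cutting speed, feed rate and depth of cut is fitted by least squares, and the fitted surface is searched for the minimum roughness.

```python
# Sketch: fit Ra = b0 + linear + interaction + quadratic terms to
# central-composite-style data, then grid-search the fitted surface.
import numpy as np
from itertools import product

# columns: cutting speed v (m/min), feed f (mm/rev), depth d (mm), Ra (um)
data = np.array([
    [60, 0.10, 0.5, 0.82], [60, 0.20, 0.5, 1.45], [60, 0.10, 1.5, 0.95],
    [60, 0.20, 1.5, 1.66], [120, 0.10, 0.5, 0.61], [120, 0.20, 0.5, 1.21],
    [120, 0.10, 1.5, 0.74], [120, 0.20, 1.5, 1.38], [90, 0.15, 1.0, 0.98],
    [90, 0.15, 1.0, 1.02], [40, 0.15, 1.0, 1.12], [140, 0.15, 1.0, 0.80],
    [90, 0.07, 1.0, 0.70], [90, 0.23, 1.0, 1.71], [90, 0.15, 0.2, 0.85],
    [90, 0.15, 1.8, 1.10],
])                                   # synthetic placeholder observations
X, y = data[:, :3], data[:, 3]

def design(X):
    v, f, d = X.T
    return np.column_stack([np.ones(len(X)), v, f, d, v*f, v*d, f*d,
                            v**2, f**2, d**2])

beta, *_ = np.linalg.lstsq(design(X), y, rcond=None)

# grid search of the fitted surface for the minimum predicted Ra
grid = np.array(list(product(np.linspace(40, 140, 41),
                             np.linspace(0.07, 0.23, 41),
                             np.linspace(0.2, 1.8, 41))))
ra_hat = design(grid) @ beta
print(grid[np.argmin(ra_hat)], ra_hat.min())
```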
ERIC Educational Resources Information Center
Zhao, Yue; Huen, Jenny M. Y.; Chan, Y. W.
2017-01-01
This study pioneers a Rasch scoring approach and compares it to a conventional summative approach for measuring longitudinal gains in student learning. In this methodological note, our proposed methodology is demonstrated using an example of rating scales in a student survey as part of a higher education outcome assessment. Such assessments have…
NASA Technical Reports Server (NTRS)
Tamma, Kumar K.; Railkar, Sudhir B.
1988-01-01
This paper describes new and recent advances in the development of a hybrid transfinite element computational methodology for applicability to conduction/convection/radiation heat transfer problems. The transfinite element methodology, while retaining the modeling versatility of contemporary finite element formulations, is based on application of transform techniques in conjunction with classical Galerkin schemes and is a hybrid approach. The purpose of this paper is to provide a viable hybrid computational methodology for applicability to general transient thermal analysis. Highlights and features of the methodology are described and developed via generalized formulations and applications to several test problems. The proposed transfinite element methodology successfully provides a viable computational approach and numerical test problems validate the proposed developments for conduction/convection/radiation thermal analysis.
Haines, Seth S.
2015-07-13
The quantities of water and hydraulic fracturing proppant required for producing petroleum (oil, gas, and natural gas liquids) from continuous accumulations, and the quantities of water extracted during petroleum production, can be quantitatively assessed using a probabilistic approach. The water and proppant assessment methodology builds on the U.S. Geological Survey methodology for quantitative assessment of undiscovered technically recoverable petroleum resources in continuous accumulations. The U.S. Geological Survey assessment methodology for continuous petroleum accumulations includes fundamental concepts such as geologically defined assessment units, and probabilistic input values including well-drainage area, sweet- and non-sweet-spot areas, and success ratio within the untested area of each assessment unit. In addition to petroleum-related information, required inputs for the water and proppant assessment methodology include probabilistic estimates of per-well water usage for drilling, cementing, and hydraulic-fracture stimulation; the ratio of proppant to water for hydraulic fracturing; the percentage of hydraulic fracturing water that returns to the surface as flowback; and the ratio of produced water to petroleum over the productive life of each well. Water and proppant assessments combine information from recent or current petroleum assessments with water- and proppant-related input values for the assessment unit being studied, using Monte Carlo simulation, to yield probabilistic estimates of the volume of water for drilling, cementing, and hydraulic fracture stimulation; the quantity of proppant for hydraulic fracture stimulation; and the volumes of water produced as flowback shortly after well completion, and produced over the life of the well.
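The probabilistic combination step can be sketched as follows, with hypothetical triangular and uniform input distributions standing in for the assessment inputs: per-well water use and the number of untested wells are sampled and multiplied, Monte Carlo style, to give percentile estimates.

```python
# Hedged sketch of a Monte Carlo water-volume assessment; all distributions
# and values are hypothetical, not USGS inputs.
import numpy as np

rng = np.random.default_rng(7)
n = 100_000
wells = rng.triangular(800, 1500, 2600, size=n)             # undrilled wells
water_per_well = rng.triangular(8e3, 15e3, 30e3, size=n)    # m3 per well
flowback_frac = rng.uniform(0.10, 0.40, size=n)             # fraction returned

total_water = wells * water_per_well                        # m3
flowback = total_water * flowback_frac
print(np.percentile(total_water, [5, 50, 95]).round(-5))    # P5, P50, P95
print(np.percentile(flowback, [5, 50, 95]).round(-5))
```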
NASA Astrophysics Data System (ADS)
Bayo, A.; Rodrigo, C.; Barrado, D.; Allard, F.
One of the very first steps astronomers working in stellar physics perform to advance in their studies is to determine the most common/relevant physical parameters of the objects of study (effective temperature, bolometric luminosity, surface gravity, etc.). Different methodologies exist depending on the nature of the data, the intrinsic properties of the objects, etc. One common approach is to compare the observational data with theoretical models passed through some simulator that leaves in the synthetic data the same imprint that the observational data carries, and to see which set of parameters reproduces the observations best. Even in this case, the methodology changes slightly depending on the kind of data the astronomer has. After parameters are published, the community tends to quote, praise and criticize them, sometimes paying little attention to whether possible discrepancies come from the theoretical models, the data themselves or just the methodology used in the analysis. In this work we perform the simple, yet interesting, exercise of comparing the effective temperatures obtained via SED fitting and via more detailed spectral fitting (to the same grid of models) for a sample of well known and characterized young M-type objects belonging to different star-forming regions, and we show that differences in temperature of up to 350 K can be expected just from the difference in methodology/data used. On the other hand, we show that these differences are smaller for colder objects, even when the complexity of the fit increases, for example when introducing differential extinction. To perform this exercise we benefit greatly from the framework offered by the Virtual Observatory.
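The comparison exercise can be mimicked with a toy chi-square grid fit: the same noisy fluxes are fitted once using all broadband points and once using only a short-wavelength subset, and the two best-fit temperatures are compared. The Planck-like model grid and all numbers below are synthetic stand-ins, not BT-Settl models or real photometry.

```python
# Toy illustration: best-fit Teff from a chi-square grid search, using either
# the full "SED" or only a short-wavelength "spectral" subset.
import numpy as np

rng = np.random.default_rng(3)
wavelengths = np.linspace(0.5, 5.0, 20)              # microns
teff_grid = np.arange(2600, 4001, 50)                # K

def model_flux(teff):
    # toy Planck-like shape standing in for a synthetic spectrum
    x = 14387.77 / (wavelengths * teff)              # hc/(lambda k T), um*K
    return 1.0 / (wavelengths**5 * np.expm1(x))

true_teff = 3200.0
obs = model_flux(true_teff) * rng.normal(1.0, 0.08, wavelengths.size)
err = 0.08 * obs

def best_teff(mask):
    chi2 = []
    for t in teff_grid:
        m = model_flux(t)
        scale = np.sum(obs[mask] * m[mask] / err[mask]**2) / \
                np.sum(m[mask]**2 / err[mask]**2)    # analytic scale factor
        chi2.append(np.sum(((obs[mask] - scale * m[mask]) / err[mask])**2))
    return teff_grid[int(np.argmin(chi2))]

sed_mask = np.ones(wavelengths.size, dtype=bool)      # full SED
spec_mask = wavelengths < 1.0                         # optical subset only
print(best_teff(sed_mask), best_teff(spec_mask))      # may differ noticeably
```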
NASA Astrophysics Data System (ADS)
Bowe, Brian W.; Daly, Siobhan; Flynn, Cathal; Howard, Robert
2003-03-01
In this paper a model for the implementation of a problem-based learning (PBL) course for a typical first-year physics programme is described. Reference is made to how PBL has been implemented in relation to geometrical and physical optics. PBL derives from the theory that learning is an active process in which the learner constructs new knowledge on the basis of current knowledge, unlike traditional teaching practices in higher education, where the emphasis is on the transmission of factual knowledge. The course consists of a set of optics-related, real-life problems that are carefully constructed to meet specified learning outcomes. The students, working in groups, encounter these problem-solving situations and are facilitated to produce a solution. The PBL course promotes student engagement in order to achieve higher levels of cognitive learning. Evaluation of the course indicates that the students adopt a deep learning approach and that they attain a thorough understanding of the subject instead of the superficial understanding associated with surface learning. The methodology also helps students to develop metacognitive skills. Another outcome of this teaching methodology is the development of key skills such as the ability to work in a group and to communicate and present information effectively.
Surface immobilized antibody orientation determined using ToF-SIMS and multivariate analysis.
Welch, Nicholas G; Madiona, Robert M T; Payten, Thomas B; Easton, Christopher D; Pontes-Braz, Luisa; Brack, Narelle; Scoble, Judith A; Muir, Benjamin W; Pigram, Paul J
2017-06-01
Antibody orientation at solid-phase interfaces plays a critical role in the sensitive detection of biomolecules during immunoassays. Correctly oriented antibodies with solution-facing antigen binding regions have improved antigen capture compared with their randomly oriented counterparts. Direct characterization of oriented proteins with surface analysis methods still remains a challenge; however, surface-sensitive techniques such as Time-of-Flight Secondary Ion Mass Spectrometry (ToF-SIMS) provide information-rich data that can be used to probe antibody orientation. Diethylene glycol dimethyl ether plasma polymers (DGpp) functionalized with chromium (DGpp+Cr) have improved immunoassay performance that is indicative of preferential antibody orientation. Herein, ToF-SIMS data from proteolytic fragments of anti-EGFR antibody bound to DGpp and DGpp+Cr are used to construct artificial neural network (ANN) and principal component analysis (PCA) models indicative of correctly oriented systems. Whole antibody samples (IgG) tested against each of the models indicated preferential antibody orientation on DGpp+Cr. Cross-referencing the ANN and PCA models yielded 20 mass fragments associated with the F(ab')2 region, representing correct orientation, and 23 mass fragments associated with the Fc region, representing incorrect orientation. Mass fragments were then compared to amino acid fragments and the amino acid composition of the F(ab')2 and Fc regions. The ratio of the sum of the ToF-SIMS ion intensities from the F(ab')2 fragments to the Fc fragments showed a 50% increase in intensity for IgG on DGpp+Cr compared with DGpp. The systematic data analysis methodology employed herein offers a new approach for the investigation of antibody orientation applicable to a range of substrates. Controlled orientation of antibodies at solid phases is critical for maximizing antigen detection in biosensors and immunoassays. Surface-sensitive techniques (such as ToF-SIMS), capable of direct characterization of surface-immobilized and oriented antibodies, are under-utilized in current practice. Selection of a small number of mass fragments for analysis, typically pertaining to amino acids, is commonplace in the literature, leaving the majority of the information-rich spectra unanalyzed. The novelty of this work is the utilization of a comprehensive, unbiased mass fragment list and the employment of principal component analysis (PCA) and artificial neural network (ANN) models in a unique methodology to prove antibody orientation. This methodology is of significant and broad interest to the scientific community as it is applicable to a range of substrates and allows for direct, label-free characterization of surface-bound proteins. Copyright © 2017 Acta Materialia Inc. All rights reserved.
Lv, Shao-Wa; Liu, Dong; Hu, Pan-Pan; Ye, Xu-Yan; Xiao, Hong-Bin; Kuang, Hai-Xue
2010-03-01
To optimize the process of extracting effective constituents from Aralia elata by response surface methodology. The independent variables were ethanol concentration, reflux time and solvent volume (fold), and the dependent variable was the extraction rate of total saponins from Aralia elata. Linear or non-linear mathematical models were used to estimate the relationship between the independent and dependent variables, and response surface methodology was used to optimize the extraction process. The model was validated by comparing observed and predicted values. The regression coefficient of the fitted quadratic model was as high as 0.9617, and the optimum extraction conditions were 70% ethanol, 2.5 hours of reflux, 20-fold solvent and three extraction cycles. The bias between observed and predicted values was -2.41%. The results show that the optimized model is highly predictive.
Rough set approach for accident chains exploration.
Wong, Jinn-Tsai; Chung, Yi-Shih
2007-05-01
This paper presents a novel non-parametric methodology--rough set theory--for accident occurrence exploration. The rough set theory allows researchers to analyze accidents in multiple dimensions and to model accident occurrence as factor chains. Factor chains are composed of driver characteristics, trip characteristics, driver behavior and environment factors that imply typical accident occurrence. A real-world database (2003 Taiwan single auto-vehicle accidents) is used as an example to demonstrate the proposed approach. The results show that although most accident patterns are unique, some accident patterns are significant and worth noting. Student drivers who are young and less experienced exhibit a relatively high possibility of being involved in off-road accidents on roads with a speed limit between 51 and 79 km/h under normal driving circumstances. Notably, for bump-into-facility accidents, wet surface is a distinctive environmental factor.
Optimal Design of Material and Process Parameters in Powder Injection Molding
NASA Astrophysics Data System (ADS)
Ayad, G.; Barriere, T.; Gelin, J. C.; Song, J.; Liu, B.
2007-04-01
The paper is concerned with optimization and parametric identification for the different stages of the Powder Injection Molding (PIM) process, which consists first of the injection of a powder mixture with a polymer binder and then of the sintering of the resulting powder part by solid-state diffusion. The first part describes an original methodology to optimize the process and geometry parameters of the injection stage based on the combination of design of experiments and adaptive response surface modeling. The second part of the paper describes the identification strategy proposed for the sintering stage, in which sintering parameters are identified from dilatometric curves and the sintering process is then optimized. The proposed approaches are applied to the optimization of material and process parameters for manufacturing a ceramic femoral implant, and it is demonstrated that they give satisfactory results.
Photonic Resins: Designing Optical Appearance via Block Copolymer Self-Assembly
2018-01-01
Although a huge variety of methodologies has been proposed to produce photonic structures by self-assembly, the lack of an effective fabrication approach has hindered their practical use. These approaches are typically limited by poor control of both optical and mechanical properties. Here we report photonic thermosetting polymeric resins obtained through brush block copolymer (BBCP) self-assembly. We demonstrate that control of the interplay between order and disorder in the obtained photonic structure offers a powerful toolbox for designing the optical appearance of the polymer resins in terms of reflected wavelength and scattering properties. The obtained materials exhibit excellent mechanical properties, with hardness up to 172 MPa and Young's modulus over 2.9 GPa, indicating great potential for practical use as photonic coatings on a variety of surfaces. PMID:29681653
An Approach for Implementation of Project Management Information Systems
NASA Astrophysics Data System (ADS)
Běrziša, Solvita; Grabis, Jānis
Project management is governed by project management methodologies, standards, and other regulatory requirements. This chapter proposes an approach for implementing and configuring project management information systems according to requirements defined by these methodologies. The approach uses a project management specification framework to describe project management methodologies in a standardized manner. This specification is used to automatically configure the project management information system by applying appropriate transformation mechanisms. Development of the standardized framework is based on an analysis of typical project management concepts and processes and of existing XML-based representations of project management. A demonstration example of a project management information system's configuration is provided.
Radiometric spectral and band rendering of targets using anisotropic BRDFs and measured backgrounds
NASA Astrophysics Data System (ADS)
Hilgers, John W.; Hoffman, Jeffrey A.; Reynolds, William R.; Jafolla, James C.
2000-07-01
Achievement of ultra-high fidelity signature modeling of targets requires a significant level of complexity for all of the components involved in the rendering process. Specifically, the reflectance of the surface must be described using the bidirectional reflectance distribution function (BRDF). In addition, the spatial representation of the background must be high fidelity. A methodology and corresponding model for spectral and band rendering of targets using both isotropic and anisotropic BRDFs is presented. In addition, a set of tools is described for generating theoretical anisotropic BRDFs and for reducing the data required to describe an anisotropic BRDF by five orders of magnitude. The methodology is a hybrid that uses a spectrally measured panorama of the background mapped to a large hemisphere; both radiosity and ray-tracing approaches are incorporated simultaneously for a robust solution. In the thermal domain the spectral emission is also included in the solution. Rendering examples using several BRDFs are presented.
Freger, Viatcheslav
2004-06-01
The paper introduces a new methodology for studying polyamide composite membranes for reverse osmosis (RO) and nanofiltration (NF) in liquid environments. The methodology is based on atomic force microscopy of the active layer, which had been separated from the support and placed on a solid substrate. The approach was employed to determine the thickness, interfacial morphology, and dimensional changes in solution (swelling) of polyamide films. The face (active) and back (facing the support) surfaces of the RO films appeared morphologically similar, in agreement with the recently proposed model of skin formation. Measured thickness and swelling data in conjunction with the intrinsic permeability of the membranes suggest that the selective barrier in RO membrane constitutes only a fraction of the polyamide skin, whereas NF membranes behave as nearly uniform films. For NF membranes, there was reasonable correlation between the changes in the swelling and in the permeability of the membrane and the salinity and pH of the feed.
NASA Astrophysics Data System (ADS)
de Vito, Rossella; Portoghese, Ivan; Pagano, Alessandro; Fratino, Umberto; Vurro, Michele
2017-12-01
Increasing pressure affects water resources, especially in the agricultural sector, with cascading impacts on energy consumption. This is particularly relevant in the Mediterranean area, which faces significant water scarcity problems that are further exacerbated by the crucial economic role of agricultural production. Assessing the sustainability of water resource use is thus essential to preserving ecosystems and maintaining high levels of agricultural productivity. This paper proposes an integrated methodology based on the Water-Energy-Food Nexus to evaluate the multi-dimensional implications of irrigation practices. Three different indices are introduced, based on an analysis of the most influential factors. The methodology is then implemented in a catchment located in Puglia (Italy) and a comparative analysis of the three indices is presented. The results mainly highlight that economic land productivity is a key driver of irrigated agriculture, and that groundwater is highly affordable compared with surface water and is thus often dangerously perceived as freely available.
Effects of Mesh Irregularities on Accuracy of Finite-Volume Discretization Schemes
NASA Technical Reports Server (NTRS)
Diskin, Boris; Thomas, James L.
2012-01-01
The effects of mesh irregularities on accuracy of unstructured node-centered finite-volume discretizations are considered. The focus is on an edge-based approach that uses unweighted least-squares gradient reconstruction with a quadratic fit. For inviscid fluxes, the discretization is nominally third order accurate on general triangular meshes. For viscous fluxes, the scheme is an average-least-squares formulation that is nominally second order accurate and contrasted with a common Green-Gauss discretization scheme. Gradient errors, truncation errors, and discretization errors are separately studied according to a previously introduced comprehensive methodology. The methodology considers three classes of grids: isotropic grids in a rectangular geometry, anisotropic grids typical of adapted grids, and anisotropic grids over a curved surface typical of advancing layer grids. The meshes within the classes range from regular to extremely irregular including meshes with random perturbation of nodes. Recommendations are made concerning the discretization schemes that are expected to be least sensitive to mesh irregularities in applications to turbulent flows in complex geometries.
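For reference, the unweighted least-squares gradient reconstruction at a node can be sketched as below (a linear fit for brevity; the quadratic fit used above simply adds second-order terms to the same least-squares system).

```python
# Minimal sketch of unweighted least-squares gradient reconstruction at a
# node from its edge-connected neighbours on an irregular 2D stencil.
import numpy as np

def ls_gradient(xc, uc, xn, un):
    """Gradient of u at node xc from neighbour positions xn and values un."""
    dX = xn - xc                     # (m, 2) coordinate differences
    du = un - uc                     # (m,)  value differences
    grad, *_ = np.linalg.lstsq(dX, du, rcond=None)   # unweighted LS fit
    return grad

# usage on an irregular stencil with u = 3x + 2y (exact gradient (3, 2))
xc = np.array([0.0, 0.0])
xn = np.array([[1.0, 0.1], [-0.4, 0.9], [0.2, -1.1], [-0.8, -0.3], [0.9, 0.8]])
u = lambda p: 3.0 * p[..., 0] + 2.0 * p[..., 1]
print(ls_gradient(xc, u(xc), xn, u(xn)))   # ≈ [3. 2.]
```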
NASA Astrophysics Data System (ADS)
de Andrea González, Ángel; González-Gutiérrez, Leo M.
2017-09-01
The Rayleigh-Taylor instability (RTI) in an infinite slab, where a constant-density lower fluid is initially separated from an upper stratified fluid, is discussed in the linear regime. The upper fluid has an exponentially increasing density, and surface tension is included at the interface between the two fluids. Stability is studied using the initial value problem (IVP) approach, which ensures the inclusion of certain continuum modes that would otherwise be neglected. This methodology accounts for the branch cut in the complex plane; consequently, in addition to discrete modes (surface RTI modes), a set of continuum modes (internal RTI modes) also appears. As a result, the usual information given by the normal mode method is now complete. Furthermore, a new role is found for surface tension: at a critical wavenumber it transforms surface RTI modes (discrete spectrum) into internal RTI modes belonging to a continuous spectrum. As a consequence, the cut-off wavenumber disappears, i.e. the growth rate of the surface RTI mode does not decay to zero at the cut-off wavenumber, as previously believed. Finally, we found that, due to the continuum, the asymptotic behavior of the perturbation in time is slower than exponential when only the continuous spectrum exists.
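For orientation, the classical normal-mode result for two uniform, semi-infinite fluids with surface tension T (not the stratified upper fluid treated in this paper) gives the growth rate n and the cut-off wavenumber k_c that the analysis revisits:

```latex
n^{2} \;=\; \frac{k\left[\,g\,(\rho_{2}-\rho_{1}) - T\,k^{2}\right]}{\rho_{1}+\rho_{2}},
\qquad
k_{c} \;=\; \sqrt{\frac{g\,(\rho_{2}-\rho_{1})}{T}},
```

where ρ₂ is the heavier upper fluid. In the classical picture the surface mode is stabilized for k > k_c, which is precisely the behavior the IVP analysis calls into question.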
A methodology for modeling surface effects on stiff and soft solids
NASA Astrophysics Data System (ADS)
He, Jin; Park, Harold S.
2017-09-01
We present a computational method that can be applied to capture surface stress and surface tension-driven effects in both stiff, crystalline nanostructures, like size-dependent mechanical properties, and soft solids, like elastocapillary effects. We show that the method is equivalent to the classical Young-Laplace model. The method is based on converting surface tension and surface elasticity on a zero-thickness surface to an initial stress and corresponding elastic properties on a finite thickness shell, where the consideration of geometric nonlinearity enables capturing the out-of-plane component of the surface tension that results for curved surfaces through evaluation of the surface stress in the deformed configuration. In doing so, we are able to use commercially available finite element technology, and thus do not require consideration and implementation of the classical Young-Laplace equation. Several examples are presented to demonstrate the capability of the methodology for modeling surface stress in both soft solids and crystalline nanostructures.
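As a point of reference, the Young-Laplace relation the method recovers, together with one natural reading of the zero-thickness-to-shell conversion (the specific scalings below are an assumption of this summary, not quoted from the paper), is

```latex
\Delta p \;=\; \gamma\,(\kappa_{1}+\kappa_{2}), \qquad
\sigma_{0} \;\approx\; \frac{\gamma}{t}, \qquad
E_{\mathrm{shell}} \;\approx\; \frac{S}{t},
```

so that the in-plane resultants of a shell of small thickness t, namely σ₀ t and E_shell t, recover the surface tension γ and a surface elastic modulus S of the zero-thickness surface, while Δp is the pressure jump across a surface with principal curvatures κ₁ and κ₂.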
Estimation of Land Surface Fluxes and Their Uncertainty via Variational Data Assimilation Approach
NASA Astrophysics Data System (ADS)
Abdolghafoorian, A.; Farhadi, L.
2016-12-01
Accurate estimation of land surface heat and moisture fluxes, as well as root zone soil moisture, is crucial in various hydrological, meteorological, and agricultural applications. "In situ" measurements of these fluxes are costly and cannot be readily scaled to the large areas relevant to weather and climate studies. Therefore, there is a need for techniques that make quantitative estimates of heat and moisture fluxes using land surface state variables. In this work, we applied a novel approach based on the variational data assimilation (VDA) methodology to estimate land surface fluxes and the soil moisture profile from land surface states. This study accounts for the strong linkage between the terrestrial water and energy cycles by coupling the dual-source energy balance equation with the water balance equation through the mass flux of evapotranspiration (ET). Heat diffusion and moisture diffusion into the soil column are adjoined to the cost function as constraints. This coupling results in more accurate prediction of land surface heat and moisture fluxes, and consequently of soil moisture at multiple depths, with the high temporal frequency required in many hydrological, environmental and agricultural applications. One of the key limitations of the VDA technique is its tendency to be ill-posed, meaning that a continuum of possibilities exists for different parameters that produce essentially identical measurement-model misfit errors. Moreover, the value of heat and moisture flux estimates to decision-making processes is limited if reasonable estimates of the corresponding uncertainty are not provided. To address these issues, an uncertainty analysis will be performed in this research to estimate the uncertainty of the retrieved fluxes and root zone soil moisture. The assimilation algorithm is tested with a series of experiments using a synthetic data set generated by the simultaneous heat and water (SHAW) model. We demonstrate the VDA performance by comparing the (synthetic) true measurements (including profiles of soil moisture and temperature, land surface water and heat fluxes, and root water uptake) with the VDA estimates. In addition, the feasibility of extending the proposed approach to remote sensing observations is tested by limiting the number of LST and soil moisture observations.
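Schematically, and not as the authors' exact formulation, a VDA cost function of this type penalizes state-observation misfit and departures from background parameters while adjoining the heat and moisture equations through adjoint (Lagrange multiplier) variables:

```latex
J \;=\; \sum_{i}\left\| y_{i}-\mathcal{H}(x_{i})\right\|^{2}_{R^{-1}}
\;+\; \left\| p-p_{b}\right\|^{2}_{B^{-1}}
\;+\; \int_{0}^{\tau}\!\!\int_{z}\lambda_{T}\!\left(\partial_{t}T-\mathcal{L}_{T}[T,\theta;p]\right)\,dz\,dt
\;+\; \int_{0}^{\tau}\!\!\int_{z}\lambda_{\theta}\!\left(\partial_{t}\theta-\mathcal{L}_{\theta}[\theta;p]\right)\,dz\,dt,
```

where y_i are the assimilated states (e.g., LST and soil moisture), H the observation operator, p the flux-related parameters with background p_b, T and θ the soil temperature and moisture profiles, and L_T, L_θ the heat and moisture diffusion operators enforced through the adjoint variables λ_T and λ_θ.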
Evaluation Model for Pavement Surface Distress on 3D Point Clouds from Mobile Mapping System
NASA Astrophysics Data System (ADS)
Aoki, K.; Yamamoto, K.; Shimamura, H.
2012-07-01
This paper proposes a methodology for evaluating pavement surface distress for road pavement maintenance planning using 3D point clouds from a Mobile Mapping System (MMS). Maintenance planning of road pavement requires scheduled rehabilitation of damaged pavement sections to maintain a high level of service, and the importance of such performance-based infrastructure asset management grounded in actual inspection data is recognized globally. For inspection of the road pavement surface, semi-automatic measurement systems using inspection vehicles to measure surface deterioration indexes, such as cracking, rutting and IRI, have already been introduced and are capable of continuously archiving pavement performance data. However, scheduled inspection with automatic measurement vehicles is costly, depending on the instruments' specifications and the inspection interval, which makes implementation of road maintenance work difficult, especially for local governments, from a cost-effectiveness standpoint. Against this background, this research proposes methodologies for a simplified evaluation of the pavement surface and assessment of damaged pavement sections using 3D point cloud data collected to build urban 3D models. The simplified evaluation results of the road surface provide useful information for road administrators to identify pavement sections requiring detailed examination or immediate repair work. In particular, the regularity of the 3D point cloud arrangement was evaluated using Chow-test and F-test models to extract sections where a remarkable structural change in the coordinate values occurred. Finally, the validity of the proposed methodology was investigated through a case study using actual inspection data from local roads.
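The structural-change step can be pictured with a textbook Chow test on a road profile: fit the same linear model to two candidate segments and to the pooled data, then compare residual sums of squares. This is a generic sketch with hypothetical variable names and a toy profile, not the authors' implementation on MMS point clouds.

```python
import numpy as np

def chow_test(y1, X1, y2, X2):
    """Chow test for a structural break between two segments of a profile.

    A linear model is fitted to each segment and to the pooled data; the F
    statistic compares the pooled residual sum of squares with the sum over
    the two segments (k = number of regression parameters)."""
    def ssr(y, X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        r = y - X @ beta
        return float(r @ r)

    k = X1.shape[1]
    ssr_pooled = ssr(np.concatenate([y1, y2]), np.vstack([X1, X2]))
    ssr_split = ssr(y1, X1) + ssr(y2, X2)
    n = len(y1) + len(y2)
    return ((ssr_pooled - ssr_split) / k) / (ssr_split / (n - 2 * k))

# toy example: elevations along a road profile with a break in slope at x = 50
x = np.arange(100, dtype=float)
z = np.where(x < 50, 0.01 * x, 0.5 + 0.03 * (x - 50)) + 0.01 * np.random.randn(100)
X = np.column_stack([np.ones_like(x), x])
print(chow_test(z[:50], X[:50], z[50:], X[50:]))
```

Large F values flag sections where the regression coefficients change, i.e. candidate locations of surface distress.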
PRO_LIGAND: An approach to de novo molecular design. 4. Application to the design of peptides
NASA Astrophysics Data System (ADS)
Frenkel, David; Clark, David E.; Li, Jin; Murray, Christopher W.; Robson, Barry; Waszkowycz, Bohdan; Westhead, David R.
1995-06-01
In some instances, peptides can play an important role in the discovery of lead compounds. This paper describes the peptide design facility of the de novo drug design package PRO_LIGAND. The package provides a unified framework for the design of peptides that are similar or complementary to a specified target. The approach uses single amino acid residues, selected from preconstructed libraries of different residues and conformations, and places them on top of predefined target interaction sites. This approach is a well-tested methodology for the design of organics but has not previously been used for peptides. Peptides are challenging because of their great conformational flexibility, and a study of the advantages and disadvantages of this simple approach is an important step in the development of design tools. After a description of our general approach, a more detailed discussion of its adaptation to peptides is given. The method is then applied to the design of peptide-based inhibitors of HIV-1 protease and the design of structural mimics of the surface region of lysozyme. The results are encouraging and point the way towards further development of interaction site-based approaches for peptide design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maclaurin, Galen; Sengupta, Manajit; Xie, Yu
A significant source of bias in the transposition of global horizontal irradiance to plane-of-array (POA) irradiance arises from inaccurate estimations of surface albedo. The current physics-based model used to produce the National Solar Radiation Database (NSRDB) relies on model estimations of surface albedo from a reanalysis climatology produced at relatively coarse spatial resolution compared to that of the NSRDB. As an input to spectral decomposition and transposition models, more accurate surface albedo data from remotely sensed imagery at finer spatial resolutions would improve accuracy in the final product. The National Renewable Energy Laboratory (NREL) developed an improved white-sky (bi-hemispherical reflectance) broadband (0.3-5.0 μm) surface albedo data set for processing the NSRDB from two existing data sets: a gap-filled albedo product and a daily snow cover product. The Moderate Resolution Imaging Spectroradiometer (MODIS) sensors onboard the Terra and Aqua satellites have provided high-quality measurements of surface albedo at 30 arc-second spatial resolution and 8-day temporal resolution since 2001. The high spatial and temporal resolutions and the temporal coverage of the MODIS sensor will allow for improved modeling of POA irradiance in the NSRDB. However, cloud and snow cover interfere with MODIS observations of ground surface albedo, and thus they require post-processing. The MODIS production team applied a gap-filling methodology to interpolate observations obscured by clouds or ephemeral snow. This approach filled pixels with ephemeral snow cover because the 8-day temporal resolution is too coarse to accurately capture the variability of snow cover and its impact on albedo estimates. However, for this project, accurate representation of daily snow cover change is important in producing the NSRDB. Therefore, NREL also used the Integrated Multisensor Snow and Ice Mapping System data set, which provides daily snow cover observations of the Northern Hemisphere for the temporal extent of the NSRDB (1998-2015). We provide a review of validation studies conducted on these two products and describe the methodology developed by NREL to remap the data products to the NSRDB grid and integrate them into a seamless daily data set.
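The reason albedo bias propagates directly into POA irradiance can be seen from the standard isotropic ground-reflection term used in many transposition models (shown here for orientation; the NSRDB's own spectral transposition model is more elaborate):

```latex
E_{\mathrm{POA,ground}} \;=\; \mathrm{GHI}\cdot \rho \cdot \frac{1-\cos\beta}{2},
```

where ρ is the broadband surface albedo and β the array tilt, so any relative error in ρ maps one-to-one into the ground-reflected POA component.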
Documentation and Detection of Colour Changes of Bas Relieves Using Close Range Photogrammetry
NASA Astrophysics Data System (ADS)
Malinverni, E. S.; Pierdicca, R.; Sturari, M.; Colosi, F.; Orazi, R.
2017-05-01
The digitization of complex buildings, findings or bas-reliefs can strongly facilitate the work of archaeologists, mainly for in-depth analysis tasks. However, while new visualization techniques ease the study phase, a classical naked-eye approach to determining changes or surface alteration can introduce several drawbacks. The research work described in these pages aims to provide experts with a workflow for the evaluation of alterations (e.g. color decay or surface alterations), allowing more rapid and objective monitoring of monuments. More specifically, a pipeline of work has been tested to evaluate the color variation between surfaces acquired at different epochs. The introduction of reliable change detection tools in the archaeological domain is needed; in fact, the most widespread practice among archaeologists and practitioners is traditional monitoring of surfaces consisting of three main steps: production of a hand-made map based on a subjective analysis, selection of a subset of regions of interest, and removal of small portions of the surface for in-depth analysis in the laboratory. To overcome this risky and time-consuming process, an automatic digital change detection procedure represents a turning point. To do so, automatic classification has been carried out according to two approaches: a pixel-based and an object-based method. Pixel-based classification identifies the classes by means of the spectral information provided by each pixel of the original bands. The object-based approach operates on sets of pixels (objects/regions) grouped together by means of an image segmentation technique. The methodology was tested on the bas-reliefs of a temple located in Peru, named Huaca de la Luna. Although the data sources were collected in unplanned surveys, the workflow proved to be a valuable solution for understanding the main changes over time.
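As a minimal illustration of the pixel-based route (not the classifiers actually used in the study), a per-pixel colour-distance threshold between two co-registered acquisitions already yields a binary change mask; all names and values below are hypothetical.

```python
import numpy as np

def pixel_change_map(img_a, img_b, threshold=30.0):
    """Naive pixel-based change detection between two co-registered RGB rasters:
    per-pixel Euclidean colour distance, thresholded into a binary change mask."""
    diff = img_a.astype(float) - img_b.astype(float)
    dist = np.sqrt((diff ** 2).sum(axis=-1))      # colour distance per pixel
    return dist > threshold                       # True where a change is flagged

# hypothetical example: two orthophotos of the same bas-relief surface
a = np.random.randint(0, 256, (512, 512, 3))
b = a.copy()
b[100:150, 200:260] = np.clip(b[100:150, 200:260] + 60, 0, 255)  # simulated colour decay
mask = pixel_change_map(a, b)
print(int(mask.sum()), "pixels flagged as changed")
```

The object-based variant would first segment the image into regions and then label whole regions as changed or unchanged, which reduces the salt-and-pepper noise typical of per-pixel decisions.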
[Scientific and methodologic approaches to evaluating medical management for workers of Kazakhstan].
2012-01-01
The article covers topical problems of workers' health preservation. The results of comprehensive research enabled the evaluation and analysis of occupational risks in leading industries of Kazakhstan, with the aim of improving scientific and methodologic approaches to medical management for workers exposed to hazardous conditions.
ERIC Educational Resources Information Center
Nordstrum, Lee E.; LeMahieu, Paul G.; Dodd, Karen
2017-01-01
Purpose: This paper is one of seven in this volume elaborating different approaches to quality improvement in education. This paper aims to delineate a methodology called Deliverology. Design/methodology/approach: The paper presents the origins, theoretical foundations, core principles and a case study showing an application of Deliverology in the…
2011-09-01
...a quality evaluation with limited data, a model-based assessment must be... that affect system performance, a multistage approach to system validation, a modeling and experimental methodology for efficiently addressing a wide range...
The "Push-Pull" Approach to Fast-Track Management Development: A Case Study in Scientific Publishing
ERIC Educational Resources Information Center
Fojt, Martin; Parkinson, Stephen; Peters, John; Sandelands, Eric
2008-01-01
Purpose: The purpose of this paper is to explore how a medium sized business has addressed what it has termed a "push-pull" method of management and organization development, based around an action learning approach. Design/methodology/approach: The paper sets out a methodology that other SMEs might look to replicate in their management and…
NASA Astrophysics Data System (ADS)
Huang, Zhongjie; Siozos-Rousoulis, Leonidas; De Troyer, Tim; Ghorbaniasl, Ghader
2018-02-01
This paper presents a time-domain method for noise prediction of supersonic rotating sources in a moving medium. The proposed approach can be interpreted as an extended time-domain solution of the convected permeable Ffowcs Williams and Hawkings equation that avoids the Doppler singularity. The solution requires special treatment for construction of the emission surface. The derived formula can explicitly and efficiently account for the effects of a subsonic uniform constant flow on the radiated noise. Implementation of the methodology is demonstrated through the Isom thickness noise case and high-speed impulsive noise prediction for helicopter rotors.
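The singularity being avoided is the Doppler factor that appears in conventional retarded-time formulations, whose integrand terms scale as

```latex
p'(\mathbf{x},t)\;\sim\;\left[\frac{Q}{4\pi\, r\,\lvert 1-M_{r}\rvert^{\,k}}\right]_{\mathrm{ret}},
\qquad k\ge 1,
```

so the kernels blow up as the source Mach number in the radiation direction, M_r, approaches unity; constructing the emission (Σ) surface removes this factor for supersonically rotating sources. The schematic form above is for orientation only and is not the paper's derived formula.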
Cross calibration of the Landsat-7 ETM+ and EO-1 ALI sensor
Chander, G.; Meyer, D.J.; Helder, D.L.
2004-01-01
As part of the Earth Observing-1 (EO-1) Mission, the Advanced Land Imager (ALI) demonstrates a potential technological direction for Landsat Data Continuity Missions. To evaluate ALI's capabilities in this role, a cross-calibration methodology has been developed using image pairs from the Landsat-7 (L7) Enhanced Thematic Mapper Plus (ETM+) and EO-1 ALI to verify the radiometric calibration of ALI with respect to the well-calibrated L7 ETM+ sensor. Results have been obtained using two different approaches. The first approach involves calibration of nearly simultaneous surface observations based on image statistics from areas observed simultaneously by the two sensors. The second approach uses vicarious calibration techniques to compare the predicted top-of-atmosphere radiance derived from ground reference data collected during the overpass to the measured radiance obtained from the sensor. The results indicate that the relative sensor chip assembly gains agree with the ETM+ visible and near-infrared bands to within 2% and with the shortwave infrared bands to within 4%.
Corron, Louise; Marchal, François; Condemi, Silvana; Chaumoître, Kathia; Adalian, Pascal
2017-01-01
Juvenile age estimation methods used in forensic anthropology generally lack methodological consistency and/or statistical validity. Considering this, a standard approach using nonparametric Multivariate Adaptive Regression Splines (MARS) models was tested to predict age from iliac biometric variables of male and female juveniles from Marseilles, France, aged 0-12 years. Models using unidimensional (length and width) and bidimensional iliac data (module and surface) were constructed on a training sample of 176 individuals and validated on an independent test sample of 68 individuals. Results show that MARS prediction models using iliac width, module and area give overall better and statistically valid age estimates. These models integrate punctual nonlinearities of the relationship between age and osteometric variables. By constructing valid prediction intervals whose size increases with age, MARS models take into account the normal increase of individual variability. MARS models can qualify as a practical and standardized approach for juvenile age estimation. © 2016 American Academy of Forensic Sciences.
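For context, a MARS model expresses age as an additive combination of hinge-function basis terms, which is what lets it capture the "punctual nonlinearities" mentioned above:

```latex
\hat{f}(\mathbf{x}) \;=\; \beta_{0} \;+\; \sum_{m=1}^{M}\beta_{m}\,B_{m}(\mathbf{x}),
\qquad
B_{m}(\mathbf{x}) \;=\; \prod_{k} \max\!\bigl(0,\; s_{k}\,(x_{v(k)}-t_{k})\bigr),
```

with signs s_k = ±1 and knots t_k selected during the forward pass and pruned by generalized cross-validation.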
Efficient numerical method of freeform lens design for arbitrary irradiance shaping
NASA Astrophysics Data System (ADS)
Wojtanowski, Jacek
2018-05-01
A computational method to design a lens with a flat entrance surface and a freeform exit surface that can transform a collimated, generally non-uniform input beam into a beam with a desired irradiance distribution of arbitrary shape is presented. The methodology is based on non-linear elliptic partial differential equations, known as Monge-Ampère PDEs. This paper describes an original numerical algorithm to solve this problem by applying the Gauss-Seidel method with simplified boundary conditions. A joint MATLAB-ZEMAX environment is used to implement and verify the method. To prove the efficiency of the proposed approach, an exemplary study in which the designed lens is faced with a challenging illumination task is shown. An analysis of solution stability, iteration-to-iteration ray mapping evolution (attached in video format), depth of focus and non-zero étendue efficiency is performed.
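To show only the sweep structure (the actual solver tackles the nonlinear Monge-Ampère equation, not the linear stand-in below), a Gauss-Seidel pass updates each interior node in place from already-updated neighbours; everything here is a hypothetical toy, including the Poisson right-hand side and fixed Dirichlet edges.

```python
import numpy as np

def gauss_seidel_poisson(f, u, h, sweeps=500):
    """Gauss-Seidel sweeps for -Laplace(u) = f on a uniform grid.

    A structural stand-in only: each sweep updates interior nodes in place
    from their already-updated neighbours; boundary values stay fixed
    (simplified boundary conditions)."""
    for _ in range(sweeps):
        for i in range(1, u.shape[0] - 1):
            for j in range(1, u.shape[1] - 1):
                u[i, j] = 0.25 * (u[i + 1, j] + u[i - 1, j]
                                  + u[i, j + 1] + u[i, j - 1]
                                  + h * h * f[i, j])
    return u

n, h = 33, 1.0 / 32
u = np.zeros((n, n))
f = np.ones((n, n))
u = gauss_seidel_poisson(f, u, h)
print(u[n // 2, n // 2])
```

In the freeform-lens problem the same in-place sweep is applied, but the local update solves the discretized nonlinear Monge-Ampère operator at each node instead of the linear five-point stencil.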
Tracer Methods for Characterizing Fracture Creation in Engineered Geothermal Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, Peter; Harris, Joel
2014-05-08
The aim of this proposal is to develop, through novel high-temperature-tracing approaches, three technologies for characterizing fracture creation within Engineered Geothermal Systems (EGS). The objective of a first task is to identify, develop and demonstrate adsorbing tracers for characterizing interwell reservoir-rock surface areas and fracture spacing. The objective of a second task is to develop and demonstrate a methodology for measuring fracture surface areas adjacent to single wells. The objective of a third task is to design, fabricate and test an instrument that makes use of tracers for measuring fluid flow between newly created fractures and wellbores. In one method of deployment, it will be used to identify qualitatively which fractures were activated during a hydraulic stimulation experiment. In a second method of deployment, it will serve to measure quantitatively the rate of fluid flowing from one or more activated fractures during a production test following a hydraulic stimulation.
Two-probe STM experiments at the atomic level.
Kolmer, Marek; Olszowski, Piotr; Zuzak, Rafal; Godlewski, Szymon; Joachim, Christian; Szymonski, Marek
2017-11-08
Direct characterization of planar atomic or molecular scale devices and circuits on a supporting surface by multi-probe measurements requires unprecedented stability of single atom contacts and manipulation of scanning probes over large, nanometer-scale areas with atomic precision. In this work, we describe the full methodology behind atomically defined two-probe scanning tunneling microscopy (STM) experiments performed on a model system: a dangling bond dimer wire supported on a hydrogenated germanium (0 0 1) surface. We show that a 70 nm long atomic wire can be simultaneously approached by two independent STM scanners with an exact probe-to-probe distance down to 30 nm. This allows direct wire characterization by two-probe I-V measurements at probe separations below 50 nm. Our technical results open a new area for multi-probe research, which can now be performed with a precision so far accessible only in single-probe scanning probe microscopy (SPM) experiments.
NASA Technical Reports Server (NTRS)
Lin, Shian-Jiann; DaSilva, Arlindo; Atlas, Robert (Technical Monitor)
2001-01-01
Toward the development of a finite-volume Data Assimilation System (fvDAS), a consistent finite-volume methodology is developed for interfacing the NASA/DAO's Physical Space Statistical Analysis System (PSAS) to the joint NASA/NCAR finite volume CCM3 (fvCCM3). To take advantage of the Lagrangian control-volume vertical coordinate of the fvCCM3, a novel "shaving" method is applied to the lowest few model layers to reflect the surface pressure changes as implied by the final analysis. Analysis increments (from PSAS) to the upper air variables are then consistently put onto the Lagrangian layers as adjustments to the volume-mean quantities during the analysis cycle. This approach is demonstrated to be superior to the conventional method of using independently computed "tendency terms" for surface pressure and upper air prognostic variables.
Wavefront modulation and subwavelength diffractive acoustics with an acoustic metasurface.
Xie, Yangbo; Wang, Wenqi; Chen, Huanyang; Konneker, Adam; Popa, Bogdan-Ioan; Cummer, Steven A
2014-11-24
Metasurfaces are a family of novel wavefront-shaping devices with planar profile and subwavelength thickness. Acoustic metasurfaces with ultralow profile yet extraordinary wave manipulating properties would be highly desirable for improving the performance of many acoustic wave-based applications. However, designing acoustic metasurfaces with similar functionality to their electromagnetic counterparts remains challenging with traditional metamaterial design approaches. Here we present a design and realization of an acoustic metasurface based on tapered labyrinthine metamaterials. The demonstrated metasurface can not only steer an acoustic beam as expected from the generalized Snell's law, but also exhibits various unique properties such as conversion from propagating wave to surface mode, extraordinary beam-steering and apparent negative refraction through higher-order diffraction. Such designer acoustic metasurfaces provide a new design methodology for acoustic signal modulation devices and may be useful for applications such as acoustic imaging, beam steering, ultrasound lens design and acoustic surface wave-based applications.
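The beam-steering behaviour referred to follows the generalized Snell's law for a phase-gradient surface, written here for transmission within a single medium:

```latex
\sin\theta_{t}-\sin\theta_{i} \;=\; \frac{1}{k_{0}}\,\frac{d\Phi}{dx},
\qquad k_{0}=\frac{\omega}{c_{0}},
```

where Φ(x) is the transmission phase imparted by the tapered labyrinthine units along the surface. When the required sin θ_t exceeds unity, the power is channelled into surface or higher-order diffracted modes, consistent with the propagating-to-surface-mode conversion and apparent negative refraction reported above.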
NASA Astrophysics Data System (ADS)
Camacho, A. G.; Fernández, J.; Cannavò, F.
2018-02-01
We present a software package to carry out inversions of surface deformation data (any combination of InSAR, GPS, and terrestrial data, e.g., EDM, levelling) as produced by 3D free-geometry extended bodies with anomalous pressure changes. The anomalous structures are described as an aggregation of elementary cells (whose effects are estimated as coming from point sources) in an elastic half space. The linear inverse problem (considering some simple regularization conditions) is solved by means of an exploratory approach. This software represents the open implementation of a previously published methodology (Camacho et al., 2011). It can be freely used with large data sets (e.g. InSAR data sets) or with data coming from small control networks (e.g. GPS monitoring data), mainly in volcanic areas, to estimate the expected pressure bodies representing magmatic intrusions. Here, the software is applied to some real test cases.
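Each elementary cell is treated as a point pressure source in an elastic half-space; a commonly used kernel of this type (Mogi-type, quoted here as a plausible stand-in rather than the package's exact expression) gives surface displacements

```latex
u_{z}(r) \;=\; \frac{(1-\nu)\,\Delta V}{\pi}\,\frac{d}{\left(r^{2}+d^{2}\right)^{3/2}},
\qquad
u_{r}(r) \;=\; \frac{(1-\nu)\,\Delta V}{\pi}\,\frac{r}{\left(r^{2}+d^{2}\right)^{3/2}},
```

for a source of volume change ΔV at depth d, with ν Poisson's ratio (for ν = 1/4 this reduces to the familiar 3ΔV/4π prefactor). Summing such kernels over the aggregated cells yields the linear forward model on which the regularized exploratory inversion operates.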
Numerical Modeling of Ablation Heat Transfer
NASA Technical Reports Server (NTRS)
Ewing, Mark E.; Laker, Travis S.; Walker, David T.
2013-01-01
A unique numerical method has been developed for solving one-dimensional ablation heat transfer problems. This paper provides a comprehensive description of the method, along with detailed derivations of the governing equations. This methodology supports solutions for traditional ablation modeling including such effects as heat transfer, material decomposition, pyrolysis gas permeation and heat exchange, and thermochemical surface erosion. The numerical scheme utilizes a control-volume approach with a variable grid to account for surface movement. This method directly supports implementation of nontraditional models such as material swelling and mechanical erosion, extending capabilities for modeling complex ablation phenomena. Verifications of the numerical implementation are provided using analytical solutions, code comparisons, and the method of manufactured solutions. These verifications are used to demonstrate solution accuracy and proper error convergence rates. A simple demonstration of a mechanical erosion (spallation) model is also provided to illustrate the unique capabilities of the method.
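A bare-bones control-volume conduction step with a heated, receding front face is sketched below to fix ideas; it omits decomposition, pyrolysis-gas transport and the variable (moving) grid of the paper, and all parameter values are hypothetical.

```python
import numpy as np

def ablation_conduction_step(T, s, dx, dt, k, rho, cp, q_in, recession_rate):
    """One explicit control-volume step of 1-D conduction with a heated front
    face, plus bookkeeping of the recessed depth s.  A crude sketch only: a
    full scheme would remap the grid as the surface recedes."""
    a = k / (rho * cp)
    Tn = T.copy()
    Tn[1:-1] = T[1:-1] + a * dt / dx**2 * (T[2:] - 2 * T[1:-1] + T[:-2])
    # surface cell: imposed aerothermal heat flux in, conduction out
    Tn[0] = T[0] + dt / (rho * cp * dx) * (q_in - k * (T[0] - T[1]) / dx)
    Tn[-1] = Tn[-2]                                  # adiabatic back face
    return Tn, s + recession_rate * dt               # cumulative surface loss

T = np.full(50, 300.0)                               # initial temperature, K
T, s = ablation_conduction_step(T, 0.0, dx=1e-3, dt=1e-3,
                                k=0.5, rho=1400.0, cp=1500.0,
                                q_in=5e5, recession_rate=1e-5)
print(T[0], s)
```

The paper's variable-grid formulation instead shrinks the control volumes themselves so that the computational surface tracks the eroding material front.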
NASA Astrophysics Data System (ADS)
Schubert, Alexander; Falvo, Cyril; Meier, Christoph
2016-08-01
We present mixed quantum-classical simulations on relaxation and dephasing of vibrationally excited carbon monoxide within a protein environment. The methodology is based on a vibrational surface hopping approach treating the vibrational states of CO quantum mechanically, while all remaining degrees of freedom are described by means of classical molecular dynamics. The CO vibrational states form the "surfaces" for the classical trajectories of protein and solvent atoms. In return, environmentally induced non-adiabatic couplings between these states cause transitions describing the vibrational relaxation from first principles. The molecular dynamics simulation yields a detailed atomistic picture of the energy relaxation pathways, taking the molecular structure and dynamics of the protein and its solvent fully into account. Using the ultrafast photolysis of CO in the hemoprotein FixL as an example, we study the relaxation of vibrationally excited CO and evaluate the role of each of the FixL residues forming the heme pocket.
Towards a Methodology for the Design of Multimedia Public Access Interfaces.
ERIC Educational Resources Information Center
Rowley, Jennifer
1998-01-01
Discussion of information systems methodologies that can contribute to interface design for public access systems covers: the systems life cycle; advantages of adopting information systems methodologies; soft systems methodologies; task-oriented approaches to user interface design; holistic design, the Star model, and prototyping; the…
Propellant Readiness Level: A Methodological Approach to Propellant Characterization
NASA Technical Reports Server (NTRS)
Bossard, John A.; Rhys, Noah O.
2010-01-01
A methodological approach to defining propellant characterization is presented. The method is based on the well-established Technology Readiness Level nomenclature. This approach establishes the Propellant Readiness Level as a metric for ascertaining the readiness of a propellant or a propellant combination by evaluating the following set of propellant characteristics: thermodynamic data, toxicity, applications, combustion data, heat transfer data, material compatibility, analytical prediction modeling, injector/chamber geometry, pressurization, ignition, combustion stability, system storability, qualification testing, and flight capability. The methodology is meant to be applicable to all propellants or propellant combinations; liquid, solid, and gaseous propellants as well as monopropellants and propellant combinations are equally served. The functionality of the proposed approach is tested through the evaluation and comparison of an example set of hydrocarbon fuels.
Aligholi, Hadi; Rezayat, Seyed Mahdi; Azari, Hassan; Ejtemaei Mehr, Shahram; Akbari, Mohammad; Modarres Mousavi, Seyed Mostafa; Attari, Fatemeh; Alipour, Fatemeh; Hassanzadeh, Gholamreza; Gorji, Ali
2016-07-01
Cultivation of neural stem/progenitor cells (NS/PCs) in PuraMatrix (PM) hydrogel is an option for stem cell transplantation. The efficacy of a novel method for placing adult rat NS/PCs in PM (injection method) was compared to encapsulation and surface plating approaches. In addition, the efficacy of the injection method for transplantation of autologous NS/PCs was studied in a rat model of brain injury. NS/PCs were obtained from the subventricular zone (SVZ) and cultivated without (control) or with scaffold (three-dimensional cultures; 3D). The effect of the different approaches on survival, proliferation, and differentiation of NS/PCs was investigated. In the in vivo study, brain injury was induced 45 days after NS/PCs were harvested from the SVZ, and phosphate buffered saline, PM, NS/PCs, or PM+NS/PCs were injected into the brain lesion. There was an increase in cell viability and proliferation after injection and surface plating of NS/PCs compared to encapsulation, and neural differentiation markers were expressed seven days after culturing the cells. Using the injection method, transplantation of NS/PCs cultured in PM resulted in significant reduction of lesion volume, improvement of neurological deficits, and enhancement of surviving cells. In addition, the transplanted cells could differentiate into neurons, astrocytes, or oligodendrocytes. Our results indicate that the injection and surface plating methods enhanced cell survival and proliferation of NS/PCs and suggest the injection method as a promising approach for transplantation of NS/PCs in brain injury. Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Smits, Kathleen M.; Ngo, Viet V.; Cihan, Abdullah; Sakaki, Toshihiro; Illangasekare, Tissa H.
2012-12-01
Bare soil evaporation is a key process for water exchange between the land and the atmosphere and an important component of the water balance. However, there is no agreement on the best modeling methodology to determine evaporation under different atmospheric boundary conditions. Also, directly measured soil evaporation data are lacking for model validation, making it difficult to compare these methods and establish the validity of their mathematical formulations. Thus, a need exists to systematically compare evaporation estimates using existing methods to experimental observations. The goal of this work is to critically investigate the different conceptual and mathematical formulations, and the associated surface boundary conditions, that are used to estimate evaporation from bare soils. Such a comparison required the development of a numerical model that has the ability to incorporate these boundary conditions. For this model, we modified a previously developed theory that allows nonequilibrium liquid/gas phase change with gas phase vapor diffusion to better account for dry soil conditions. Precision data under well-controlled transient heat and wind boundary conditions were generated, and results from numerical simulations were compared with experimental data. Results demonstrate that the approaches based on different boundary conditions varied in their ability to capture different stages of evaporation. All approaches have benefits and limitations, and no one approach can be deemed most appropriate for every scenario. Comparisons of different formulations of the surface boundary condition validate the need for further research on heat and vapor transport processes in soil for better modeling accuracy.
Satellite-based Calibration of Heat Flux at the Ocean Surface
NASA Astrophysics Data System (ADS)
Barron, C. N.; Dastugue, J. M.; May, J. C.; Rowley, C. D.; Smith, S. R.; Spence, P. L.; Gremes-Cordero, S.
2016-02-01
Model forecasts of upper ocean heat content and variability on diurnal to daily scales are highly dependent on estimates of heat flux through the air-sea interface. Satellite remote sensing is applied not only to inform the initial ocean state but also to mitigate errors in surface heat flux and model representations affecting the distribution of heat in the upper ocean. Traditional assimilation of sea surface temperature (SST) observations re-centers ocean models at the start of each forecast cycle. Subsequent evolution depends on estimates of surface heat fluxes and upper-ocean processes over the forecast period. The COFFEE project (Calibration of Ocean Forcing with satellite Flux Estimates) endeavors to correct ocean forecast bias through a responsive error partition among surface heat flux and ocean dynamics sources. A suite of experiments in the southern California Current demonstrates a range of COFFEE capabilities, showing the impact on forecast error relative to a baseline three-dimensional variational (3DVAR) assimilation using Navy operational global or regional atmospheric forcing. COFFEE addresses satellite calibration of surface fluxes to estimate surface error covariances and links these to the ocean interior. Experiment cases combine different levels of flux calibration with different assimilation alternatives. The cases may use the original fluxes, apply full satellite corrections during the forecast period, or extend hindcast corrections into the forecast period. Assimilation is either baseline 3DVAR or standard strong-constraint 4DVAR, with work proceeding to add a 4DVAR expanded to include a weak-constraint treatment of the surface flux errors. Covariance of flux errors is estimated from the recent time series of forecast and calibrated flux terms. While the California Current examples are shown, the approach is equally applicable to other regions. These approaches within a 3DVAR application are anticipated to be useful for global and larger regional domains where a full 4DVAR methodology may be cost-prohibitive.
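The baseline referred to minimizes the textbook 3DVAR cost function (shown for orientation; COFFEE's extension adds surface-flux error terms to this framework):

```latex
J(\mathbf{x}) \;=\; \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_{b})^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_{b})
\;+\; \tfrac{1}{2}\,\bigl(H(\mathbf{x})-\mathbf{y}\bigr)^{\mathsf T}\mathbf{R}^{-1}\bigl(H(\mathbf{x})-\mathbf{y}\bigr),
```

where x_b is the background state, y the observations, H the observation operator, and B and R the background and observation error covariances. In the weak-constraint 4DVAR extension described above, the surface flux errors enter as additional control variables with their own error covariance.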
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jordan, Dirk C.; Deceglie, Michael G.; Kurtz, Sarah R.
What is the best method to determine long-term PV system performance and degradation rates? Ideally, one universally applicable methodology would be desirable so that a single number could be derived. However, data sets vary in their attributes, and evidence is presented that defining two methodologies may be preferable. Monte Carlo simulations of artificial performance data allowed investigation of different methodologies and their respective confidence intervals. Tradeoffs between different approaches were delineated, elucidating why two separate approaches may need to be included in a standard. Regression approaches tend to be preferable when data sets are less contaminated by seasonality, noise and outliers, although robust regression can significantly improve the accuracy when outliers are present. In the presence of outliers, marked seasonality, or strong soiling events, year-on-year approaches tend to outperform regression approaches.
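A minimal sketch of the year-on-year idea follows; variable names and the synthetic series are hypothetical, and published implementations add filtering and uncertainty estimation.

```python
import numpy as np

def year_on_year_rate(p, points_per_year=12):
    """Median year-on-year fractional change of a performance series.

    Every point is compared with the point exactly one year later, and the
    median of those fractional changes is taken as the annual rate; a
    negative value indicates degradation."""
    p = np.asarray(p, dtype=float)
    yoy = (p[points_per_year:] - p[:-points_per_year]) / p[:-points_per_year]
    return np.median(yoy)

# synthetic monthly performance index: -0.8 %/yr trend plus seasonality and noise
t = np.arange(120)
p = (1 - 0.008 * t / 12) * (1 + 0.05 * np.sin(2 * np.pi * t / 12)) \
    + 0.01 * np.random.randn(120)
print(year_on_year_rate(p))
```

Because each pair of points shares the same season, seasonality largely cancels, and the median makes the estimate robust to outliers, which is why this route tends to outperform regression on noisier data sets.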
Perceived Managerial and Leadership Effectiveness in Colombia
ERIC Educational Resources Information Center
Torres, Luis Eduardo; Ruiz, Carlos Enrique; Hamlin, Bob; Velez-Calle, Andres
2015-01-01
Purpose: The purpose of this study was to identify what Colombians perceive as effective and least effective/ineffective managerial behavior. Design/methodology/approach: This study was conducted following a qualitative methodology based on the philosophical assumptions of pragmatism and the "pragmatic approach" (Morgan, 2007). The…
Jeong, Jeong-Won; Shin, Dae C; Do, Synho; Marmarelis, Vasilis Z
2006-08-01
This paper presents a novel segmentation methodology for automated classification and differentiation of soft tissues using multiband data obtained with the newly developed system of high-resolution ultrasonic transmission tomography (HUTT) for imaging biological organs. This methodology extends and combines two existing approaches: the L-level set active contour (AC) segmentation approach and the agglomerative hierarchical k-means approach for unsupervised clustering (UC). To prevent the iterative minimization AC algorithm from becoming trapped in a local minimum, we introduce a multiresolution approach that applies the level set functions at successively increasing resolutions of the image data. The resulting AC clusters are subsequently rearranged by the UC algorithm, which seeks the optimal set of clusters yielding the minimum within-cluster distances in the feature space. The presented results from Monte Carlo simulations and experimental animal-tissue data demonstrate that the proposed methodology outperforms other existing methods without depending on heuristic parameters and provides a reliable means for soft tissue differentiation in HUTT images.
Michalek, Lukas; Barner, Leonie; Barner-Kowollik, Christopher
2018-03-07
Well-defined polymer strands covalently tethered onto solid substrates determine the properties of the resulting functional interface. Herein, the current approaches to determining quantitative grafting densities are assessed. Based on a brief introduction to the key theories describing polymer brush regimes, a user's guide is provided for estimating maximum chain coverage and, importantly, for examining the most frequently employed approaches for determining grafting densities, i.e., dry thickness measurements, gravimetric assessment, and swelling experiments. An estimation of the reliability of these determination methods is provided by carefully evaluating their assumptions and assessing the stability of the underpinning equations. A practical guide for comparatively and quantitatively evaluating the reliability of a given approach is thus provided, enabling the field to critically judge experimentally determined grafting densities and to avoid the reporting of grafting densities that fall outside the physically realistic parameter space. The assessment is concluded with a perspective on the development of advanced approaches for the determination of grafting density, in particular single-chain methodologies. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
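For reference, the dry-thickness route mentioned above converts a measured dry brush thickness into a grafting density via

```latex
\sigma \;=\; \frac{h_{\mathrm{dry}}\,\rho\,N_{\mathrm{A}}}{M_{n}},
```

where h_dry is the dry layer thickness, ρ the bulk polymer density, M_n the number-average molar mass and N_A Avogadro's number; the reliability assessment above hinges on how well h_dry, ρ and M_n are actually known.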
NASA Astrophysics Data System (ADS)
Smid, Marek; Costa, Ana; Pebesma, Edzer; Granell, Carlos; Bhattacharya, Devanjan
2016-04-01
Humankind is now predominantly urban based, and the majority of continuing population growth will take place in urban agglomerations. Urban systems are not only major drivers of climate change but also impact hot spots. Furthermore, climate change impacts are commonly managed at the city scale. Therefore, assessing climate change impacts on urban systems is a very relevant subject of research. Climate and its impacts at all levels (local, meso and global scale), as well as the inter-scale dependencies of those processes, should be subject to detailed analysis. While global and regional projections of future climate are currently available, local-scale information is lacking. Hence, statistical downscaling methodologies represent a potentially efficient way to help close this gap. In general, methodological reviews of downscaling procedures cover the various methods according to their application (e.g. downscaling for hydrological modelling). Some of the most recent and comprehensive studies, such as the ESSEM COST Action ES1102 (VALUE), use the concepts of Perfect Prog and MOS. Other classification schemes of downscaling techniques consider three main categories: linear methods, weather classifications and weather generators. Downscaling and climate modelling represent a multidisciplinary field, where researchers from various backgrounds intersect their efforts, resulting in specific terminology which may be somewhat confusing. For instance, Polynomial Regression (also called Surface Trend Analysis) is a statistical technique, yet in the context of spatial interpolation procedures it is commonly classified as a deterministic technique, while kriging approaches are classified as stochastic. Furthermore, the terms "statistical" and "stochastic" (frequently used as names of sub-classes in downscaling methodological reviews) are not always considered synonymous, even though both could be seen as identical since they refer to methods that handle input modelling factors as variables with certain probability distributions. In addition, recent development is moving towards multi-step methodologies containing deterministic and stochastic components. This evolution leads to the introduction of new terms such as hybrid or semi-stochastic approaches, which makes the effort to systematically classify downscaling methods into the previously defined categories even more challenging. This work presents a review of statistical downscaling procedures that classifies the methods in two steps. In the first step, we describe several techniques that produce a single climatic surface based on observations; the methods are classified into two categories using an approximation to the broadest consensual statistical terms: linear and non-linear methods. The second step covers techniques that use simulations to generate alternative surfaces corresponding to different realizations of the same processes. Those simulations are essential because real observational data are limited in number, and such procedures are crucial for modelling extremes. This work emphasises the link between statistical downscaling methods and research on climate change impacts at the city scale.