Sample records for bottom-up methods applied

  1. Top down, bottom up structured programming and program structuring

    NASA Technical Reports Server (NTRS)

    Hamilton, M.; Zeldin, S.

    1972-01-01

    New design and programming techniques for shuttle software. Based on previous Apollo experience, recommendations are made to apply top-down structured programming techniques to shuttle software. New software verification techniques for large software systems are recommended. HAL, the higher order language selected for the shuttle flight code, is discussed and found to be adequate for implementing these techniques. Recommendations are made to apply the workable combination of top-down, bottom-up methods in the management of shuttle software. Program structuring is discussed relevant to both programming and management techniques.

  2. The updated bottom up solution applied to atmospheric pressure photoionization and electrospray ionization mass spectrometry

    USDA-ARS's Scientific Manuscript database

    The Updated Bottom Up Solution (UBUS) was recently applied to atmospheric pressure chemical ionization (APCI) mass spectrometry (MS) of triacylglycerols (TAGs). This report demonstrates that the UBUS applies equally well to atmospheric pressure photoionization (APPI) MS and to electrospray ionization...

  3. Using Top-down and Bottom-up Costing Approaches in LMICs: The Case for Using Both to Assess the Incremental Costs of New Technologies at Scale.

    PubMed

    Cunnama, Lucy; Sinanovic, Edina; Ramma, Lebogang; Foster, Nicola; Berrie, Leigh; Stevens, Wendy; Molapo, Sebaka; Marokane, Puleng; McCarthy, Kerrigan; Churchyard, Gavin; Vassall, Anna

    2016-02-01

    Estimating the incremental costs of scaling up novel technologies in low-income and middle-income countries is a methodologically challenging and substantial empirical undertaking, in the absence of routine cost data collection. We demonstrate a best-practice pragmatic approach to estimate the incremental costs of new technologies in low-income and middle-income countries, using the example of costing the scale-up of Xpert Mycobacterium tuberculosis (MTB)/resistance to rifampicin (RIF) in South Africa. We estimate costs by applying two distinct approaches, bottom-up and top-down costing, together with an assessment of processes and capacity. The unit costs measured using the bottom-up and top-down methods, respectively, are $US16.9 and $US33.5 for Xpert MTB/RIF, and $US6.3 and $US8.5 for microscopy. The incremental cost of Xpert MTB/RIF is estimated to be between $US14.7 and $US17.7. While the average cost of Xpert MTB/RIF was higher than in previous studies using standard methods, the incremental cost of Xpert MTB/RIF was found to be lower. Cost estimates are highly dependent on the method used, so an approach that clearly identifies whether resource-use data were collected from a bottom-up or top-down perspective, together with capacity measurement, is recommended as a pragmatic way to capture true incremental costs where routine cost data are scarce. © 2016 The Authors. Health Economics published by John Wiley & Sons Ltd.
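
    The two costing approaches contrasted in this record reduce to a small computation: bottom-up costing sums observed resource use times unit prices, while top-down costing divides aggregate expenditure by output volume. A minimal sketch, with invented quantities and prices rather than the study's cost data:

```python
def bottom_up_unit_cost(resources):
    """Sum observed resource use per test (ingredient-based costing).
    `resources` is a list of (quantity, unit price) pairs."""
    return sum(qty * price for qty, price in resources)

def top_down_unit_cost(total_expenditure, output_volume):
    """Divide aggregate expenditure by the number of tests delivered."""
    return total_expenditure / output_volume

# Hypothetical resource profile for one test: (quantity, unit price in US$).
resources = [(1, 9.98), (0.1, 60.0), (0.05, 18.4)]  # cartridge, staff hours, overheads
bu = bottom_up_unit_cost(resources)
td = top_down_unit_cost(33_500.0, 1_000)  # US$33,500 spent delivering 1,000 tests

print(f"bottom-up: US${bu:.2f}, top-down: US${td:.2f}")
```

    Top-down estimates typically absorb idle capacity and unrecorded overheads, which is one reason top-down unit costs tend to exceed bottom-up ones.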

  4. Seven perspectives on GPCR H/D-exchange proteomics methods

    PubMed Central

    Zhang, Xi

    2017-01-01

    Recent research shows surging interest in visualizing human G protein-coupled receptor (GPCR) dynamic structures using bottom-up H/D-exchange (HDX) proteomics technology. This opinion article clarifies critical technical nuances and the logical thinking behind the GPCR HDX proteomics method, to help scientists overcome cross-discipline pitfalls and understand and reproduce the protocol at high quality. The 89% HDX structural coverage of a GPCR achieved in 2010 rested on both structural and analytical rigor. This article emphasizes systematically considering membrane protein structure stability and compatibility with chromatography and mass spectrometry (MS) throughout the pipeline, including the effects of metal ions, zero-detergent shock, and freeze-thaws on HDX result rigor. It proposes viewing bottom-up HDX as two steps, to guide choices of detergent buffers and chromatography settings: (I) protein HDX labeling in native buffers, and (II) peptide-centric analysis of HDX labels, which applies (a) bottom-up MS/MS to construct the peptide matrix and (b) HDX MS to locate and quantify H/D labels. The detergent-low-TCEP digestion method demystified the challenge of HDX-grade GPCR digestion. GPCR HDX proteomics is a structural approach, so its choice of experimental conditions should let structure lead and digestion follow, not the opposite. PMID:28529698

  5. Agricultural ammonia emissions in China: reconciling bottom-up and top-down estimates

    NASA Astrophysics Data System (ADS)

    Zhang, Lin; Chen, Youfan; Zhao, Yuanhong; Henze, Daven K.; Zhu, Liye; Song, Yu; Paulot, Fabien; Liu, Xuejun; Pan, Yuepeng; Lin, Yi; Huang, Binxiang

    2018-01-01

    Current estimates of agricultural ammonia (NH3) emissions in China differ by more than a factor of 2, hindering our understanding of their environmental consequences. Here we apply both bottom-up statistical and top-down inversion methods to quantify NH3 emissions from agriculture in China for the year 2008. We first assimilate satellite observations of NH3 column concentration from the Tropospheric Emission Spectrometer (TES) using the GEOS-Chem adjoint model to optimize Chinese anthropogenic NH3 emissions at the 1/2° × 2/3° horizontal resolution for March-October 2008. Optimized emissions show a strong summer peak, with emissions about 50 % higher in summer than spring and fall, which is underestimated in current bottom-up NH3 emission estimates. To reconcile the latter with the top-down results, we revisit the processes of agricultural NH3 emissions and develop an improved bottom-up inventory of Chinese NH3 emissions from fertilizer application and livestock waste at the 1/2° × 2/3° resolution. Our bottom-up emission inventory includes more detailed information on crop-specific fertilizer application practices and better accounts for meteorological modulation of NH3 emission factors in China. We find that annual anthropogenic NH3 emissions are 11.7 Tg for 2008, with 5.05 Tg from fertilizer application and 5.31 Tg from livestock waste. The two sources together account for 88 % of total anthropogenic NH3 emissions in China. Our bottom-up emission estimates also show a distinct seasonality peaking in summer, consistent with top-down results from the satellite-based inversion. Further evaluations using surface network measurements show that the model driven by our bottom-up emissions reproduces the observed spatial and seasonal variations of NH3 gas concentrations and ammonium (NH4+) wet deposition fluxes over China well, providing additional credibility to the improvements we have made to our agricultural NH3 emission inventory.
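
    The bottom-up side of such an inventory is, at its core, activity data multiplied by an emission factor, with the abstract stressing meteorological modulation of that factor. A minimal sketch in which the exponential form, its coefficient, and all quantities are illustrative assumptions, not the inventory's actual parameterization:

```python
import math

def nh3_emission(kg_n_applied, base_ef, temp_c):
    """Bottom-up NH3 emission: activity data (kg N applied) times an
    emission factor modulated by temperature. The exponential form and
    its coefficient are illustrative assumptions for this sketch."""
    ef = base_ef * math.exp(0.08 * (temp_c - 15.0))  # warmer conditions volatilize more NH3
    return kg_n_applied * ef

spring = nh3_emission(1000.0, 0.12, 15.0)  # 1 t of N applied at 15 degC
summer = nh3_emission(1000.0, 0.12, 28.0)  # same application in summer heat
print(f"spring: {spring:.0f} kg NH3, summer: {summer:.0f} kg NH3")
```

    A temperature-dependent factor of this general shape is what produces the summer peak that the bottom-up inventory needed in order to match the satellite-based inversion.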

  6. Guidelines for bottom-up approach of nanocarbon film formation from pentacene using heated tungsten on quartz substrate without metal catalyst

    NASA Astrophysics Data System (ADS)

    Heya, Akira; Matsuo, Naoto

    2018-04-01

    The guidelines for a bottom-up approach of nanographene formation from pentacene using heated tungsten were investigated using a novel method called hot mesh deposition (HMD). In this method, a heated W mesh was set between a pentacene source and a quartz substrate. Pentacene molecules were decomposed by the heated W mesh. The generated pentacene-based decomposed precursors were then deposited on the quartz substrate. The pentacene dimer (peripentacene) was obtained from pentacene by HMD using two heated catalysts. As expected from the calculation with the density functional theory in the literature, it was confirmed that the pentacene dimer can be formed by a reaction between pentacene and 6,13-dihydropentacene. This technique can be applied to the formation of novel nanographene on various substrates without metal catalysts.

  7. Bottom-up and top-down influences at untrained conditions determine perceptual learning specificity and transfer

    PubMed Central

    Xiong, Ying-Zi; Zhang, Jun-Yun; Yu, Cong

    2016-01-01

    Perceptual learning is often orientation and location specific, which may indicate neuronal plasticity in early visual areas. However, learning specificity diminishes with additional exposure of the transfer orientation or location via irrelevant tasks, suggesting that the specificity is related to untrained conditions, likely because neurons representing untrained conditions are neither bottom-up stimulated nor top-down attended during training. To demonstrate these top-down and bottom-up contributions, we applied a “continuous flash suppression” technique to suppress the exposure stimulus into sub-consciousness, and with additional manipulations to achieve pure bottom-up stimulation or top-down attention with the transfer condition. We found that either bottom-up or top-down influences enabled significant transfer of orientation and Vernier discrimination learning. These results suggest that learning specificity may result from under-activations of untrained visual neurons due to insufficient bottom-up stimulation and/or top-down attention during training. High-level perceptual learning thus may not functionally connect to these neurons for learning transfer. DOI: http://dx.doi.org/10.7554/eLife.14614.001 PMID:27377357

  8. Turning up the heat: temperature influences the relative importance of top-down and bottom-up effects.

    PubMed

    Hoekman, David

    2010-10-01

    Understanding how communities respond to changes in temperature is a major challenge for community ecology. Temperature influences the relative degree to which top-down and bottom-up forces structure ecological communities. In greenhouse experiments using the aquatic community found in pitcher plants (Sarracenia purpurea), I tested how temperature affected the relative importance of top-down (mosquito predation) and bottom-up (ant carcasses) forces on protozoa and bacteria populations. While bottom-up effects did not vary consistently with temperature, the top-down effects of predators on protozoa increased at higher temperatures. These results suggest that temperature could change the relative importance of top-down and bottom-up effects in ecological communities. Specifically, higher temperature may increase the strength of top-down effects by raising predator metabolic rate and concomitant processes (e.g., activity, foraging, digestion, growth) relative to cooler temperatures. These findings apply broadly to an understanding of trophic interactions in a variable environment and are especially relevant in the context of ongoing climate change.

  9. Reconciling Basin-Scale Top-Down and Bottom-Up Methane Emission Measurements for Onshore Oil and Gas Development: Cooperative Research and Development Final Report, CRADA Number CRD-14-572

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heath, Garvin A.

    The overall objective of the Research Partnership to Secure Energy for America (RPSEA)-funded research project is to develop independent estimates of methane emissions using top-down and bottom-up measurement approaches and then to compare the estimates, including consideration of uncertainty. These approaches will be applied at two scales: basin and facility. At the facility scale, multiple methods will be used to measure methane emissions of the whole facility (controlled dual-tracer and single-tracer releases, aircraft-based mass balance, and Gaussian back-trajectory), which are considered top-down approaches. The bottom-up approach will sum emissions from identified point sources measured using appropriate source-level measurement techniques (e.g., high-flow meters). At the basin scale, the top-down estimate will come from boundary-layer airborne measurements upwind and downwind of the basin, using a regional mass balance model plus approaches to separate atmospheric methane emissions attributed to the oil and gas sector. The bottom-up estimate will result from statistical modeling (also known as scaling up) of measurements made at selected facilities, with gaps filled through measurements and other estimates based on other studies. The relative comparison of the bottom-up and top-down estimates made at both scales will help improve understanding of the accuracy of the tested measurement and modeling approaches. The subject of this CRADA is NREL's contribution to the overall project. This project resulted from winning competitive solicitation no. RPSEA RFP2012UN001, proposal no. 12122-95, which is the basis for the overall project. This Joint Work Statement (JWS) details the contributions of NREL and Colorado School of Mines (CSM) in performance of the CRADA effort.

  10. Bottom-Up Guidance in Visual Search for Conjunctions

    ERIC Educational Resources Information Center

    Proulx, Michael J.

    2007-01-01

    Understanding the relative role of top-down and bottom-up guidance is crucial for models of visual search. Previous studies have addressed the role of top-down and bottom-up processes in search for a conjunction of features but with inconsistent results. Here, the author used an attentional capture method to address the role of top-down and…

  11. Students' Perceptions about Online Teaching Effectiveness: A Bottom-Up Approach for Identifying Online Instructors' Roles

    ERIC Educational Resources Information Center

    Gómez-Rey, Pilar; Barbera, Elena; Fernández-Navarro, Francisco

    2018-01-01

    The topic of online instructors' roles has been of interest to the educational community since the late twentieth century. In previous studies, the identification of online instructors' roles was done using a top-down (deductive) approach. This study applied a bottom-up (inductive) procedure to examine not only the roles of online instructors from…

  12. Spatial accuracy of a simplified disaggregation method for traffic emissions applied in seven mid-sized Chilean cities

    NASA Astrophysics Data System (ADS)

    Ossés de Eicker, Margarita; Zah, Rainer; Triviño, Rubén; Hurni, Hans

    The spatial accuracy of top-down traffic emission inventory maps obtained with a simplified disaggregation method based on street density was assessed in seven mid-sized Chilean cities. Each top-down emission inventory map was compared against a reference, namely a more accurate bottom-up emission inventory map from the same study area. The comparison was carried out using a combination of numerical indicators and visual interpretation. Statistically significant differences were found between the seven cities with regard to the spatial accuracy of their top-down emission inventory maps. In compact cities with a simple street network and a single center, good accuracy of the spatial distribution of emissions was achieved, with correlation values > 0.8 with respect to the bottom-up emission inventory of reference. In contrast, the simplified disaggregation method is not suitable for complex cities consisting of interconnected nuclei, resulting in correlation values < 0.5. Although top-down disaggregation of traffic emissions generally exhibits low accuracy, the accuracy is significantly higher in compact cities and might be further improved by applying a correction factor for the city center. Therefore, the method can be used by local environmental authorities in cities with limited resources and little knowledge of the pollution situation to get an overview of the spatial distribution of the emissions generated by traffic activities.
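
    The simplified disaggregation the study evaluates amounts to spreading a city-wide total over grid cells in proportion to street length. A toy sketch with invented cell values (the study used real street networks and emission totals):

```python
# Top-down disaggregation by street density: spread the city-wide
# traffic emission total over grid cells in proportion to each cell's
# street length. All numbers are invented for illustration.
street_km = {"c1": 12.0, "c2": 5.0, "c3": 3.0}  # street length per grid cell, km
total_emissions = 400.0                         # city-wide traffic total, t/yr

total_km = sum(street_km.values())
top_down = {cell: total_emissions * km / total_km
            for cell, km in street_km.items()}
print(top_down)
```

    A bottom-up map built from per-street traffic counts and emission factors would then serve as the reference against which the correlation of such a map is computed.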

  13. Bottom-up guidance in visual search for conjunctions.

    PubMed

    Proulx, Michael J

    2007-02-01

    Understanding the relative role of top-down and bottom-up guidance is crucial for models of visual search. Previous studies have addressed the role of top-down and bottom-up processes in search for a conjunction of features but with inconsistent results. Here, the author used an attentional capture method to address the role of top-down and bottom-up processes in conjunction search. The role of bottom-up processing was assayed by inclusion of an irrelevant-size singleton in a search for a conjunction of color and orientation. One object was uniquely larger on each trial, with chance probability of coinciding with the target; thus, the irrelevant feature of size was not predictive of the target's location. Participants searched more efficiently for the target when it was also the size singleton, and they searched less efficiently for the target when a nontarget was the size singleton. Although a conjunction target cannot be detected on the basis of bottom-up processing alone, participants used search strategies that relied significantly on bottom-up guidance in finding the target, resulting in interference from the irrelevant-size singleton.

  14. Analysis of Academic and Non-Academic Outcomes from a Bottom-up Comprehensive School Reform in the Absence of Student Level Data through Simulation Methods: A Mixed Methods Case Study

    ERIC Educational Resources Information Center

    Sondergeld, Toni A.

    2009-01-01

    This dissertation examines the efficacy of a bottom-up comprehensive school reform (CSR) program by evaluating its impact on student achievement, attendance, and behavior outcomes through an explanatory mixed methods design. The CSR program (Gear Up) was implemented in an urban junior high school over the course of seven years allowing for…

  15. Subjective Well-Being: The Constructionist Point of View. A Longitudinal Study to Verify the Predictive Power of Top-Down Effects and Bottom-Up Processes

    ERIC Educational Resources Information Center

    Leonardi, Fabio; Spazzafumo, Liana; Marcellini, Fiorella

    2005-01-01

    Based on the constructionist point of view applied to Subjective Well-Being (SWB), five hypotheses were advanced about the predictive power of the top-down effects and bottom-up processes over a five years period. The sample consisted of 297 respondents, which represent the Italian sample of a European longitudinal survey; the first phase was…

  16. Effects of High-Pressure Treatment on the Muscle Proteome of Hake by Bottom-Up Proteomics.

    PubMed

    Carrera, Mónica; Fidalgo, Liliana G; Saraiva, Jorge A; Aubourg, Santiago P

    2018-05-02

    A bottom-up proteomics approach was applied to study the effects of high-pressure (HP) treatment on the muscle proteome of fish. The performance of the approach was established for a prior HP treatment (150-450 MPa for 2 min) of frozen (up to 5 months at -10 °C) European hake (Merluccius merluccius). Concerning possible protein biomarkers of quality changes, significant degradation after applying a pressure ≥430 MPa could be observed for phosphoglycerate mutase-1, enolase, creatine kinase, fructose bisphosphate aldolase, triosephosphate isomerase, and nucleoside diphosphate kinase; conversely, electrophoretic bands assigned to tropomyosin, glyceraldehyde-3-phosphate dehydrogenase, and beta parvalbumin increased in intensity after applying a pressure ≥430 MPa. This repository of potential protein biomarkers may be very useful for further HP investigations related to fish quality.

  17. Emotional face expression modulates occipital-frontal effective connectivity during memory formation in a bottom-up fashion.

    PubMed

    Xiu, Daiming; Geiger, Maximilian J; Klaver, Peter

    2015-01-01

    This study investigated the role of bottom-up and top-down neural mechanisms in the processing of emotional face expression during memory formation. Functional brain imaging data was acquired during incidental learning of positive ("happy"), neutral and negative ("angry" or "fearful") faces. Dynamic Causal Modeling (DCM) was applied on the functional magnetic resonance imaging (fMRI) data to characterize effective connectivity within a brain network involving face perception (inferior occipital gyrus and fusiform gyrus) and successful memory formation related areas (hippocampus, superior parietal lobule, amygdala, and orbitofrontal cortex). The bottom-up models assumed processing of emotional face expression along feed forward pathways to the orbitofrontal cortex. The top-down models assumed that the orbitofrontal cortex processed emotional valence and mediated connections to the hippocampus. A subsequent recognition memory test showed an effect of negative emotion on the response bias, but not on memory performance. Our DCM findings showed that the bottom-up model family of effective connectivity best explained the data across all subjects and specified that emotion affected most bottom-up connections to the orbitofrontal cortex, especially from the occipital visual cortex and superior parietal lobule. Of those pathways to the orbitofrontal cortex the connection from the inferior occipital gyrus correlated with memory performance independently of valence. We suggest that bottom-up neural mechanisms support effects of emotional face expression and memory formation in a parallel and partially overlapping fashion.

  18. An integrative top-down and bottom-up qualitative model construction framework for exploration of biochemical systems.

    PubMed

    Wu, Zujian; Pang, Wei; Coghill, George M

    Computational modelling of biochemical systems based on top-down and bottom-up approaches has been well studied over the last decade. In this research, after illustrating how to generate atomic components from a set of given reactants and two user-predefined component patterns, we propose an integrative top-down and bottom-up modelling approach for stepwise qualitative exploration of interactions among reactants in biochemical systems. An evolution strategy is applied in the top-down modelling approach to compose models, and simulated annealing is employed in the bottom-up modelling approach to explore potential interactions based on models constructed in the top-down modelling process. Both the top-down and bottom-up approaches support stepwise modular addition or subtraction for model evolution. Experimental results indicate that our modelling approach can qualitatively learn the relationships among biochemical reactants. In addition, hidden reactants of the target biochemical system can be obtained by generating complex reactants in the corresponding composed models. Moreover, qualitatively learned models with inferred reactants and alternative topologies can be used for further wet-lab experimental investigation by interested biologists, which may result in a better understanding of the system.
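
    The bottom-up exploration step relies on simulated annealing. A generic skeleton of that search, with a stand-in integer objective rather than a biochemical model, might look like:

```python
import math, random

def simulated_annealing(initial, neighbour, score, t0=1.0, cooling=0.95, steps=300):
    """Generic simulated-annealing skeleton of the kind used for the
    bottom-up exploration step. The objective below is a stand-in,
    not a biochemical model."""
    current, best = initial, initial
    current_s = best_s = score(initial)
    t = t0
    for _ in range(steps):
        cand = neighbour(current)
        cand_s = score(cand)
        # Accept improvements always; accept worse moves with a
        # probability that shrinks as the temperature cools.
        if cand_s >= current_s or random.random() < math.exp((cand_s - current_s) / t):
            current, current_s = cand, cand_s
            if current_s > best_s:
                best, best_s = current, current_s
        t *= cooling
    return best, best_s

random.seed(1)
# Stand-in search problem: maximize -(x - 3)^2 over the integers.
best, best_s = simulated_annealing(
    initial=10,
    neighbour=lambda x: x + random.choice((-1, 1)),
    score=lambda x: -(x - 3) ** 2,
)
print(best, best_s)
```

    In the paper's setting, a "neighbour" would add or subtract a candidate interaction and the score would measure agreement with the qualitative behaviour of the target system.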

  19. Improving Reading Fluency and Comprehension in Adult ESL Learners Using Bottom-Up and Top-Down Vocabulary Training

    ERIC Educational Resources Information Center

    Oliver, Rhonda; Young, Shahreen

    2016-01-01

    The current research examines the effect of two methods of vocabulary training on reading fluency and comprehension of adult English as second language (ESL) tertiary-bound students. The methods used were isolated vocabulary training (bottom-up reading) and vocabulary training in context (top-down reading). The current exploratory and…

  20. A combined bottom-up/top-down approach to prepare a sterile injectable nanosuspension.

    PubMed

    Hu, Xi; Chen, Xi; Zhang, Ling; Lin, Xia; Zhang, Yu; Tang, Xing; Wang, Yanjiao

    2014-09-10

    To prepare a uniform nanosuspension of strongly hydrophobic riboflavin laurate (RFL) allowing sterile filtration, a physical modification (bottom-up) method was combined with a high-pressure homogenization (top-down) method. Unlike other bottom-up approaches, physical modification with surfactants (TPGS and PL-100) by lyophilization controlled crystallization and compensated for the poor wettability of RFL. On one hand, crystal growth and aggregation during freezing were restricted by a stabilizer layer adsorbed on the drug surface through hydrophobic interaction. On the other hand, subsequent crystallization of the drug in the sublimation process was limited to the interstitial spaces between solvent crystals. After lyophilization, modified drug with a smaller particle size and better wettability was obtained. When surfactant solution was added, water molecules passed between the hydrophilic groups of the surface-active molecules and activated the polymer chains, allowing them to stretch into the water. The coarse suspension was crushed into a nanosuspension (MP = 162 nm) by high-pressure homogenization. For long-term stability, lyophilization was applied again to solidify the nanosuspension (sorbitol as cryoprotectant). Slight crystal growth to about 600 nm was allowed to enable slow release for a sustained effect after intramuscular administration. Moreover, the absence of paw-licking responses and only very slight muscular inflammation demonstrated the excellent biocompatibility of this long-acting RFL injection. Copyright © 2014 Elsevier B.V. All rights reserved.

  21. Nanoelectronics from the bottom up.

    PubMed

    Lu, Wei; Lieber, Charles M

    2007-11-01

    Electronics obtained through the bottom-up approach of molecular-level control of material composition and structure may lead to devices and fabrication strategies not possible with top-down methods. This review presents a brief summary of bottom-up and hybrid bottom-up/top-down strategies for nanoelectronics, with an emphasis on memories based on the crossbar motif. First, we will discuss representative electromechanical and resistance-change memory devices based on carbon nanotube and core-shell nanowire structures, respectively. These device structures show robust switching, promising performance metrics, and the potential for terabit-scale density. Second, we will review architectures being developed for circuit-level integration, hybrid crossbar/CMOS circuits, and array-based systems, including experimental demonstrations of key concepts such as lithography-independent, chemically coded stochastic demultiplexers. Finally, bottom-up fabrication approaches, including the opportunity for assembly of three-dimensional, vertically integrated multifunctional circuits, will be critically discussed.

  22. Intelligent Evaluation Method of Tank Bottom Corrosion Status Based on Improved BP Artificial Neural Network

    NASA Astrophysics Data System (ADS)

    Qiu, Feng; Dai, Guang; Zhang, Ying

    Based on the acoustic emission information and appearance inspection information from online testing of tank bottoms, the external factors associated with tank bottom corrosion status are identified. Applying an artificial neural network intelligent evaluation method, three tank bottom corrosion status evaluation models are established, based on appearance inspection information, acoustic emission information, and online testing information, respectively. Compared with the results of acoustic emission online testing on an evaluation test sample, the accuracy of the evaluation model based on online testing information is 94 %. The evaluation model can evaluate tank bottom corrosion accurately and realizes intelligent evaluation of acoustic emission online testing of tank bottoms.
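
    As a much-reduced sketch of the model family involved, a single logistic unit trained by gradient descent (the simplest case of back-propagation) can map two invented inspection features to a pass/fail corrosion grade; the data, architecture, and hyper-parameters below are all illustrative, not the paper's improved BP network:

```python
import math, random

random.seed(0)
# Toy training set: (appearance feature, acoustic emission feature) -> grade.
data = [((0.2, 0.1), 0), ((0.9, 0.8), 1), ((0.3, 0.2), 0), ((0.8, 0.9), 1)]
w = [random.uniform(-0.5, 0.5) for _ in range(2)]
b = 0.0
sig = lambda z: 1.0 / (1.0 + math.exp(-z))

def mse():
    return sum((sig(sum(wi * xi for wi, xi in zip(w, x)) + b) - y) ** 2
               for x, y in data) / len(data)

before = mse()
for _ in range(300):                       # stochastic gradient descent
    for x, y in data:
        out = sig(sum(wi * xi for wi, xi in zip(w, x)) + b)
        grad = 2 * (out - y) * out * (1 - out)   # dMSE/dz through the sigmoid
        w = [wi - 1.0 * grad * xi for wi, xi in zip(w, x)]
        b -= 1.0 * grad
after = mse()
print(f"MSE: {before:.3f} -> {after:.3f}")
```

    A full BP network adds one or more hidden layers and propagates these same error terms backwards through them; the paper's contribution is an improved variant of that training procedure.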

  23. Glucose-6-phosphate dehydrogenase deficiency and the use of primaquine: top-down and bottom-up estimation of professional costs.

    PubMed

    Peixoto, Henry Maia; Brito, Marcelo Augusto Mota; Romero, Gustavo Adolfo Sierra; Monteiro, Wuelton Marcelo; Lacerda, Marcus Vinícius Guimarães de; Oliveira, Maria Regina Fernandes de

    2017-10-05

    The aim of this study was to determine whether the top-down method, based on the average value identified in the Brazilian Hospitalization System (SIH/SUS), is a good estimator of the cost of health professionals per patient, using the bottom-up method for comparison. The study was developed in the context of hospital care offered to a patient with glucose-6-phosphate dehydrogenase (G6PD) deficiency experiencing a severe adverse effect from the use of primaquine, in the Brazilian Amazon. The top-down method, based on spending on SIH/SUS professional services as a proxy for this cost, yielded R$60.71, while the bottom-up method, based on the salaries of the physician (R$30.43), nurse (R$16.33), and nursing technician (R$5.93), estimated a total cost of R$52.68. The difference was only R$8.03, which shows that the amounts paid through the Hospital Inpatient Authorization (AIH) are estimates close to those obtained by the bottom-up technique for the professionals directly involved in care.

  24. How to fold a spin chain: Integrable boundaries of the Heisenberg XXX and Inozemtsev hyperbolic models

    NASA Astrophysics Data System (ADS)

    De La Rosa Gomez, Alejandro; MacKay, Niall; Regelskis, Vidas

    2017-04-01

    We present a general method of folding an integrable spin chain, defined on a line, to obtain an integrable open spin chain, defined on a half-line. We illustrate our method through two fundamental models with sl2 Lie algebra symmetry: the Heisenberg XXX and the Inozemtsev hyperbolic spin chains. We obtain new long-range boundary Hamiltonians and demonstrate that they exhibit Yangian symmetries, thus ensuring integrability of the models we obtain. The method presented provides a "bottom-up" approach for constructing integrable boundaries and can be applied to any spin chain model.

  25. Beyond Hammers and Nails: Mitigating and Verifying Greenhouse Gas Emissions

    NASA Astrophysics Data System (ADS)

    Gurney, Kevin Robert

    2013-05-01

    One of the biggest challenges to future international agreements on climate change is an independent, science-driven method of verifying reductions in greenhouse gas emissions (GHG) [Niederberger and Kimble, 2011]. The scientific community has thus far emphasized atmospheric measurements to assess changes in emissions. An alternative is direct measurement or estimation of fluxes at the source. Given the many challenges facing the approach that uses "top-down" atmospheric measurements and recent advances in "bottom-up" estimation methods, I challenge the current doctrine, which has the atmospheric measurement approach "validating" bottom-up, "good-faith" emissions estimation [Balter, 2012] or which holds that the use of bottom-up estimation is like "dieting without weighing oneself" [Nisbet and Weiss, 2010].

  26. Trophic cascades of bottom-up and top-down forcing on nutrients and plankton in the Kattegat, evaluated by modelling

    NASA Astrophysics Data System (ADS)

    Petersen, Marcell Elo; Maar, Marie; Larsen, Janus; Møller, Eva Friis; Hansen, Per Juel

    2017-05-01

    The aim of the study was to investigate the relative importance of bottom-up and top-down forcing on trophic cascades in the pelagic food-web and the implications for water quality indicators (summer phytoplankton biomass and winter nutrients) in relation to management. The 3D ecological model ERGOM was validated and applied in a local set-up of the Kattegat, Denmark, using the off-line Flexsem framework. The model scenarios were conducted by changing the forcing by ± 20% for nutrient inputs (bottom-up) and mesozooplankton mortality (top-down), and for both types of forcing combined. The model results showed that cascading effects operated differently depending on the forcing type. In the single-forcing bottom-up scenarios, the cascades ran in the same direction as the forcing. For scenarios involving top-down forcing, there was a skipped-level transmission in the trophic responses that was either attenuated or amplified at different trophic levels. On a seasonal scale, bottom-up forcing showed the strongest response during winter-spring for DIN and Chl a concentrations, whereas top-down forcing had the highest cascade strength during summer for Chl a concentrations and microzooplankton biomass. On an annual basis, the system was more bottom-up than top-down controlled. Microzooplankton was found to play an important role in the pelagic food web as a mediator of nutrient and energy fluxes. This study demonstrated that the best scenario for improved water quality was a combined reduction in nutrient input and mesozooplankton mortality, calling for integrated management of marine areas exploited by human activities.

  7. Stress testing hydrologic models using bottom-up climate change assessment

    NASA Astrophysics Data System (ADS)

    Stephens, C.; Johnson, F.; Marshall, L. A.

    2017-12-01

    Bottom-up climate change assessment is a promising approach for understanding the vulnerability of a system to potential future changes. The technique has been used successfully in risk-based assessments of future flood severity and infrastructure vulnerability. We find that it is also an ideal tool for assessing hydrologic model performance in a changing climate. In this study, we applied bottom-up climate change assessment to compare the performance of two different hydrologic models (an event-based and a continuous model) under increasingly severe climate change scenarios. This allowed us to diagnose likely sources of future prediction error in the two models. The climate change scenarios were based on projections for southern Australia, which indicate drier average conditions with increased extreme rainfall intensities. We found that the key weakness in using the event-based model to simulate drier future scenarios was the model's inability to account dynamically for changing antecedent conditions. This led to increased variability in model performance relative to the continuous model, which automatically accounts for catchment wetness through dynamic simulation of water storages. When considering more intense future rainfall events, representation of antecedent conditions became less important than assumptions around (non)linearity in catchment response. The linear continuous model we applied may underestimate flood risk in a future climate with greater extreme rainfall intensity. In contrast with the recommendations of previous studies, this indicates that continuous simulation is not necessarily the key to robust flood modelling under climate change. By applying bottom-up climate change assessment, we were able to understand systematic changes in relative model performance under changing conditions and deduce likely sources of prediction error in the two models.
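    The antecedent-condition argument can be sketched with a toy pair of models: a continuous store that carries wetness between events and an event model with a fixed initial loss. All parameters and the synthetic rainfall are illustrative, not the models used in the study:

```python
import random

random.seed(1)
CAPACITY = 100.0                     # soil store capacity (mm)

def continuous_runoff(rain_series):
    """Carry soil storage through time; runoff is rain in excess of free storage."""
    store, runoff = 50.0, []
    for rain in rain_series:
        store += rain
        runoff.append(max(0.0, store - CAPACITY))
        store = min(store, CAPACITY) * 0.95   # drainage/evaporation between events
    return runoff

def event_runoff(rain_series, initial_loss=50.0):
    """Event model: every event absorbs a fixed initial loss, ignoring history."""
    return [max(0.0, rain - initial_loss) for rain in rain_series]

events = [random.expovariate(1 / 40.0) for _ in range(200)]   # event rainfall (mm)
drier = [r * 0.7 for r in events]                             # drier future scenario

for label, series in [("current", events), ("drier", drier)]:
    print(f"{label}: continuous={sum(continuous_runoff(series)):.0f} mm, "
          f"event={sum(event_runoff(series)):.0f} mm")
```

    Because the event model's initial loss never responds to the drier store, its bias relative to the continuous model shifts between the two climates, which is the kind of systematic performance change the stress test is designed to expose.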

  8. Analysis of top-down and bottom-up North American CO2 and CH4 emissions estimates in the second State of the Carbon Cycle Report

    NASA Astrophysics Data System (ADS)

    Miller, J. B.; Jacobson, A. R.; Bruhwiler, L.; Michalak, A.; Hayes, D. J.; Vargas, R.

    2017-12-01

    In just ten years since publication of the original State of the Carbon Cycle Report in 2007, global CO2 concentrations have risen by more than 22 ppm to 405 ppm. This represents 18% of the increase over the preindustrial level of 280 ppm. The increase is driven unequivocally by fossil fuel combustion, with North American emissions comprising roughly 20% of the global total over the past decade. At the global scale, we know by comparing well-known fossil fuel inventories with rates of atmospheric CO2 increase that about half of all emissions are absorbed at Earth's surface. For North America, however, we cannot apply a simple mass balance to determine sources and sinks. Instead, contributions from ecosystems must be estimated using top-down and bottom-up methods. SOCCR-2 estimates North American net CO2 uptake from ecosystems as 577 ± 433 TgC/yr using bottom-up (inventory) methods and 634 ± 288 TgC/yr from top-down atmospheric inversions. Although the global terrestrial carbon sink is not precisely known, these values represent possibly 30% of the global total. As with the net sink estimates reported in SOCCR, these new top-down and bottom-up estimates are statistically consistent with one another. However, the uncertainties on each estimate are now substantially smaller, giving us more confidence about where the truth lies. Atmospheric inversions also yield estimates of interannual variations (IAV) in CO2 and CH4 fluxes. Our syntheses suggest that the IAV of ecosystem CO2 fluxes is of order 100 TgC/yr, mainly originating in the conterminous US, with lower variability in boreal and arctic regions. Moreover, this variability is much larger than for the inventory-based fluxes reported by the US to the UNFCCC. Unlike CO2, bottom-up CH4 emissions are larger than those derived from large-scale atmospheric data, with the continental discrepancy resulting primarily from differences in arctic and boreal regions.
In addition to the current state of the science, we will also discuss the primary sources of uncertainty and how existing and emerging measurement and modeling technologies can address them.
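    The claim that the bottom-up and top-down sink estimates are statistically consistent can be made concrete with a two-sigma overlap test and inverse-variance pooling (a generic sketch; the report itself may combine the estimates differently):

```python
import math

def consistent(x1, s1, x2, s2, k=2.0):
    """Two estimates agree if their difference is within k combined sigmas."""
    return abs(x1 - x2) <= k * math.sqrt(s1**2 + s2**2)

def pool(x1, s1, x2, s2):
    """Inverse-variance weighted mean and its standard error."""
    w1, w2 = 1 / s1**2, 1 / s2**2
    return (w1 * x1 + w2 * x2) / (w1 + w2), math.sqrt(1 / (w1 + w2))

bottom_up, top_down = (577, 433), (634, 288)    # TgC/yr, from the abstract
print(consistent(*bottom_up, *top_down))        # True: the estimates overlap
pooled_mean, pooled_se = pool(*bottom_up, *top_down)
print(f"pooled sink: {pooled_mean:.0f} +/- {pooled_se:.0f} TgC/yr")
```

    Pooling also shows why shrinking either uncertainty tightens the combined estimate, the point the abstract makes about growing confidence in "where the truth lies".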

  9. The generation of myricetin-nicotinamide nanococrystals by top down and bottom up technologies

    NASA Astrophysics Data System (ADS)

    Liu, Mingyu; Hong, Chao; Li, Guowen; Ma, Ping; Xie, Yan

    2016-09-01

    Myricetin-nicotinamide (MYR-NIC) nanococrystal preparation methods were developed and optimized using both top-down and bottom-up approaches. The grinding (top-down) method successfully produced nanococrystals, but some micrometer-range particles and aggregation remained. The key consideration in the grinding technology was to control the milling time so as to strike a balance between particle size and size distribution. In contrast, a modified bottom-up approach based on a solution method in conjunction with sonochemistry yielded uniform MYR-NIC nanococrystals, as confirmed by powder X-ray diffraction, scanning electron microscopy, dynamic light scattering, and differential scanning calorimetry, and their dissolution rate and extent were significantly greater than those of the MYR-NIC cocrystal. Notably, this was a simple method that required no addition of a non-solvent. We anticipate our findings will provide guidance for future nanococrystal preparation as well as its application in both the chemical and pharmaceutical areas.

  10. Integrated Bottom-Up and Top-Down Liquid Chromatography-Mass Spectrometry for Characterization of Recombinant Human Growth Hormone Degradation Products.

    PubMed

    Wang, Yu Annie; Wu, Di; Auclair, Jared R; Salisbury, Joseph P; Sarin, Richa; Tang, Yang; Mozdzierz, Nicholas J; Shah, Kartik; Zhang, Anna Fan; Wu, Shiaw-Lin; Agar, Jeffery N; Love, J Christopher; Love, Kerry R; Hancock, William S

    2017-12-05

    With the advent of biosimilars to the U.S. market, it is important to have better analytical tools to ensure product quality from batch to batch. In addition, with the recent popularity of continuous processes for the production of biopharmaceuticals, the traditional bottom-up method alone is no longer sufficient for product characterization and quality analysis. The bottom-up method requires large amounts of material for analysis and is labor-intensive and time-consuming. Additionally, digestion of the protein with enzymes such as trypsin can introduce artifacts and modifications that increase the complexity of the analysis. On the other hand, a top-down method requires a minimal amount of sample and allows analysis of the intact protein mass together with sequence information generated from fragmentation within the instrument. However, fragmentation usually occurs at the N-terminal and C-terminal ends of the protein, with less internal fragmentation. Herein, we combine these complementary techniques, a top-down and a bottom-up method, for the characterization of human growth hormone degradation products. Notably, our approach required small amounts of sample, a requirement imposed by the sample constraints of small-scale manufacturing. Using this approach, we were able to characterize various protein variants, including post-translational modifications such as oxidation and deamidation, residual leader sequence, and proteolytic cleavage. Thus, we were able to highlight the complementarity of top-down and bottom-up approaches, which together achieved the characterization of a wide range of product variants in samples of human growth hormone secreted from Pichia pastoris.

  11. An integrated top-down and bottom-up proteomic approach to characterize the antigen-binding fragment of antibodies.

    PubMed

    Dekker, Lennard; Wu, Si; Vanduijn, Martijn; Tolić, Nikolai; Stingl, Christoph; Zhao, Rui; Luider, Theo; Paša-Tolić, Ljiljana

    2014-05-01

    We have previously shown that different individuals exposed to the same antigen produce antibodies with identical mutations in their complementarity determining regions (CDR), suggesting that CDR tryptic peptides can serve as biomarkers for disease diagnosis and prognosis. Complete Fabs derived from disease specific antibodies have even higher potential; they could potentially be used for disease treatment and are required to identify the antigens toward which the antibodies are directed. However, complete Fab sequence characterization via LC-MS analysis of tryptic peptides (i.e. bottom-up) has proven to be impractical for mixtures of antibodies. To tackle this challenge, we have developed an integrated bottom-up and top-down MS approach, employing 2D chromatography coupled with Fourier transform mass spectrometry (FTMS), and applied this approach for full characterization of the variable parts of two pharmaceutical monoclonal antibodies with sensitivity comparable to the bottom-up standard. These efforts represent an essential step toward the identification of disease specific antibodies in patient samples with potentially significant clinical impact. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. An integrated top-down and bottom-up proteomic approach to characterize the antigen binding fragment of antibodies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dekker, Leendert J.; Wu, Si; vanDuijn, Martijn M.

    2014-05-31

    We have previously shown that different individuals exposed to the same antigen produce antibodies with identical mutations in their complementarity determining regions (CDR), suggesting that CDR tryptic peptides can serve as biomarkers for disease diagnosis and prognosis. Complete Fabs derived from disease specific antibodies have even higher potential; they could potentially be used for disease treatment and are required to identify the antigens towards which the antibodies are directed. However, complete Fab sequence characterization via LC-MS analysis of tryptic peptides (i.e. bottom-up) has proven to be impractical for mixtures of antibodies. To tackle this challenge, we have developed an integrated bottom-up and top-down MS approach, employing 2D chromatography coupled with Fourier transform mass spectrometry (FTMS), and applied this approach for full characterization of the variable parts of two pharmaceutical monoclonal antibodies with sensitivity comparable to the bottom-up standard. These efforts represent an essential step towards the identification of disease specific antibodies in patient samples with potentially significant clinical impact.

  13. Assessing the Gap Between Top-down and Bottom-up Measured Methane Emissions in Indianapolis, IN.

    NASA Astrophysics Data System (ADS)

    Prasad, K.; Lamb, B. K.; Cambaliza, M. O. L.; Shepson, P. B.; Stirm, B. H.; Salmon, O. E.; Lavoie, T. N.; Lauvaux, T.; Ferrara, T.; Howard, T.; Edburg, S. L.; Whetstone, J. R.

    2014-12-01

    Releases of methane (CH4) from the natural gas supply chain in the United States account for approximately 30% of total US CH4 emissions. However, large questions remain regarding the accuracy of current emission inventories for methane emissions from natural gas usage. In this paper, we describe results from top-down and bottom-up measurements of methane emissions from the large, isolated city of Indianapolis. The top-down results are based on aircraft mass balance and tower-based inverse modeling methods, while the bottom-up results are based on direct component sampling at metering and regulating stations, surface enclosure measurements of surveyed pipeline leaks, and tracer/modeling methods for other urban sources. Mobile mapping of urban methane concentrations was also used to identify significant sources and showed an urban-wide, low-level enhancement of methane. The residual difference between top-down and bottom-up measured emissions is large and cannot be fully explained by the uncertainties in the top-down and bottom-up emission measurements and estimates. Thus, the residual appears to be at least partly attributable to a significant, widespread diffusive source. Analyses are included to estimate the size and nature of this diffusive source.
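    One of the bottom-up ingredients mentioned, the tracer method, rests on a simple ratio: release a tracer gas at a known rate next to the source and scale by the downwind enhancement ratio. A minimal sketch with hypothetical numbers (acetylene assumed as the tracer):

```python
M_CH4, M_C2H2 = 16.04, 26.04   # g/mol; acetylene assumed as the released tracer

def tracer_ratio_emission(q_tracer_g_s, d_ch4_ppb, d_tracer_ppb):
    """Methane emission rate (g/s) from a co-located tracer release.

    q_tracer_g_s : known tracer release rate (g/s)
    d_ch4_ppb    : downwind CH4 enhancement above background (ppb)
    d_tracer_ppb : downwind tracer enhancement above background (ppb)
    """
    mole_ratio = d_ch4_ppb / d_tracer_ppb          # ppb is a mole-fraction unit
    return q_tracer_g_s * mole_ratio * (M_CH4 / M_C2H2)

# Hypothetical survey: 0.5 g/s of tracer, CH4 enhanced 80 ppb, tracer 10 ppb.
q = tracer_ratio_emission(0.5, 80.0, 10.0)
print(f"estimated CH4 source: {q:.2f} g/s")      # about 2.46 g/s
```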

  14. Large-Scale Fabrication of Carbon Nanotube Probe Tips For Atomic Force Microscopy Critical Dimension Imaging Applications

    NASA Technical Reports Server (NTRS)

    Ye, Qi Laura; Cassell, Alan M.; Stevens, Ramsey M.; Meyyappan, Meyya; Li, Jun; Han, Jie; Liu, Hongbing; Chao, Gordon

    2004-01-01

    Carbon nanotube (CNT) probe tips for atomic force microscopy (AFM) offer several advantages over Si/Si3N4 probe tips, including improved resolution, shape, and mechanical properties. This viewgraph presentation discusses these advantages, and the drawbacks of existing methods for fabricating CNT probe tips for AFM. The presentation introduces a bottom-up, wafer-scale fabrication method for CNT probe tips that integrates catalyst nanopatterning and nanomaterials synthesis with traditional silicon cantilever microfabrication technology. This method makes mass production of CNT AFM probe tips feasible and can be applied to the fabrication of other nanodevices with CNT elements.

  15. Human body segmentation via data-driven graph cut.

    PubMed

    Li, Shifeng; Lu, Huchuan; Shao, Xingqing

    2014-11-01

    Human body segmentation is a challenging and important problem in computer vision. Existing methods usually entail a time-consuming training phase for prior knowledge learning, with complex shape matching for body segmentation. In this paper, we propose a data-driven method that integrates top-down body pose information and bottom-up low-level visual cues for segmenting humans in static images within the graph cut framework. The key idea of our approach is first to exploit human kinematics to search for body part candidates via dynamic programming, providing high-level evidence. Body-part classifiers are then used to obtain bottom-up cues of human body distribution as low-level evidence. All the evidence collected from the top-down and bottom-up procedures is integrated in a graph cut framework for human body segmentation. Qualitative and quantitative experimental results demonstrate the merits of the proposed method in segmenting human bodies with arbitrary poses from cluttered backgrounds.

  16. Efficient Research Design: Using Value-of-Information Analysis to Estimate the Optimal Mix of Top-down and Bottom-up Costing Approaches in an Economic Evaluation alongside a Clinical Trial.

    PubMed

    Wilson, Edward C F; Mugford, Miranda; Barton, Garry; Shepstone, Lee

    2016-04-01

    In designing economic evaluations alongside clinical trials, analysts are frequently faced with alternative methods of collecting the same data, the extremes being top-down ("gross costing") and bottom-up ("micro-costing") approaches. A priori, bottom-up approaches may be considered superior to top-down approaches but are also more expensive to collect and analyze. In this article, we use value-of-information analysis to estimate the efficient mix of observations on each method in a proposed clinical trial. By assigning a bivariate prior distribution to the 2 data collection processes, the predicted posterior (i.e., preposterior) mean and variance of the superior process can be calculated from proposed samples using either process. This is then used to calculate the preposterior mean and variance of incremental net benefit and hence the expected net gain of sampling. We apply this method to a previously collected data set to estimate the value of conducting a further trial, identifying the optimal mix of observations on drug costs at 2 levels: by individual item (process A) and by drug class (process B). We find that substituting a number of observations on process A for process B leads to a modest £35,000 increase in the expected net gain of sampling. Drivers of the results are the correlation between the 2 processes and their relative cost. This method has potential use following a pilot study to inform efficient data collection approaches for a subsequent full-scale trial. It provides a formal quantitative approach to inform trialists whether it is efficient to collect resource use data on all patients in a trial or on a subset of patients only, or to collect limited data on most and detailed data on a subset. © The Author(s) 2016.
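    A stripped-down version of the optimal-mix idea: under a normal-normal model, each observation adds precision about the mean cost, and cheaper gross-costing observations are discounted by their correlation with the micro-costing quantity. Treating each cheap observation as having effective variance sigma^2/rho^2 is a simplifying assumption, not the paper's bivariate net-benefit machinery, and all numbers are illustrative:

```python
import math

def posterior_sd(tau0, obs):
    """Posterior sd of the mean given (count, effective_variance) observation sets."""
    return math.sqrt(1 / (1 / tau0**2 + sum(n / v for n, v in obs)))

TAU0, SIGMA, RHO = 30.0, 50.0, 0.8           # prior sd, obs sd, process correlation
COST_A, COST_B, BUDGET = 20.0, 5.0, 2000.0   # micro-costing, gross-costing, budget
VALUE_PER_SD = 100.0                         # assumed value of each sd unit removed

best = None
for n_a in range(int(BUDGET // COST_A) + 1):
    n_b = int((BUDGET - n_a * COST_A) // COST_B)     # spend the rest on process B
    sd_post = posterior_sd(TAU0, [(n_a, SIGMA**2), (n_b, SIGMA**2 / RHO**2)])
    net_gain = VALUE_PER_SD * (TAU0 - sd_post) - (n_a * COST_A + n_b * COST_B)
    if best is None or net_gain > best[0]:
        best = (net_gain, n_a, n_b)
print(f"best mix: {best[1]} micro-costing + {best[2]} gross-costing observations, "
      f"net gain {best[0]:.0f}")
```

    With purely linear precision gains the optimum sits at a corner of the budget; genuine interior mixes arise once fixed per-method costs or the paper's preposterior net-benefit structure are added.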

  17. Comparison of adjoint and analytical Bayesian inversion methods for constraining Asian sources of carbon monoxide using satellite (MOPITT) measurements of CO columns

    NASA Astrophysics Data System (ADS)

    Kopacz, Monika; Jacob, Daniel J.; Henze, Daven K.; Heald, Colette L.; Streets, David G.; Zhang, Qiang

    2009-02-01

    We apply the adjoint of an atmospheric chemical transport model (the GEOS-Chem CTM) to constrain Asian sources of carbon monoxide (CO) at 2° × 2.5° spatial resolution, using Measurement of Pollution in the Troposphere (MOPITT) satellite observations of CO columns in February-April 2001. Results are compared to those of the more common analytical method for solving the same Bayesian inverse problem, applied to the same data set. The analytical method is more exact, but because of computational limitations it can only constrain emissions over coarse regions. We find that the correction factors to the a priori CO emission inventory from the adjoint inversion are generally consistent with those of the analytical inversion when averaged over the large regions of the latter. The adjoint solution reveals fine-scale variability (cities, political boundaries) that the analytical inversion cannot resolve, for example in the Indian subcontinent or between Korea and Japan, and some of that variability is of opposite sign, which points to large aggregation errors in the analytical solution. Upward correction factors to Chinese emissions from the prior inventory are largest in central and eastern China, consistent with a recent bottom-up revision of that inventory, although the revised inventory also sees the need for upward corrections in southern China, where the adjoint and analytical inversions call for downward correction. Correction factors for biomass burning emissions derived from the adjoint and analytical inversions are consistent with a recent bottom-up inventory based on MODIS satellite fire data.
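    The "analytical method" here is the standard closed-form Bayesian (MAP) solution for a linear forward model with Gaussian prior and noise. A generic toy sketch (synthetic sensitivities standing in for the transport operator, not the GEOS-Chem configuration):

```python
import numpy as np

def analytical_inversion(K, y, x_a, S_a, S_e):
    """MAP solution of y = Kx + e with prior x ~ N(x_a, S_a), noise e ~ N(0, S_e)."""
    G = S_a @ K.T @ np.linalg.inv(K @ S_a @ K.T + S_e)   # gain matrix
    x_hat = x_a + G @ (y - K @ x_a)
    S_hat = S_a - G @ K @ S_a                            # posterior covariance
    return x_hat, S_hat

rng = np.random.default_rng(0)
n_obs, n_src = 40, 3
K = rng.uniform(0.0, 1.0, (n_obs, n_src))     # source-receptor sensitivities
x_true = np.array([1.5, 0.7, 1.0])            # true scale factors on prior emissions
y = K @ x_true + rng.normal(0, 0.05, n_obs)   # synthetic "observed" columns
x_a = np.ones(n_src)                          # a priori inventory scale factors
x_hat, S_hat = analytical_inversion(K, y, x_a,
                                    np.eye(n_src) * 0.5**2,   # prior covariance
                                    np.eye(n_obs) * 0.05**2)  # obs-error covariance
print(x_hat)    # close to the true scale factors
```

    The adjoint approach minimizes the same cost function iteratively, which is what lets it resolve many more unknowns than this explicit matrix solution can.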

  18. Disentangling the Importance of Psychological Predispositions and Social Constructions in the Organization of American Political Ideology.

    PubMed

    Verhulst, Brad; Hatemi, Peter K; Eaves, Lindon J

    2012-06-01

    Ideological preferences within the American electorate are contingent on both the environmental conditions that provide the content of the contemporary political debate and internal predispositions that motivate people to hold liberal or conservative policy preferences. In this article we apply Jost, Federico, and Napier's (2009) top-down/bottom-up theory of political attitude formation to a genetically informative population sample. In doing so, we further develop the theory by operationalizing the top-down pathway to be a function of the social environment and the bottom-up pathway as a latent set of genetic factors. By merging insights from psychology, behavioral genetics, and political science, we find strong support for the top-down/bottom-up framework that segregates the two independent pathways in the formation of political attitudes and identifies a different pattern of relationships between political attitudes at each level of analysis.

  19. A new approach for the construction of gridded emission inventories from satellite data

    NASA Astrophysics Data System (ADS)

    Kourtidis, Konstantinos; Georgoulias, Aristeidis; Mijling, Bas; van der A, Ronald; Zhang, Qiang; Ding, Jieying

    2017-04-01

    We present a new method for the derivation of anthropogenic emission estimates for SO2. The method, which we term the Enhancement Ratio Method (ERM), uses observed relationships between OMI satellite tropospheric columnar levels of SO2 and NOx in each 0.25° × 0.25° grid box at low wind speeds, together with the Daily Emission estimates Constrained by Satellite Observations (DECSO) v1 and v3a NOx emission estimates, to scale the SO2 emissions. The method is applied over China, and emission estimates for SO2 are derived for different seasons and years (2007-2011), allowing insight into the interannual evolution of the emissions. The inventory shows a large decrease of emissions during 2007-2009 and a modest increase between 2010 and 2011. The evolution in emission strength over time calculated here is in general agreement with bottom-up inventories, although differences exist, not only between the current inventory and other inventories but also among the bottom-up inventories themselves. The gridded emission estimates derived appear to be consistent, both in their spatial distribution and their magnitude, with the Multi-resolution Emission Inventory for China (MEIC). The total emissions correlate very well with most existing inventories. This research has been financed under the FP7 Programme MarcoPolo (Grant Number 606953, Theme SPA.2013.3.2-01).
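    The scaling step can be sketched per grid box: fit the SO2-versus-NOx column slope from low-wind observations and multiply it by the independently derived NOx emission. The columns below are synthetic, and the operational method additionally involves wind screening and quality filtering:

```python
import numpy as np

def erm_so2_emission(so2_cols, nox_cols, nox_emission):
    """Enhancement-ratio sketch for one grid box: zero-intercept least-squares
    slope of SO2 vs NOx columns, applied as a scale factor to the NOx emission."""
    slope = np.sum(so2_cols * nox_cols) / np.sum(nox_cols**2)
    return slope * nox_emission

rng = np.random.default_rng(2)
nox = rng.uniform(2, 8, 50)                # hypothetical NOx columns at low wind
so2 = 1.4 * nox + rng.normal(0, 0.3, 50)   # SO2 columns with enhancement ratio 1.4
est = erm_so2_emission(so2, nox, nox_emission=10.0)
print(est)                                 # close to 1.4 * 10 = 14
```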

  20. Hydrophobic Interaction Chromatography for Bottom-Up Proteomics Analysis of Single Proteins and Protein Complexes.

    PubMed

    Rackiewicz, Michal; Große-Hovest, Ludger; Alpert, Andrew J; Zarei, Mostafa; Dengjel, Jörn

    2017-06-02

    Hydrophobic interaction chromatography (HIC) is a robust standard analytical method to purify proteins while preserving their biological activity. It is widely used to study post-translational modifications of proteins and drug-protein interactions. In the current manuscript we employed HIC to separate proteins, followed by bottom-up LC-MS/MS experiments. We used this approach to fractionate antibody species followed by comprehensive peptide mapping as well as to study protein complexes in human cells. HIC-reversed-phase chromatography (RPC)-mass spectrometry (MS) is a powerful alternative to fractionate proteins for bottom-up proteomics experiments making use of their distinct hydrophobic properties.

  1. Ammonia emissions from an anaerobic digestion plant estimated using atmospheric measurements and dispersion modelling.

    PubMed

    Bell, Michael W; Tang, Y Sim; Dragosits, Ulrike; Flechard, Chris R; Ward, Paul; Braban, Christine F

    2016-10-01

    Anaerobic digestion (AD) is becoming increasingly implemented within organic waste treatment operations. The storage and processing of large volumes of organic wastes through AD has been identified as a significant source of ammonia (NH3) emissions; however, the totality of ammonia emissions from an AD plant has not previously been quantified. The emissions from an AD plant processing food waste were estimated by integrating ambient NH3 concentration measurements, atmospheric dispersion modelling, and comparison with published emission factors (EFs). Two dispersion models (ADMS and a backwards Lagrangian stochastic (bLS) model) were applied to calculate emission estimates. The bLS model (WindTrax) was used to back-calculate a total (top-down) emission rate for the AD plant from a point of continuous NH3 measurement downwind of the plant. The back-calculated emission rates were then input to the ADMS forward dispersion model to predict air NH3 concentrations around the site, and these predictions were evaluated against weekly passive sampler NH3 measurements. As an alternative approach, emission rates from individual sources within the plant were initially estimated by applying literature EFs to the available site parameters (chemical composition of waste materials, room air concentrations, ventilation rates, etc.). The individual emission rates were input to ADMS and then tuned by fitting the simulated ambient concentrations to the observed (passive sampler) concentration field, which gave an excellent match to measurements after an iterative process. The total emission from the AD plant thus estimated by the bottom-up approach was 16.8 ± 1.8 mg/s, significantly higher than the back-calculated top-down estimate (7.4 ± 0.78 mg/s).
    The bottom-up approach offered a more realistic treatment of the source distribution within the plant area, while the complexity of the site was not well suited to the bLS method; the bottom-up method is therefore believed to give the better estimate of emissions. The storage of solid digestate and the aerobic treatment of liquid effluents at the site were the greatest sources of NH3 emissions. Copyright © 2016 Elsevier Ltd. All rights reserved.
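    Because modelled concentrations are linear in each source's emission rate, the iterative tuning described above amounts to fitting nonnegative source strengths to the passive-sampler field. A generic least-squares sketch (the sensitivity matrix here is random, standing in for per-unit-emission dispersion-model output, not ADMS results):

```python
import numpy as np

def fit_source_strengths(sensitivity, observed):
    """Least-squares source strengths for observed = sensitivity @ q, with
    negative solutions clipped to honour physical nonnegativity."""
    q, *_ = np.linalg.lstsq(sensitivity, observed, rcond=None)
    return np.clip(q, 0.0, None)

rng = np.random.default_rng(3)
S = rng.uniform(0.0, 1.0, (30, 4))        # (samplers x sources) conc. per unit rate
q_true = np.array([5.0, 1.0, 8.0, 2.0])   # true source strengths (e.g. mg/s)
c_obs = S @ q_true                        # noise-free synthetic concentrations
q_fit = fit_source_strengths(S, c_obs)
print(q_fit)                              # recovers [5, 1, 8, 2]
```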

  2. Comparing top-down and bottom-up estimates of methane emissions across multiple U.S. oil and gas basins provides insights into national O&G emissions, mitigation strategies, and research priorities

    NASA Astrophysics Data System (ADS)

    Lyon, D. R.; Alvarez, R.; Zavala Araiza, D.; Hamburg, S.

    2017-12-01

    We develop a county-level inventory of U.S. anthropogenic methane emissions by integrating multiple data sources including the Drillinginfo oil and gas (O&G) production database, Environmental Protection Agency (EPA) Greenhouse Gas Reporting Program, a previously published gridded EPA Greenhouse Gas Inventory (Maasakkers et al 2016), and recent measurement studies of O&G pneumatic devices, equipment leaks, abandoned wells, and midstream facilities. Our bottom-up estimates of total and O&G methane emissions are consistently lower than top-down, aerial mass balance estimates in ten O&G production areas. We evaluate several hypotheses for the top-down/bottom-up discrepancy, including potential bias of the aerial mass balance method, temporal mismatch of top-down and bottom-up emission estimates, and source attribution errors. In most basins, the top-down/bottom-up gap cannot be explained fully without additional O&G emissions from sources not included in traditional inventories, such as super-emitters caused by malfunctions or abnormal process conditions. Top-down/bottom-up differences across multiple basins are analyzed to estimate the magnitude of these additional emissions and constrain total methane emissions from the U.S. O&G supply chain. We discuss the implications for mitigating O&G methane emissions and suggest research priorities for increasing the accuracy of future emission inventories.

  3. Research on Sustainable Development Level Evaluation of Resource-based Cities Based on Shapley Entropy and Choquet Integral

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Qu, Weilu; Qiu, Weiting

    2018-03-01

    In order to evaluate the sustainable development level of resource-based cities, an evaluation method based on Shapley entropy and the Choquet integral is proposed. First, a systematic index system is constructed and the importance of each attribute is calculated based on the maximum Shapley entropy principle; the Choquet integral is then introduced to calculate the comprehensive evaluation value of each city from the bottom up. Finally, the method is applied to 10 typical resource-based cities in China. The empirical results show that the evaluation method is scientific and reasonable, which provides theoretical support for the sustainable development path and reform direction of resource-based cities.
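    The two building blocks named here are standard and small enough to sketch: a discrete Choquet integral with respect to a capacity (fuzzy measure), and the Shapley value of each criterion under that capacity. The two-criterion capacity below is hypothetical, and the paper's maximum-Shapley-entropy identification step is not reproduced:

```python
from itertools import combinations
from math import factorial

def choquet(scores, mu):
    """Discrete Choquet integral of criterion scores w.r.t. capacity mu."""
    items = sorted(scores, key=scores.get, reverse=True)  # descending values
    total = 0.0
    for i, item in enumerate(items):
        nxt = scores[items[i + 1]] if i + 1 < len(items) else 0.0
        total += (scores[item] - nxt) * mu[frozenset(items[:i + 1])]
    return total

def shapley(mu, items):
    """Shapley value (average marginal contribution) of each criterion."""
    n, phi = len(items), {}
    for i in items:
        others = [j for j in items if j != i]
        val = 0.0
        for r in range(n):
            for S in combinations(others, r):
                S = frozenset(S)
                w = factorial(len(S)) * factorial(n - len(S) - 1) / factorial(n)
                val += w * (mu[S | {i}] - mu[S])
        phi[i] = val
    return phi

# Super-additive capacity: economy and environment are worth more jointly than
# separately, so the Choquet integral rewards balanced cities.
mu = {frozenset(): 0.0, frozenset({"econ"}): 0.3, frozenset({"env"}): 0.3,
      frozenset({"econ", "env"}): 1.0}
city = {"econ": 0.9, "env": 0.4}
print(choquet(city, mu))                  # ~0.55, below the 0.65 weighted mean
print(shapley(mu, ["econ", "env"]))       # each criterion contributes ~0.5
```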

  4. Sediment unmixing using detrital geochronology

    NASA Astrophysics Data System (ADS)

    Sharman, Glenn R.; Johnstone, Samuel A.

    2017-11-01

    Sediment mixing within sediment routing systems can exert a strong influence on the preservation of provenance signals that yield insight into the effect of environmental forcing (e.g., tectonism, climate) on the Earth's surface. Here, we discuss two approaches to unmixing detrital geochronologic data in an effort to characterize complex changes in the sedimentary record. First, we summarize 'top-down' mixing, which has been successfully employed in the past to characterize the fractions of prescribed source distributions ('parents') that make up a derived sample or set of samples ('daughters'). Second, we propose the use of 'bottom-up' methods, previously used primarily for grain size distributions, to model parent distributions and the abundances of these parents within a set of daughters. We demonstrate the utility of both top-down and bottom-up approaches to unmixing detrital geochronologic data within a well-constrained sediment routing system in central California. Use of a variety of goodness-of-fit metrics in top-down modeling reveals the importance of considering the range of allowable mixtures rather than any single best-fit mixture calculation. Bottom-up modeling of 12 daughter samples from beaches and submarine canyons yields modeled parent distributions that are remarkably similar to those expected from the geologic context of the sediment-routing system. In general, mixture modeling has the potential to supplement more widely applied approaches to comparing detrital geochronologic data by casting differences between samples as differing proportions of geologically meaningful end-member provenance categories.
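    For two prescribed parents, the 'top-down' best-fit mixture has a closed form: project the daughter distribution onto the line between the parent distributions. A toy sketch with synthetic Gaussian age distributions (not the California data set):

```python
import numpy as np

def unmix_two_parents(daughter, p1, p2):
    """Mixing fraction phi minimizing ||daughter - (phi*p1 + (1-phi)*p2)||,
    solved in closed form and clipped to the physical range [0, 1]."""
    diff = p1 - p2
    phi = float(np.dot(daughter - p2, diff) / np.dot(diff, diff))
    return min(max(phi, 0.0), 1.0)

ages = np.linspace(0, 300, 61)                    # Ma, hypothetical bin centres
def age_peak(mu, sigma):
    w = np.exp(-0.5 * ((ages - mu) / sigma) ** 2)
    return w / w.sum()                            # unit-sum binned distribution

parent_a, parent_b = age_peak(80, 15), age_peak(220, 25)
daughter = 0.7 * parent_a + 0.3 * parent_b        # synthetic 70/30 daughter
phi_fit = unmix_two_parents(daughter, parent_a, parent_b)
print(phi_fit)                                    # ~0.7
```

    With more parents this becomes a small nonnegative least-squares problem, and evaluating goodness of fit over the whole simplex of mixtures, not just the optimum, gives the range of allowable mixtures discussed above.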

  5. The emergence of top-down proteomics in clinical research

    PubMed Central

    2013-01-01

    Proteomic technology has advanced steadily since the development of 'soft-ionization' techniques for mass-spectrometry-based molecular identification more than two decades ago. Now, the large-scale analysis of proteins (proteomics) is a mainstay of biological research and clinical translation, with researchers seeking molecular diagnostics, as well as protein-based markers for personalized medicine. Proteomic strategies using the protease trypsin (known as bottom-up proteomics) were the first to be developed and optimized and form the dominant approach at present. However, researchers are now beginning to understand the limitations of bottom-up techniques, namely the inability to characterize and quantify intact protein molecules from a complex mixture of digested peptides. To overcome these limitations, several laboratories are taking a whole-protein-based approach, in which intact protein molecules are the analytical targets for characterization and quantification. We discuss these top-down techniques and how they have been applied to clinical research and are likely to be applied in the near future. Given the recent improvements in mass-spectrometry-based proteomics and stronger cooperation between researchers, clinicians and statisticians, both peptide-based (bottom-up) strategies and whole-protein-based (top-down) strategies are set to complement each other and help researchers and clinicians better understand and detect complex disease phenotypes. PMID:23806018

  6. A novel method for quantitative geosteering using azimuthal gamma-ray logging.

    PubMed

    Yuan, Chao; Zhou, Cancan; Zhang, Feng; Hu, Song; Li, Chaoliu

    2015-02-01

    A novel method for quantitative geosteering using azimuthal gamma-ray logging is proposed. Up and bottom gamma-ray logs recorded in real time as a logging tool travels through a boundary surface at different relative dip angles are simulated with the Monte Carlo method. The results show that the response points of the up and bottom gamma-ray logs as the tool approaches a highly radioactive formation can be used to predict the relative dip angle, from which the distance from the drill bit to the boundary surface is calculated. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Finding regions of interest in pathological images: an attentional model approach

    NASA Astrophysics Data System (ADS)

    Gómez, Francisco; Villalón, Julio; Gutierrez, Ricardo; Romero, Eduardo

    2009-02-01

    This paper introduces an automated method for finding diagnostic regions of interest (RoIs) in histopathological images. The method is based on the process of visual selective attention that arises during a pathologist's image examination. Specifically, it emulates the first examination phase, a coarse search for tissue structures at low zoom that separates the image into relevant regions. The pathologist's performance depends on inherent image visual cues (bottom-up information) and on acquired clinical knowledge (top-down mechanisms). Our model of the pathologist's visual attention integrates these two components. The selected bottom-up information includes local low-level features such as intensity, color, orientation and texture. Top-down information is related to the anatomical and pathological structures known to the expert. A coarse approximation of these structures is achieved by an oversegmentation algorithm inspired by psychological grouping theories, whose parameters are learned from an expert pathologist's segmentation. Top-down and bottom-up integration is achieved by calculating a unique index for each of the low-level characteristics inside a region; relevancy is estimated as a simple average of these indexes. Finally, a binary decision rule defines whether or not a region is interesting. The method was evaluated on a set of 49 images using a perceptually weighted evaluation criterion, finding a quality gain of 3 dB compared with a classical bottom-up model of attention.
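The final decision stage described above (per-feature indexes averaged into a relevancy score, then a binary rule) can be sketched minimally. The feature ordering and the 0.5 threshold are assumptions for illustration, not values from the paper.

```python
import numpy as np

def region_relevancy(feature_indexes):
    # Relevancy of a region = simple average of its per-feature indexes
    # (e.g. intensity, color, orientation, texture), assumed normalized to [0, 1].
    return float(np.mean(feature_indexes))

def is_region_of_interest(feature_indexes, threshold=0.5):
    # Binary decision rule; the threshold value is an illustrative assumption.
    return region_relevancy(feature_indexes) >= threshold
```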

  8. Why bottom-up taxonomies are unlikely to satisfy the quest for a definitive taxonomy of situations.

    PubMed

    Reis, Harry T

    2018-03-01

    The recent advent of methods for large-scale data collection has provided an unprecedented opportunity for researchers who seek to develop a taxonomy of situations. Parrigon, Woo, Tay, and Wang's (2017) CAPTIONs model is the latest such effort. In this comment, I argue that although bottom-up approaches of this sort have clear value, they are unlikely to provide the sort of definitive, comprehensive, and theoretically integrative taxonomy that the field wants and needs. In large part, this is because bottom-up taxonomies represent what is common about situations and not what is theoretically important and influential about them. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  9. The updated bottom up solution applied to mass spectrometry of soybean oil in a dietary supplement gelcap

    USDA-ARS?s Scientific Manuscript database

    Among the goals of lipidomics applied to triacylglycerols (TAGs) is identification of molecular species, degree and location of unsaturation, and positions of fatty acyl chains (i.e., identification of regioisomers). Toward those ends, we define one, two, and three ‘Critical Ratios’ for Type I, II, ...

  10. Oriented bottom-up growth of armchair graphene nanoribbons on germanium

    DOEpatents

    Arnold, Michael Scott; Jacobberger, Robert Michael

    2016-03-15

    Graphene nanoribbon arrays, methods of growing graphene nanoribbon arrays and electronic and photonic devices incorporating the graphene nanoribbon arrays are provided. The graphene nanoribbons in the arrays are formed using a scalable, bottom-up, chemical vapor deposition (CVD) technique in which the (001) facet of the germanium is used to orient the graphene nanoribbon crystals along the [110] directions of the germanium.

  11. Event-Related Potentials of Bottom-Up and Top-Down Processing of Emotional Faces

    PubMed Central

    Moradi, Afsane; Mehrinejad, Seyed Abolghasem; Ghadiri, Mohammad; Rezaei, Farzin

    2017-01-01

    Introduction: Emotional stimuli can be processed automatically in a bottom-up way or voluntarily in a top-down way. Imaging studies have indicated that bottom-up and top-down processing are mediated by different neural systems. However, the temporal differentiation of top-down versus bottom-up processing of facial emotional expressions remains to be clarified. The present study aimed to explore the time course of these processes, as indexed by the emotion-specific P100 and late positive potential (LPP) event-related potential (ERP) components, in a group of healthy women. Methods: Fourteen female students of Alzahra University, Tehran, Iran, aged 18–30 years, voluntarily participated in the study. The subjects completed 2 overt and covert emotional tasks during ERP acquisition. Results: Fearful expressions produced significantly greater P100 amplitude than other expressions. Moreover, the P100 findings showed an interaction between emotion and processing condition: within the overt condition, fearful expressions elicited greater P100 amplitude than other emotional expressions. Overt conditions also produced significantly longer LPP latencies and larger LPP amplitudes than covert conditions. Conclusion: Based on the results, early perceptual processing of fearful facial expressions is enhanced in the top-down compared with the bottom-up way, suggesting that P100 may reflect an attentional bias toward fearful emotions. No such top-down versus bottom-up differentiation was observed within later processing stages of facial expressions, as indexed by the ERP LPP component. Overall, this study provides a basis for further exploration of the bottom-up and top-down processes underlying emotion and may be helpful for investigating the temporal characteristics associated with impaired emotional processing in psychiatric disorders. PMID:28446947

  12. Lipoaspirate fluid proteome: A preliminary investigation by LC-MS top-down/bottom-up integrated platform of a high potential biofluid in regenerative medicine.

    PubMed

    Inserra, Ilaria; Martelli, Claudia; Cipollina, Mara; Cicione, Claudia; Iavarone, Federica; Taranto, Giuseppe Di; Barba, Marta; Castagnola, Massimo; Desiderio, Claudia; Lattanzi, Wanda

    2016-04-01

    The lipoaspirate fluid (LAF) is emerging as a potentially valuable source in regenerative medicine. In particular, our group recently demonstrated that it exerts osteoinductive properties in vitro. This original observation stimulated the investigation of the proteomic component of LAF by an integrated LC-ESI-LTQ-Orbitrap-MS top-down/bottom-up approach, which is the object of the present study. The top-down analyses required optimization of sample pretreatment procedures to enable correct investigation of the intact proteome; the bottom-up analyses were applied directly to untreated samples after one-dimensional SDS-PAGE separation. Top-down analysis of the acid-soluble fraction of LAF demonstrated the presence of albumin and hemoglobin fragments (i.e. VV- and LVV-hemorphin-7), thymosins β4 and β10, ubiquitin, acyl-CoA binding protein, adipogenesis regulatory factor, perilipin-1 fragments, and S100A6, along with their PTMs. Part of the bottom-up proteomic profile was reproducibly found in both tested samples. The bottom-up approach demonstrated the presence of proteins listed among the components of adipose tissue and/or comprised within the ASC intracellular content and secreted proteome. Our data provide a first glance at the LAF molecular profile, which is consistent with its tissue environment. LAF appears to contain bioactive proteins, peptides and paracrine factors, suggesting its potential translational exploitation. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Calculation of parameters of technological equipment for deep-sea mining

    NASA Astrophysics Data System (ADS)

    Yungmeister, D. A.; Ivanov, S. E.; Isaev, A. I.

    2018-03-01

    The pressing problem of extracting minerals from the bottom of the world ocean is considered. On the ocean floor, three types of minerals are of interest: iron-manganese concretions (IMC), cobalt-manganese crusts (CMC) and sulphides. An analysis of known designs of machines and complexes for the extraction of IMC is performed. These machines are based on the principle of excavating the bottom surface; however, such methods do not always qualify as "gentle" mining, and their ecological impact does not meet the necessary requirements. Such machines also require the transmission of high electric power through the water column, which in some cases is a significant challenge. The authors analyzed options for transporting the extracted mineral from the bottom. The paper describes the design of machines that collect IMC by vacuum suction: gripping plates or drums are provided with cavities in which a vacuum is created, so that individual IMC are attracted to the devices by a pressure drop. The operation of such machines can be called a "gentle" processing technology for the bottom areas; their environmental impact is significantly lower than that of mechanical devices that rake up the IMC. The parameters of the device for lifting the collected IMC are calculated. Using serially produced Kevlar ropes up to 0.06 m in diameter, with a cycle time of up to 2 hours and a lifting speed of up to 3 meters per second, a productivity of about 400,000 tons of IMC per year can be realized. The development of machines based on the calculated parameters and the approbation of their designs will create a unique complex for the extraction of minerals at oceanic deposits.
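A back-of-envelope check of the quoted throughput, assuming continuous year-round cycling. The ~91 t payload per lift is inferred to make the numbers close, not stated in the source.

```python
def annual_productivity_tonnes(payload_tonnes, cycle_time_h, operating_hours=8760):
    # Yearly throughput = payload per lift cycle * number of cycles per year.
    cycles_per_year = operating_hours / cycle_time_h
    return payload_tonnes * cycles_per_year

# With a 2-hour cycle running year-round (4380 cycles), a payload of roughly
# 91 t per lift reproduces the quoted ~400,000 t/yr figure.
```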

  14. Salient region detection by fusing bottom-up and top-down features extracted from a single image.

    PubMed

    Tian, Huawei; Fang, Yuming; Zhao, Yao; Lin, Weisi; Ni, Rongrong; Zhu, Zhenfeng

    2014-10-01

    Recently, some global contrast-based salient region detection models have been proposed that use only the low-level feature of color. It is necessary to consider both color and orientation features to overcome their limitations and thus improve salient region detection for images with low contrast in color and high contrast in orientation. In addition, the existing fusion methods for different feature maps, such as simple averaging and selective fusion, are not sufficiently effective. To overcome these limitations of existing salient region detection models, we propose a novel salient region model based on bottom-up and top-down mechanisms: color contrast and orientation contrast are adopted to calculate the bottom-up feature maps, while the top-down cue of depth-from-focus from the same single image guides the generation of the final salient regions, since depth-from-focus reflects the photographer's preference and knowledge of the task. A more general and effective fusion method is designed to combine the bottom-up feature maps: according to the degree-of-scattering and eccentricities of the feature maps, the proposed method assigns adaptive weights to different feature maps to reflect the confidence level of each. The depth-from-focus of the image, as a significant top-down feature for visual attention, guides the salient regions during the fusion process; with its aid, the proposed fusion method can filter out the background and highlight salient regions. Experimental results show that the proposed model outperforms state-of-the-art models on three publicly available data sets.
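One way to realize scatter-based confidence weighting can be sketched as follows. The paper's exact degree-of-scattering and eccentricity formulas are not reproduced; this is an assumed variant in which each map's weight is the inverse of the spatial spread of its saliency mass.

```python
import numpy as np

def map_scatter(fmap):
    # Spatial spread of saliency mass around its centroid (lower = more compact).
    ys, xs = np.indices(fmap.shape)
    w = fmap / (fmap.sum() + 1e-12)
    cy, cx = (w * ys).sum(), (w * xs).sum()
    return float((w * ((ys - cy) ** 2 + (xs - cx) ** 2)).sum())

def fuse_maps(feature_maps):
    # Confidence weight of each map = inverse scatter, normalized to sum to 1;
    # compact (low-scatter) maps are trusted more in the fused saliency map.
    conf = np.array([1.0 / (map_scatter(m) + 1e-12) for m in feature_maps])
    conf = conf / conf.sum()
    return sum(c * m for c, m in zip(conf, feature_maps))
```

A map with a single concentrated peak dominates a uniform (fully scattered) map under this weighting.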

  15. Procedural uncertainties of Proctor compaction tests applied on MSWI bottom ash.

    PubMed

    Izquierdo, Maria; Querol, Xavier; Vazquez, Enric

    2011-02-28

    MSWI bottom ash is a well-graded, highly compactable material that can be used as a road material in unbound pavements. Achieving the compactness assumed in the design of the pavement is of primary concern to ensure long-term structural stability. Regulations on road construction in a number of EU countries rely on standard tests originally developed for natural aggregates, which may not be appropriate for accurately assessing MSWI bottom ash. This study is intended to assist in consistently assessing MSWI bottom ash compaction by means of the Proctor method, a test routinely applied to unbound road materials that specifies two procedures. The compaction parameters show a marked procedural dependency due to the particle morphology and weak particle strength of the ash. Re-compacting a single batch sample to determine Proctor curves is a common practice that turns out to overestimate optimum moisture contents and maximum dry densities; this could result in wet-side compactions not meeting stiffness requirements. Inaccurate moisture content measurements during testing may also lead to erroneous determinations of the compaction parameters. The role of a number of physical properties of MSWI bottom ash in compaction is also investigated. Copyright © 2011 Elsevier B.V. All rights reserved.

  16. Quantifying atmospheric pollutant emissions from open biomass burning with multiple methods: a case study for Yangtze River Delta region, China

    NASA Astrophysics Data System (ADS)

    Yang, Y.; Zhao, Y.

    2017-12-01

    To understand the differences among emission inventories based on various methods, and the origins of those differences, emissions of PM10, PM2.5, OC, BC, CH4, VOCs, CO, CO2, NOX, SO2 and NH3 from open biomass burning (OBB) in the Yangtze River Delta (YRD) are calculated for 2005-2012 using three approaches: bottom-up, FRP-based and constraining. The inter-annual trends in emissions from the FRP-based and constraining methods are similar to the trend in fire counts over 2005-2012, while the trend from the bottom-up method differs. For most years, the emissions of all species estimated with the constraining method are smaller than those from the bottom-up method (except for VOCs) but larger than those from the FRP-based method (except for EC, CH4 and NH3). Such discrepancies result mainly from the different masses of crop residues burned in the field (CRBF) estimated by the three methods. Among the three methods, the concentrations simulated by chemistry transport modeling with the constrained emissions are closest to the available observations, implying that the constraining method provides the best estimate of OBB emissions. CO emissions from the three methods are compared with other studies: similar temporal variations are found for the constrained emissions, the FRP-based emissions, GFASv1.0 and GFEDv4.1s, with the largest and smallest emissions estimated for 2012 and 2006, respectively. The constrained CO emissions in this study are smaller than those in other studies based on the bottom-up method and larger than those based on burned area and FRP derived from satellite data. The contributions of OBB to two particulate pollution events in 2010 and 2012 are analyzed with the brute-force method: the average contribution of OBB to PM10 mass concentrations during 8-14 June 2012 was estimated at 38.9% (74.8 μg m-3), larger than that during 17-24 June 2010 at 23.6% (38.5 μg m-3). The influences of diurnal emission profiles and meteorology on air pollution caused by OBB are also evaluated; the results suggest that pollution from OBB becomes heavier when meteorological conditions are unfavorable, and that more attention should be paid to supervision at night. Quantified with Monte Carlo simulation, the uncertainties of OBB emissions from the constraining method are significantly lower than those from the bottom-up or FRP-based methods.
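The brute-force (zero-out) attribution used above reduces to a single ratio: remove one source from the model, re-run, and attribute the concentration drop to that source. The 192.3 μg m-3 baseline below is illustrative, back-calculated from the reported 38.9% and 74.8 μg m-3, not a value stated in the abstract.

```python
def source_contribution_pct(conc_with_source, conc_without_source):
    # Brute-force method: the contribution of a source is the relative drop
    # in simulated concentration when that source's emissions are zeroed out.
    return 100.0 * (conc_with_source - conc_without_source) / conc_with_source

# e.g. simulated PM10 of 192.3 ug/m3 with OBB and 117.5 ug/m3 without gives
# an OBB contribution of ~38.9% (74.8 ug/m3), consistent with the 2012 episode.
```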

  17. Methods, Tools, and Data for Coastal System Resilience Assessments

    DTIC Science & Technology

    2015-10-06

    Briefing slides surveying methods, tools, and data for coastal system resilience assessments, with examples of top-down (T) and bottom-up (B) approaches: SoVI® (Social Vulnerability Index), the Baseline Resilience Indicators for Communities (BRIC), the Coastal Resilience Index, USACE's Resilience Matrix, and ASCE's Infrastructure Report Card. Such indices are objective, although data may be arbitrarily weighted in an index.

  18. Applying Research to Making Life-Affecting Judgments and Decisions

    ERIC Educational Resources Information Center

    Gibbs, Leonard

    2007-01-01

    This keynote address argues that in order for baccalaureate and masters degree students to apply research to make better judgments and decisions in their life-affecting practice and in response to the information revolution, the helping professions need to redesign (from the bottom up) not overhaul (make a few changes in) the way research methods…

  19. Cognitive functions of the posterior parietal cortex: top-down and bottom-up attentional control

    PubMed Central

    Shomstein, Sarah

    2012-01-01

    Although much less is known about human parietal cortex than about the homologous monkey cortex, recent studies employing neuroimaging and neuropsychological methods have begun to elucidate increasingly fine-grained functional and structural distinctions. This review focuses on recent neuroimaging and neuropsychological studies elucidating the cognitive roles of dorsal and ventral regions of parietal cortex in top-down and bottom-up attentional orienting, and on the interaction between the two attentional allocation mechanisms. Evidence is reviewed arguing that dorsal regions of the parietal cortex, including the superior parietal lobule (SPL), are involved in top-down attentional orienting, while ventral regions, including the temporo-parietal junction (TPJ), are involved in bottom-up attentional orienting. PMID:22783174

  20. C-STrap Sample Preparation Method--In-Situ Cysteinyl Peptide Capture for Bottom-Up Proteomics Analysis in the STrap Format.

    PubMed

    Zougman, Alexandre; Banks, Rosamonde E

    2015-01-01

    Recently we introduced the concept of Suspension Trapping (STrap) for bottom-up proteomics sample processing, based on SDS-mediated protein extraction, swift detergent removal and rapid reactor-type protein digestion in a quartz depth filter trap. As the depth filter surface is made of silica, it is readily modifiable with various functional groups using silane coupling chemistries. Thus, during the digest, peptides possessing specific features can be targeted for enrichment by the functionalized depth filter material, while non-targeted peptides are collected as a distinct unbound fraction after the digest. In the example presented here, the quartz depth filter surface is functionalized with the pyridyldithiol group, enabling reversible in-situ capture of the cysteine-containing peptides generated during the STrap-based digest. The described C-STrap method retains all advantages of the original STrap methodology and provides a robust foundation for targeted in-situ peptide fractionation in the STrap format for bottom-up proteomics. The presented data support the method's use in qualitative and semi-quantitative proteomics experiments.

  1. Top-down and bottom-up lipidomic analysis of rabbit lipoproteins under different metabolic conditions using flow field-flow fractionation, nanoflow liquid chromatography and mass spectrometry.

    PubMed

    Byeon, Seul Kee; Kim, Jin Yong; Lee, Ju Yong; Chung, Bong Chul; Seo, Hong Seog; Moon, Myeong Hee

    2015-07-31

    This study compared the performance of top-down and bottom-up approaches in the lipidomic analysis of lipoproteins from rabbits raised under different metabolic conditions: healthy controls, carrageenan-induced inflammation, dehydration, high-cholesterol (HC) diet, and highest-cholesterol diet with inflammation (HCI). In the bottom-up approach, high density lipoproteins (HDL) and low density lipoproteins (LDL) were size-sorted and collected on a semi-preparative scale using multiplexed hollow fiber flow field-flow fractionation (MxHF5), followed by nanoflow liquid chromatography-ESI-MS/MS (nLC-ESI-MS/MS) analysis of the lipids extracted from each lipoprotein fraction. In the top-down method, size-fractionated lipoproteins were directly infused into the MS for quantitative analysis of targeted lipids using chip-type asymmetrical flow field-flow fractionation-electrospray ionization-tandem mass spectrometry (cAF4-ESI-MS/MS) in selected reaction monitoring (SRM) mode. The comprehensive bottom-up analysis yielded 122 and 104 lipids from HDL and LDL, respectively. Rabbits in the HC and HCI groups had lipid patterns that contrasted most strongly with those of controls, suggesting that an HC diet significantly alters the lipid composition of lipoproteins. Among the identified lipids, 20 species that exhibited large differences (>10-fold) were selected as targets for the top-down quantitative analysis in order to compare the results with those from the bottom-up method. Statistical comparison revealed that the two methods did not differ significantly for most of the selected species, except for species with only small differences in concentration between groups. The current study demonstrates that top-down lipid analysis using cAF4-ESI-MS/MS is a powerful, high-speed analytical platform for targeted lipidomics that does not require the extraction of lipids from blood samples. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Developing a Comprehensive and Comparative Questionnaire for Measuring Personality in Chimpanzees Using a Simultaneous Top-Down/Bottom-Up Design

    PubMed Central

    Freeman, Hani D.; Brosnan, Sarah F.; Hopper, Lydia M.; Lambeth, Susan P.; Schapiro, Steven J.; Gosling, Samuel D.

    2013-01-01

    One effective method for measuring personality in primates is to use personality trait ratings to distill the experience of people familiar with the individual animals. Previous rating instruments were created using either top-down or bottom-up approaches. Top-down approaches, which essentially adapt instruments originally designed for use with another species, can unfortunately lead to the inclusion of traits irrelevant to chimpanzees or fail to include all relevant aspects of chimpanzee personality. Conversely, because bottom-up approaches derive traits specifically for chimpanzees, their unique items may impede comparisons with findings in other studies and other species. To address the limitations of each approach, we developed a new personality rating scale using a combined top-down/bottom-up design. Seventeen raters rated 99 chimpanzees on the new 41-item scale, with all but one item being rated reliably. Principal components analysis, using both varimax and direct oblimin rotations, identified six broad factors. Strong evidence was found for five of the factors (Reactivity/Undependability, Dominance, Openness, Extraversion, and Agreeableness). A sixth factor (Methodical) was offered provisionally until more data are collected. We validated the factors against behavioral data collected independently on the chimpanzees. The five factors demonstrated good evidence for convergent and predictive validity, thereby underscoring the robustness of the factors. Our combined top-down/bottom-up approach provides the most extensive data to date to support the universal existence of these five personality factors in chimpanzees. This framework, which facilitates cross-species comparisons, can also play a vital role in understanding the evolution of personality and can assist with husbandry and welfare efforts. PMID:23733359

  3. Evaluation of multi-resolution satellite sensors for assessing water quality and bottom depth of Lake Garda.

    PubMed

    Giardino, Claudia; Bresciani, Mariano; Cazzaniga, Ilaria; Schenk, Karin; Rieger, Patrizia; Braga, Federica; Matta, Erica; Brando, Vittorio E

    2014-12-15

    In this study we evaluate the capabilities of three satellite sensors for assessing water composition and bottom depth in Lake Garda, Italy. A consistent physics-based processing chain was applied to Moderate Resolution Imaging Spectroradiometer (MODIS), Landsat-8 Operational Land Imager (OLI) and RapidEye. Images gathered on 10 June 2014 were corrected for the atmospheric effects with the 6SV code. The computed remote sensing reflectance (Rrs) from MODIS and OLI were converted into water quality parameters by adopting a spectral inversion procedure based on a bio-optical model calibrated with optical properties of the lake. The same spectral inversion procedure was applied to RapidEye and to OLI data to map bottom depth. In situ measurements of Rrs and of concentrations of water quality parameters collected in five locations were used to evaluate the models. The bottom depth maps from OLI and RapidEye showed similar gradients up to 7 m (r = 0.72). The results indicate that: (1) the spatial and radiometric resolutions of OLI enabled mapping water constituents and bottom properties; (2) MODIS was appropriate for assessing water quality in the pelagic areas at a coarser spatial resolution; and (3) RapidEye had the capability to retrieve bottom depth at high spatial resolution. Future work should evaluate the performance of the three sensors in different bio-optical conditions.

  4. Unsupervised tattoo segmentation combining bottom-up and top-down cues

    NASA Astrophysics Data System (ADS)

    Allen, Josef D.; Zhao, Nan; Yuan, Jiangbo; Liu, Xiuwen

    2011-06-01

    Tattoo segmentation is challenging due to the complexity of and large variance in tattoo structures. We have developed a segmentation algorithm for finding tattoos in an image. Our basic idea is split-merge: split each tattoo image into clusters through a bottom-up process, learn to merge the clusters containing skin, and then distinguish tattoo from the remaining skin via a top-down prior in the image itself. Tattoo segmentation with an unknown number of clusters is thereby transformed into a figure-ground segmentation. We have applied our segmentation algorithm to a tattoo dataset, and the results show that our tattoo segmentation system is efficient and suitable for further tattoo classification and retrieval purposes.

  5. Implementation of Non-Destructive Evaluation and Process Monitoring in DLP-based Additive Manufacturing

    NASA Astrophysics Data System (ADS)

    Kovalenko, Iaroslav; Verron, Sylvain; Garan, Maryna; Šafka, Jiří; Moučka, Michal

    2017-04-01

    This article describes a method of in-situ process monitoring in a digital light processing (DLP) 3D printer. It is based on continuous measurement of the adhesion force between the printing surface and the bottom of the liquid resin bath, and is suitable only for bottom-up DLP printers. The control system compares the force at the moment the printed layer unsticks from the bottom of the tank, when it reaches its largest value in the printing cycle, with a theoretical value. Implementation of the suggested algorithm makes it possible to detect faults during the printing process.
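The comparison step can be sketched as a simple relative-deviation check. The tolerance value, names, and units are illustrative assumptions; the article does not specify its decision criterion.

```python
def layer_fault(measured_peak_force_n, theoretical_force_n, rel_tolerance=0.25):
    # Flag a fault when the peak separation force measured as the layer
    # unsticks from the tank deviates from the theoretical value by more
    # than the (assumed) relative tolerance.
    deviation = abs(measured_peak_force_n - theoretical_force_n) / theoretical_force_n
    return deviation > rel_tolerance
```

A peak force well below theory could indicate a detached or missing layer; one well above could indicate excessive adhesion to the vat bottom.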

  6. Computer-aided design of nano-filter construction using DNA self-assembly

    NASA Astrophysics Data System (ADS)

    Mohammadzadegan, Reza; Mohabatkar, Hassan

    2007-01-01

    Computer-aided design plays a fundamental role in both top-down and bottom-up nano-system fabrication. This paper presents a bottom-up nano-filter patterning process based on DNA self-assembly. In this study we designed a new method to construct fully specified nano-filters with pores between 5 nm and 9 nm in diameter. Our calculations illustrate that by constructing such a nano-filter we would be able to separate many molecules.

  7. Sediment unmixing using detrital geochronology

    USGS Publications Warehouse

    Sharman, Glenn R.; Johnstone, Samuel

    2017-01-01

    Sediment mixing within sediment routing systems can exert a strong influence on the preservation of provenance signals that yield insight into how environmental forcings (e.g., tectonism, climate) shape the earth's surface. Here we discuss two approaches to unmixing detrital geochronologic data in an effort to characterize complex changes in the sedimentary record. First, we summarize 'top-down' mixing, which has been successfully employed in the past to estimate the fractions of prescribed source distributions ('parents') that make up a derived sample or set of samples ('daughters'). Second, we propose the use of 'bottom-up' methods, previously used primarily for grain size distributions, to model parent distributions and the abundances of these parents within a set of daughters. We demonstrate the utility of both top-down and bottom-up approaches to unmixing detrital geochronologic data within a well-constrained sediment routing system in central California. Use of a variety of goodness-of-fit metrics in top-down modeling reveals the importance of considering the range of allowable mixtures rather than any single best-fit mixture. Bottom-up modeling of 12 daughter samples from beaches and submarine canyons yields modeled parent distributions that are remarkably similar to those expected from the geologic context of the sediment-routing system. In general, mixture modeling has the potential to supplement more widely applied approaches to comparing detrital geochronologic data by casting differences between samples as differing proportions of geologically meaningful end-member provenance categories.
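For the two-parent case, the top-down best-fit mixture has a closed form. This is a sketch under the assumption of a least-squares misfit between binned age distributions; the authors' goodness-of-fit metrics and data are not reproduced.

```python
import numpy as np

def best_fit_fraction(parent_a, parent_b, daughter):
    # Least-squares fraction f of parent_a such that f*a + (1 - f)*b best fits
    # the daughter distribution (all three given as binned densities).
    a, b, d = (np.asarray(x, dtype=float) for x in (parent_a, parent_b, daughter))
    diff = a - b
    f = float(np.dot(d - b, diff) / np.dot(diff, diff))
    return min(max(f, 0.0), 1.0)  # clamp to a physically meaningful mixture
```

When the daughter is an exact mixture of the two parents, the true fraction is recovered.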

  8. Factors controlling bacteria and protists in selected Mazurian eutrophic lakes (North-Eastern Poland) during spring

    PubMed Central

    2013-01-01

    Background The bottom-up (food resources) and top-down (grazing pressure) controls, together with other environmental parameters (water temperature, pH), are the main factors regulating the abundance and structure of microbial communities in aquatic ecosystems. It is still not definitively decided which of the two control mechanisms is more important; the significance of bottom-up versus top-down control may alter with lake productivity and season. In oligo- and/or mesotrophic environments, bottom-up control is mostly important in regulating bacterial abundances, while in eutrophic systems, top-down control may be more significant. Results The abundances of bacteria, heterotrophic (HNF) and autotrophic (ANF) nanoflagellates and ciliates, as well as bacterial production (BP) and metabolically active bacterial cells (CTC, NuCC, EST), were studied in eutrophic lakes (Mazurian Lake District, Poland) during spring. The studied lakes were characterized by high nanoflagellate (mean 17.36 ± 8.57 × 103 cells ml-1) and ciliate abundances (mean 59.9 ± 22.4 ind. ml-1), which were higher in the euphotic zone than in the bottom waters, with relatively low bacterial densities (4.76 ± 2.08 × 106 cells ml-1), which were lower in the euphotic zone than in the profundal zone. Oligotrichida (Rimostrombidium spp.), Prostomatida (Urotricha spp.) and Scuticociliatida (Histiobalantium bodamicum) dominated in the euphotic zone, whereas the oligotrichs Tintinnidium sp. and the prostomatids Urotricha spp. were most numerous in the bottom waters. Among the staining methods used to examine bacterial metabolic activity, the lowest percentages of active cells were recorded with the CTC (1.5–15.4%) and EST (2.7–14.2%) assays, in contrast to the NuCC (28.8–97.3%) method. Conclusions In the euphotic zone, bottom-up factors (TP and DOC concentrations) played a more important role than top-down control (grazing by protists) in regulating bacterial numbers and activity. No single analyzed factor controlled bacterial abundance in the bottom waters. The results of this study suggest that both control mechanisms, bottom-up and top-down, simultaneously regulated the bacterial community and its activity in the profundal zone of the studied lakes during spring. In both lake water layers, food availability (algae, nanoflagellates) was probably the major factor determining ciliate abundance and composition. In the bottom waters, both groups of protists also appeared to be influenced by oxygen, temperature, and total phosphorus. PMID:23566491

  9. When the firm prevents the crash: Avoiding market collapse with partial control.

    PubMed

    Levi, Asaf; Sabuco, Juan; A F Sanjuán, Miguel

    2017-01-01

    Market collapse is one of the most dramatic events in economics. Such a catastrophic event can emerge from the nonlinear interactions between economic agents at the micro level of the economy. Transient chaos might be a good description of how a collapsing market behaves. In this work, we apply a new control method, the partial control method, with the goal of avoiding this disastrous event. Contrary to common control methods that try to influence the system from the outside, here the market is controlled from the bottom up by one of the most basic components of the market: the firm. This is the first time that the partial control method has been applied to a strictly economic system, one in which we also introduce external disturbances. We show how the firm is capable of controlling the system and avoiding the collapse by adjusting only the selling price of the product or the quantity of production in accordance with market circumstances. Additionally, we demonstrate how a firm with a large market share is capable of influencing demand, achieving price stability across the retail and wholesale markets. Furthermore, we prove that the control applied in both cases is much smaller than the external disturbances.
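
The flavor of partial control can be illustrated with a toy model that is not the authors' market model: a logistic map with transient chaos (orbits can escape the unit interval), a bounded random disturbance, and a small correction that keeps the state inside a safe set. The map parameter, disturbance bound, and clipping rule below are illustrative assumptions; the real method computes the safe set so the control can be guaranteed smaller than the disturbance, which this naive clipping does not.

```python
import random

r, xi = 4.2, 0.02            # logistic parameter (>4 gives transient chaos) and disturbance bound
# On the interval (a, b) the disturbed map can exceed 1 and the orbit escapes [0, 1]:
half = (1 - 4 * (1 - xi) / r) ** 0.5 / 2
a, b = 0.5 - half, 0.5 + half
lo, hi = 0.05, 0.95          # also keep the state away from the endpoints

def control(x):
    """Clip the state back into the safe set [lo, a] U [b, hi]: a naive stand-in
    for the safe-set computation of the real partial control method."""
    if a < x < b:
        x = a if x - a < b - x else b
    return min(max(x, lo), hi)

random.seed(1)
x = 0.3
for _ in range(1000):
    x_free = r * x * (1 - x) + random.uniform(-xi, xi)   # map plus external disturbance
    x = control(x_free)
    assert 0.0 <= x <= 1.0   # the controlled orbit never escapes
```

Without the `control` step the disturbed orbit eventually lands in the escape interval and leaves [0, 1]; with it, the state stays bounded indefinitely.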

  10. Top-down and bottom-up definitions of human failure events in human reliability analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald Laurids

    2014-10-01

    In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human-factors-driven approaches tend to look for opportunities for human error first, in a task analysis, and then identify which of those errors are risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down, defined as a subset of the PRA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  11. Bottom-up heating method for producing polyethylene lunar concrete in lunar environment

    NASA Astrophysics Data System (ADS)

    Lee, Jaeho; Ann, Ki Yong; Lee, Tai Sik; Mitikie, Bahiru Bewket

    2018-07-01

    The Apollo Program launched numerous missions to the Moon, Earth's nearest and only natural satellite. NASA is now planning new Moon missions as a first step toward human exploration of Mars and other planets. However, the Moon has an extreme environment for humans. In-situ resource utilization (ISRU) construction must be used on the Moon to build habitable structures. Previous studies on polymeric lunar concrete investigated top-down heating for stabilizing the surface. This study investigates bottom-up heating with manufacturing temperatures as low as 200 °C in a vacuum chamber that simulates the lunar environment. A maximum compressive strength of 5.7 MPa is attained; this is suitable for constructing habitable structures. Furthermore, the bottom-up heating approach achieves solidification two times faster than does the top-down heating approach.

  12. Baking a mass-spectrometry data PIE with McMC and simulated annealing: predicting protein post-translational modifications from integrated top-down and bottom-up data.

    PubMed

    Jefferys, Stuart R; Giddings, Morgan C

    2011-03-15

    Post-translational modifications are vital to the function of proteins, but are hard to study, especially since several modified isoforms of a protein may be present simultaneously. Mass spectrometers are a great tool for investigating modified proteins, but the data they provide is often incomplete, ambiguous, and difficult to interpret. Combining data from multiple experimental techniques, especially bottom-up and top-down mass spectrometry, provides complementary information. When integrated with background knowledge, this allows a human expert to interpret what modifications are present and where on a protein they are located. However, the process is arduous, and for high-throughput applications it needs to be automated. This article explores a data integration methodology based on Markov chain Monte Carlo and simulated annealing. Our software, the Protein Inference Engine (the PIE), applies these algorithms using a modular approach, allowing multiple types of data to be considered simultaneously and new data types to be added as needed. Even for complicated data representing multiple modifications and several isoforms, the PIE generates accurate modification predictions, including location. When applied to experimental data collected on the L7/L12 ribosomal protein, the PIE was able to make predictions consistent with manual interpretation for several different L7/L12 isoforms using a combination of bottom-up data with experimentally identified intact masses. Software, demo projects and source can be downloaded from http://pie.giddingslab.org/
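
The search idea at the core of such tools, proposing modification assignments and accepting or rejecting them stochastically while a temperature cools, can be sketched generically. This is not the PIE's actual scoring model: the modification masses, the target mass shift, and the single-term cost function below are invented for illustration.

```python
import math, random

# Hypothetical modification masses (Da) and an observed intact-mass shift:
mods = {"acetyl": 42.011, "phospho": 79.966, "methyl": 14.016, "oxidation": 15.995}
names = list(mods)
observed_shift = 42.011 + 14.016 + 14.016   # pretend truth: one acetyl plus two methyls

def cost(counts):
    """Absolute error between the proposed and observed mass shift."""
    return abs(sum(mods[n] * c for n, c in zip(names, counts)) - observed_shift)

random.seed(0)
state = [0] * len(names)        # modification counts per type
T = 10.0
while T > 1e-4:
    i = random.randrange(len(names))
    proposal = state[:]
    proposal[i] = max(0, proposal[i] + random.choice([-1, 1]))
    d = cost(proposal) - cost(state)
    if d < 0 or random.random() < math.exp(-d / T):   # Metropolis acceptance rule
        state = proposal
    T *= 0.999                   # geometric cooling schedule
print(dict(zip(names, state)), round(cost(state), 3))
```

The real system scores against bottom-up fragment evidence and placement constraints as well, not just an intact-mass residual, but the anneal-and-accept loop has this shape.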

  13. A new method for estimating the turbulent heat flux at the bottom of the daily mixed layer

    NASA Technical Reports Server (NTRS)

    Imawaki, Shiro; Niiler, Pearn P.; Gautier, Catherine H.; Knox, Robert A.; Halpern, David

    1988-01-01

    Temperature data in the mixed layer and net solar irradiance data at the sea surface are used to estimate the vertical turbulent heat flux at the bottom of the daily mixed layer. The method is applied to data obtained in the eastern tropical Pacific, where the daily cycle in the temperature field is confined to the upper 10-25 m. Equatorial turbulence measurements indicate that the turbulent heat flux is much greater during nighttime than daytime.

  14. One Size Does Not Fit All: Human Failure Event Decomposition and Task Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald Laurids Boring, PhD

    2014-09-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered or exacerbated by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human-factors-driven approaches tend to look for opportunities for human error first, in a task analysis, and then identify which of those errors are risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down, defined as a subset of the PSA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications. In this paper, I first review top-down and bottom-up approaches for defining HFEs and then present a seven-step guideline to ensure that a task analysis completed as part of human error identification decomposes to a level suitable for use as HFEs. This guideline illustrates an effective way to bridge the bottom-up approach with top-down requirements.

  15. Bottom-up modeling of damage in heterogeneous quasi-brittle solids

    NASA Astrophysics Data System (ADS)

    Rinaldi, Antonio

    2013-03-01

    The theoretical modeling of multisite cracking in quasi-brittle materials is a complex damage problem, hard to model with traditional methods of fracture mechanics due to its multiscale nature and to the strain localization induced by microcrack interaction. Macroscale "effective" elastic models can be conveniently applied if a suitable Helmholtz free energy function is identified for a given material scenario. Del Piero and Truskinovsky (Continuum Mech Thermodyn 21:141-171, 2009), among other authors, investigated macroscale continuum solutions capable of matching, in a top-down view, the phenomenology of the damage process for quasi-brittle materials regardless of the microstructure. On the contrary, this paper features a physically based solution method that starts from the direct consideration of the microscale properties and, in a bottom-up view, recovers a continuum elastic description. This procedure is illustrated for a simple one-dimensional problem of this type, a bar stretched by an axial displacement, where the bar is modeled as a 2D random lattice of decohesive spring elements of finite strength. The (microscale) data from simulations are used to identify the "exact" (macro-) damage parameter and to build up the (macro-) Helmholtz function for the equivalent elastic model, bridging to the macroscale approach of Del Piero and Truskinovsky. The elastic approach, coupled with microstructural knowledge, becomes a more powerful tool to reproduce a broad class of macroscopic material responses by changing the convexity-concavity of the Helmholtz energy. The analysis points out that mean-field statistics are appropriate prior to damage localization, but max-field statistics are better suited in the softening regime up to failure, where microstrain fluctuation needs to be incorporated in the continuum model. This observation has consequences for revising mean-field damage models from the literature and for calibrating Nth-gradient continuum models.
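
The bottom-up idea, recovering a macroscopic stress-strain response from springs of finite strength, can be sketched in one dimension with an equal-load-sharing fiber bundle, a much simpler stand-in for the paper's 2D random lattice. The stiffness and the uniform strength distribution below are illustrative assumptions; the point is that a rising branch followed by softening emerges from the microscale rule alone.

```python
import random

random.seed(2)
N = 10_000
E = 1.0                                  # spring stiffness (illustrative units)
strengths = [random.uniform(0.5, 1.5) for _ in range(N)]   # random failure strains

def macro_stress(eps):
    """Mean stress of the bundle: only springs whose failure strain exceeds eps carry load."""
    surviving = sum(1 for s in strengths if s > eps)
    return E * eps * surviving / N

# Sample the emergent macroscopic stress-strain curve on a strain grid
curve = [(eps / 100, macro_stress(eps / 100)) for eps in range(0, 201)]
peak_eps, peak_sigma = max(curve, key=lambda p: p[1])
```

For uniform strengths on [0.5, 1.5] the emergent curve is roughly sigma = E*eps*(1.5 - eps) in the damaging regime, peaking near eps = 0.75 and softening to zero: the shape whose convexity-concavity the macroscale Helmholtz energy must encode.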

  16. Indirect nitrous oxide emissions from streams within the US Corn Belt scale with stream order

    PubMed Central

    Turner, Peter A.; Griffis, Timothy J.; Lee, Xuhui; Baker, John M.; Venterea, Rodney T.; Wood, Jeffrey D.

    2015-01-01

    N2O is an important greenhouse gas and the primary stratospheric ozone-depleting substance. Its deleterious effects on the environment have prompted appeals to regulate emissions from agriculture, which represents the primary anthropogenic source in the global N2O budget. Successful implementation of mitigation strategies requires robust bottom-up inventories that are based on emission factors (EFs), simulation models, or a combination of the two. Top-down emission estimates, based on tall-tower and aircraft observations, indicate that bottom-up inventories severely underestimate regional and continental-scale N2O emissions, implying that EFs may be biased low. Here, we measured N2O emissions from streams within the US Corn Belt using a chamber-based approach and analyzed the data as a function of Strahler stream order (S). N2O fluxes from headwater streams often exceeded 29 nmol N2O-N m⁻² s⁻¹ and decreased exponentially as a function of S. This relation was used to scale up riverine emissions and to assess the differences between bottom-up and top-down emission inventories at the local to regional scale. We found that the Intergovernmental Panel on Climate Change (IPCC) indirect EF for rivers (EF5r) is underestimated by up to ninefold in southern Minnesota, which translates to a total tier 1 agricultural underestimation of N2O emissions of 40%. We show that accounting for zero-order streams as potential N2O hotspots can more than double the agricultural budget. Applying the same analysis to the US Corn Belt demonstrates that the IPCC EF5r underestimation explains the large differences observed between top-down and bottom-up emission estimates. PMID:26216994
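
The reported scaling, flux decreasing exponentially with Strahler order, implies a simple upscaling recipe: fit log-flux against S, then weight the fitted flux for each order by that order's stream surface area. The fluxes and areas below are synthetic illustrations, not the study's data.

```python
import math

# Synthetic per-order mean fluxes (nmol N2O-N per m^2 per s); illustrative, not the study's data
flux = {1: 24.0, 2: 11.8, 3: 6.1, 4: 3.0, 5: 1.5}

# Least-squares fit of ln(flux) = ln(a) - b * S
xs, ys = zip(*[(s, math.log(f)) for s, f in flux.items()])
xbar, ybar = sum(xs) / len(xs), sum(ys) / len(ys)
b = -sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sum((x - xbar) ** 2 for x in xs)
a = math.exp(ybar + b * xbar)            # flux(S) ~ a * exp(-b * S)

# Upscale with hypothetical water-surface areas per order (m^2)
area = {1: 4e6, 2: 3e6, 3: 2.5e6, 4: 2e6, 5: 1.5e6}
total = sum(a * math.exp(-b * s) * area[s] for s in area)  # region-wide riverine source, nmol/s
```

Because headwater (low-S) streams combine the highest fluxes with large cumulative area, the fitted decay rate b dominates the regional total, which is why extending the relation down to zero-order streams can inflate the budget so strongly.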

  17. Optimization and limit of a tilt manipulation stage based on the electrowetting-on-dielectric principle

    NASA Astrophysics Data System (ADS)

    Tan, Xiao; Tao, Zhi; Suzuki, Kenji; Li, Haiwang

    2017-12-01

    This work designed a new tilt manipulation stage based on the electrowetting-on-dielectric (EWOD) principle as the actuating mechanism and investigated the performance of that stage. The stage was fabricated using a universal MEMS (Micro-Electro-Mechanical System) fabrication method. In the previously demonstrated form of this device, the tilt stage consisted of a top plate that functions as a mirror, a bottom plate that was designed for changing the shape of water droplets, and supporters that were fixed between the top and bottom plate. That device was actuated by a voltage applied to the bottom plate, resulting in a static electric force actuating the shape change in the droplets by moving the top plate in the vertical direction. Previous experimental results indicated that that device can tilt at up to ±1.8°, with a resolution of 7 μm in displacement and 0.05° in angle. By selecting the best combination of the dielectric layer, the tilt angle was maximized. The new device, fabricated using a common and straightforward fabrication method, avoids deflection of the top plate and grounding in the bottom plate. Because of the limit of Teflon and other MEMS materials, this device has a tilt angle in the range of 3.2-3.5° according to the experimental data for friction and the EWOD device limit, which is close to 1.8°. This paper also describe the investigation of the effects of various parameters, e.g., various dielectric materials, thicknesses, and droplet type and volume, on the performance of the stage. The results indicate that the apparent frictions coefficient of the solid-liquid interface may remain constant, i.e., the friction force is proportional to the normal support force and the apparent frictions coefficient.

  18. Bayesian Analogy with Relational Transformations

    ERIC Educational Resources Information Center

    Lu, Hongjing; Chen, Dawn; Holyoak, Keith J.

    2012-01-01

    How can humans acquire relational representations that enable analogical inference and other forms of high-level reasoning? Using comparative relations as a model domain, we explore the possibility that bottom-up learning mechanisms applied to objects coded as feature vectors can yield representations of relations sufficient to solve analogy…
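
The bottom-up idea, learning a comparative relation such as "larger" directly from objects coded as feature vectors, can be sketched with a perceptron trained on feature differences. This is a deliberately minimal stand-in for the Bayesian model the authors study; the two features and the synthetic pairs are invented for illustration.

```python
import random

random.seed(3)

def make_pair():
    """Two objects as feature vectors [size, brightness]; label +1 if A is the larger one."""
    A = [random.uniform(0, 10), random.uniform(0, 10)]
    B = [random.uniform(0, 10), random.uniform(0, 10)]
    return A, B, 1 if A[0] > B[0] else -1

# Perceptron on the raw difference vector A - B: a bottom-up learner that is
# never told which feature the relation "larger" refers to.
w = [0.0, 0.0]
for _ in range(3000):
    A, B, label = make_pair()
    diff = [ai - bi for ai, bi in zip(A, B)]
    if label * sum(wi * di for wi, di in zip(w, diff)) <= 0:
        w = [wi + label * di for wi, di in zip(w, diff)]
# After training, w loads mainly on the size feature (index 0), i.e. the learner has
# induced which feature dimension carries the comparative relation.
```

Such a learned weight vector is a primitive relational representation: it can be reused to evaluate "larger(A, B)" for novel object pairs, the precondition for analogical transfer.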

  19. Comparing Top-down and Bottom-up Estimates of Methane Emissions across Multiple U.S. Basins Provides Insights into National Oil and Gas Emissions and Mitigation Strategies

    NASA Astrophysics Data System (ADS)

    Hamburg, S.; Alvarez, R.; Lyon, D. R.; Zavala-Araiza, D.

    2016-12-01

    Several recent studies quantified regional methane emissions in U.S. oil and gas (O&G) basins using top-down approaches such as airborne mass balance measurements. These studies apportioned total methane emissions to O&G based on hydrocarbon ratios or subtracting bottom-up estimates of other sources. In most studies, top-down estimates of O&G methane emissions exceeded bottom-up emission inventories. An exception is the Barnett Shale Coordinated Campaign, which found agreement between aircraft mass balance estimates and a custom emission inventory. Reconciliation of Barnett Shale O&G emissions depended on two key features: 1) matching the spatial domains of top-down and bottom-up estimates, and 2) accounting for fat-tail sources in site-level emission factors. We construct spatially explicit custom emission inventories for domains with top-down O&G emission estimates in eight major U.S. oil and gas production basins using a variety of data sources including a spatially-allocated U.S. EPA Greenhouse Gas Inventory, the EPA Greenhouse Gas Reporting Program, state emission inventories, and recently published measurement studies. A comparison of top-down and our bottom-up estimates of O&G emissions constrains the gap between these approaches and elucidates regional variability in production-normalized loss rates. A comparison of component-level and site-level emission estimates of production sites in the Barnett Shale region - where comprehensive activity data and emissions estimates are available - indicates that abnormal process conditions contribute about 20% of regional O&G emissions. Combining these two analyses provides insights into the relative importance of different equipment, processes, and malfunctions to emissions in each basin. These data allow us to estimate the U.S. O&G supply chain loss rate, recommend mitigation strategies to reduce emissions from existing infrastructure, and discuss how a similar approach can be applied internationally.

  20. Language Learning in Higher Education: Portuguese Student Voices

    ERIC Educational Resources Information Center

    Pinto, Susana; Araújo e Sá, Maria Helena

    2016-01-01

    This paper begins by reviewing European language education policies in higher education and relating these to the bottom-up language provision practices currently applied in higher education institutions. The paper then focuses on a case study at the University of Aveiro (Portugal) that sets out to identify students' social representations…

  1. Applying Current Approaches to the Teaching of Reading

    ERIC Educational Resources Information Center

    Villanueva de Debat, Elba

    2006-01-01

    This article discusses different approaches to reading instruction for EFL learners based on theoretical frameworks. The author starts with the bottom-up approach to reading instruction, and briefly explains phonics and behaviorist ideas that inform this instructional approach. The author then explains the top-down approach and the new cognitive…

  2. A rapid high-performance liquid chromatography-tandem mass spectrometry assay for unambiguous detection of different milk species employed in cheese manufacturing.

    PubMed

    Bernardi, Nadia; Benetti, Giuseppe; Haouet, Naceur M; Sergi, Manuel; Grotta, Lisa; Marchetti, Sonia; Castellani, Federica; Martino, Giuseppe

    2015-12-01

    The aim of the study was to investigate the possibility of differentiating the 4 most important species in the Italian dairy industry (cow, buffalo, sheep, and goat) by applying a bottom-up proteomic approach to assess the milk species involved in cheese production. Selective peptides were detected in milk for use as markers in cheese products. Trypsin-digested milk samples of cow, sheep, goat, and buffalo, analyzed by HPLC-tandem mass spectrometry, provided species-specific peptides, some of them recognized by Mascot software (Matrix Science Ltd., Boston, MA) as derived from well-known species-specific proteins. A multianalyte multiple reaction monitoring method, built with these specific peptides, was successfully applied to cheeses of different composition, showing high specificity in detecting the species involved. Neither aging nor production method seemed to affect the response, demonstrating that the chosen peptides act well as species markers for dairy products. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  3. Automatic Polyp Detection via A Novel Unified Bottom-up and Top-down Saliency Approach.

    PubMed

    Yuan, Yixuan; Li, Dengwang; Meng, Max Q-H

    2017-07-31

    In this paper, we propose a novel automatic computer-aided method to detect polyps in colonoscopy videos. To find the perceptually and semantically meaningful salient polyp regions, we first segment images into multilevel superpixels, where each level corresponds to a different superpixel size. Rather than adopting hand-designed features to describe these superpixels, we employ a sparse autoencoder (SAE) to learn discriminative features in an unsupervised way. Then a novel unified bottom-up and top-down saliency method is proposed to detect polyps. In the first stage, we propose a weak bottom-up (WBU) saliency map that fuses contrast-based saliency and object-center-based saliency. The contrast-based saliency map highlights image parts whose appearance differs from the surrounding areas, while the object-center-based saliency map emphasizes the center of the salient object. In the second stage, a strong classifier with Multiple Kernel Boosting (MKB) is learned to calculate the strong top-down (STD) saliency map from samples taken directly from the multilevel WBU saliency maps. We finally integrate the two stages' saliency maps from all levels to highlight polyps. Experimental results achieve 0.818 recall for saliency calculation, validating the effectiveness of our method. Extensive experiments on public polyp datasets demonstrate that the proposed saliency algorithm performs favorably against state-of-the-art saliency methods for detecting polyps.
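
The bottom-up fusion step can be sketched at pixel level: combine a contrast cue with a center prior by multiplication. This toy works on raw pixels with hand-picked cues, unlike the paper's superpixel-plus-learned-feature pipeline; the cue definitions and the Gaussian width are illustrative assumptions.

```python
import numpy as np

def fuse_saliency(img):
    """Toy bottom-up saliency: per-pixel contrast with the global mean, fused
    multiplicatively with an object-center prior (a Gaussian around the
    brightest pixel). Illustrative only; the paper uses superpixels and SAE features."""
    contrast = np.abs(img - img.mean())
    contrast /= contrast.max() + 1e-12             # normalize to [0, 1]
    ys, xs = np.indices(img.shape)
    cy, cx = np.unravel_index(np.argmax(img), img.shape)
    center = np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * (img.shape[0] / 4) ** 2))
    return contrast * center                       # multiplicative fusion of the two cues

# A dark frame with one bright blob: the fused map should peak on the blob
img = np.zeros((64, 64))
img[20:28, 30:38] = 1.0
sal = fuse_saliency(img)
```

Multiplicative fusion keeps a region salient only when both cues agree, which suppresses high-contrast clutter far from the object center; the paper's second-stage MKB classifier then sharpens this weak map top-down.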

  4. Synthesis of Remote Sensing and Field Observations to Model and Understand Disturbance and Climate Effects on the Carbon Balance of Oregon & Northern California

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beverly Law; David Turner; Warren Cohen

    2008-05-22

    The goal is to quantify and explain the carbon (C) budget for Oregon and N. California. The research compares "bottom-up" and "top-down" methods, and develops prototype analytical systems for regional analysis of the carbon balance that are potentially applicable to other continental regions, and that can be used to explore climate, disturbance and land-use effects on the carbon cycle. Objectives are: 1) Improve, test and apply a bottom-up approach that synthesizes a spatially nested hierarchy of observations (multispectral remote sensing, inventories, flux and extensive sites), and the Biome-BGC model to quantify the C balance across the region; 2) Improve, test and apply a top-down approach for regional and global C flux modeling that uses a model-data fusion scheme (MODIS products, AmeriFlux, atmospheric CO2 concentration network), and a boundary layer model to estimate net ecosystem production (NEP) across the region and partition it among GPP, R_a, and R_h; 3) Provide critical understanding of the controls on regional C balance (how NEP and carbon stocks are influenced by disturbance from fire and management, land use, and interannual climate variation). The key science questions are: What are the magnitudes and distributions of C sources and sinks on seasonal to decadal time scales, and what processes are controlling their dynamics? What are the regional spatial and temporal variations of C sources and sinks? What are the errors and uncertainties in the data products and results (i.e., in situ observations, remote sensing, models)?

  5. When the firm prevents the crash: Avoiding market collapse with partial control

    PubMed Central

    2017-01-01

    Market collapse is one of the most dramatic events in economics. Such a catastrophic event can emerge from the nonlinear interactions between economic agents at the micro level of the economy. Transient chaos might be a good description of how a collapsing market behaves. In this work, we apply a new control method, the partial control method, with the goal of avoiding this disastrous event. Contrary to common control methods that try to influence the system from the outside, here the market is controlled from the bottom up by one of the most basic components of the market: the firm. This is the first time that the partial control method has been applied to a strictly economic system, one in which we also introduce external disturbances. We show how the firm is capable of controlling the system and avoiding the collapse by adjusting only the selling price of the product or the quantity of production in accordance with market circumstances. Additionally, we demonstrate how a firm with a large market share is capable of influencing demand, achieving price stability across the retail and wholesale markets. Furthermore, we prove that the control applied in both cases is much smaller than the external disturbances. PMID:28832608

  6. Creation of Functional Micro/Nano Systems through Top-down and Bottom-up Approaches

    PubMed Central

    Wong, Tak-Sing; Brough, Branden; Ho, Chih-Ming

    2009-01-01

    Mimicking nature’s approach in creating devices with similar functional complexity is one of the ultimate goals of scientists and engineers. The remarkable elegance of these naturally evolved structures originates from bottom-up self-assembly processes. The seamless integration of top-down fabrication and bottom-up synthesis is the challenge for achieving intricate artificial systems. In this paper, technologies necessary for guided bottom-up assembly, such as molecular manipulation, molecular binding, and the self-assembly of molecules, will be reviewed. In addition, the current progress in synthesizing mechanical devices through top-down and bottom-up approaches will be discussed. PMID:19382535

  7. Bottom-up coarse-grained models that accurately describe the structure, pressure, and compressibility of molecular liquids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunn, Nicholas J. H.; Noid, W. G., E-mail: wnoid@chem.psu.edu

    2015-12-28

    The present work investigates the capability of bottom-up coarse-graining (CG) methods for accurately modeling both structural and thermodynamic properties of all-atom (AA) models for molecular liquids. In particular, we consider 1-, 2-, and 3-site CG models for heptane, as well as 1- and 3-site CG models for toluene. For each model, we employ the multiscale coarse-graining method to determine interaction potentials that optimally approximate the configuration dependence of the many-body potential of mean force (PMF). We employ a previously developed “pressure-matching” variational principle to determine a volume-dependent contribution to the potential, U_V(V), that approximates the volume dependence of the PMF. We demonstrate that the resulting CG models describe AA density fluctuations with qualitative, but not quantitative, accuracy. Accordingly, we develop a self-consistent approach for further optimizing U_V, such that the CG models accurately reproduce the equilibrium density, compressibility, and average pressure of the AA models, although the CG models still significantly underestimate the atomic pressure fluctuations. Additionally, by comparing this array of models that accurately describe the structure and thermodynamic pressure of heptane and toluene at a range of different resolutions, we investigate the impact of bottom-up coarse-graining upon thermodynamic properties. In particular, we demonstrate that U_V accounts for the reduced cohesion in the CG models. Finally, we observe that bottom-up coarse-graining introduces subtle correlations between the resolution, the cohesive energy density, and the “simplicity” of the model.

  8. An RC-1 organic Rankine bottoming cycle for an adiabatic diesel engine

    NASA Technical Reports Server (NTRS)

    Dinanno, L. R.; Dibella, F. A.; Koplow, M. D.

    1983-01-01

    A system analysis and preliminary design were conducted for an organic Rankine-cycle system to bottom the high-temperature waste heat of an adiabatic diesel engine. The bottoming cycle is a compact package that includes a cylindrical air-cooled condenser-regenerator module and other unique features. The bottoming cycle output is 56 horsepower at design-point conditions when compounding the reference 317-horsepower turbocharged diesel engine, giving a brake specific fuel consumption of 0.268 lb/hp-hr for the compound engine. The bottoming cycle, when applied to a turbocompound diesel, delivers a compound-engine brake specific fuel consumption of 0.258 lb/hp-hr. This system for heavy-duty transport applications uses the organic working fluid RC-1, a mixture of 60 mole percent pentafluorobenzene and 40 mole percent hexafluorobenzene. The thermal stability of the RC-1 organic fluid was tested in a dynamic fluid test loop that simulates the operation of a Rankine cycle. More than 1600 hours of operation were completed, with results showing that RC-1 is thermally stable up to 900 °F.
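
The quoted fuel-economy gain follows from simple bookkeeping: the bottoming cycle adds shaft power without burning extra fuel, so the compound brake specific fuel consumption is the base bsfc scaled down by the power ratio. The diesel-only bsfc below is back-calculated from the abstract's figures, not stated in it.

```python
p_base, p_bottom = 317.0, 56.0          # hp, from the abstract
bsfc_compound = 0.268                   # lb/hp-hr for the compound engine, from the abstract

# The bottoming cycle consumes no fuel, so total fuel flow is set by the compound figures:
fuel_flow = bsfc_compound * (p_base + p_bottom)      # lb/hr
bsfc_base = fuel_flow / p_base                       # implied diesel-only bsfc, lb/hp-hr
print(round(bsfc_base, 3))   # ~ 0.315 lb/hp-hr, i.e. roughly a 15% fuel-economy improvement
```

Equivalently, bsfc_compound = bsfc_base * p_base / (p_base + p_bottom), which is why even a modest 56 hp bottoming output yields a substantial bsfc reduction.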

  9. Spray shadowing for stress relief and mechanical locking in thick protective coatings

    DOEpatents

    Hollis, Kendall [Los Alamos, NM; Bartram, Brian [Los Alamos, NM

    2007-05-22

    A method for applying a protective coating to an article, comprising the following steps: selecting an article with a surface on which to apply a coating; creating undercut grooves in the article, where the grooves extend beneath the surface to a bottom portion, the grooves having an upper width at the surface and a lower width at the bottom portion connected by side walls, where at least one of the side walls connects the upper width and the lower width to form an undercut angle with the surface of less than 90°; and applying the protective coating onto the article to fill the undercut grooves and cover the surface, thereby forming weak paths within the protective coating.

  10. Semi top-down method combined with earth-bank, an effective method for basement construction.

    NASA Astrophysics Data System (ADS)

    Tuan, B. Q.; Tam, Ng M.

    2018-04-01

    Choosing an appropriate method of deep excavation plays a decisive role not only in the technical success but also in the economics of a construction project. At present, two key methods predominate: the “bottom-up” and the “top-down” construction method. This paper presents another construction method, the “semi top-down method combined with earth-bank,” which aims to retain the advantages and limit the weaknesses of the above methods. The bottom-up method was improved by using an earth-bank to stabilize the retaining walls instead of bracing steel struts. The top-down method was improved by using the open-cut method for half of the earthwork quantities.

  11. Ribbon growing method and apparatus

    NASA Technical Reports Server (NTRS)

    Morrison, Andrew D. (Inventor)

    1989-01-01

    A method and apparatus are described that facilitate the growing of silicon ribbon. A container for molten silicon has a pair of passages in its bottom through which filaments extend to a level above the molten silicon, so that as the filaments are pulled up they drag up molten silicon to form a ribbon. A pair of guides surrounds the filaments along most of the height of the molten silicon, so that each filament contacts only the upper portion of the melt. This permits the use of a filament that would tend to contaminate the melt if it were in long-term contact with it. This arrangement also enables a higher melt level to be used without danger that the molten silicon will run out of a bottom hole.

  12. System reliability approaches for advanced propulsion system structures

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Mahadevan, S.

    1991-01-01

    This paper identifies significant issues that pertain to the estimation and use of system reliability in the design of advanced propulsion system structures. Linkages between the reliabilities of individual components and their effect on system design issues such as performance, cost, availability, and certification are examined. The need for system reliability computation to address the continuum nature of propulsion system structures and synergistic progressive damage modes has been highlighted. Available system reliability models are observed to apply only to discrete systems. Therefore a sequential structural reanalysis procedure is formulated to rigorously compute the conditional dependencies between various failure modes. The method is developed in a manner that supports both top-down and bottom-up analyses in system reliability.
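
The bottom-up direction of such an analysis, composing component reliabilities into a system figure, can be sketched for simple series/parallel structures. This is the textbook discrete reduction, not the paper's sequential reanalysis procedure; indeed, the independence assumed below is precisely what the paper argues breaks down for continuum structures with synergistic damage modes. The component values are hypothetical.

```python
def series(*rs):
    """A series system survives only if every component survives (independence assumed)."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(*rs):
    """A redundant (parallel) system fails only if every component fails (independence assumed)."""
    out = 1.0
    for r in rs:
        out *= (1.0 - r)
    return 1.0 - out

# Hypothetical propulsion sub-structure: two redundant pumps feeding a single turbine disk
r_sys = series(parallel(0.95, 0.95), 0.999)   # ~ 0.9965
```

Conditional dependencies between failure modes, the focus of the paper's reanalysis scheme, would replace the simple products above with conditional probabilities obtained from sequential structural reanalysis.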

  13. A community-based prevention program in western Norway. Organisation and progression model.

    PubMed

    Skutle, Arvid; Iversen, Erik; Bergan, Tone

    2002-01-01

    This paper presents the organisation, progression, and main findings from a community-based substance use prevention project in five municipalities in western Norway. At the central level, this project was organised with a steering committee and a principal project leader, who is situated at the Department of Health and Social Welfare at the county level. Locally, the way of organizing differed, as one would expect from the community-based model. Top-down/bottom-up strategies can apply both in the way a community organises its efforts, as well as in the relationship between the central project organisation and the participating local communities. It is argued that it can be beneficial for the success of community action programs if one attains a "good mix" between top-down and bottom-up strategies. Factors of importance for such "mix" in the Hordaland project were that the municipalities applied for participation, the availability of economic funding, the venues for meetings between central and local project management, the position of local coordinators, the possibilities for coupling project work to otherwise existing community planning, and the extent of formal bureaucracy.

  14. The Search Conference as a Method in Planning Community Health Promotion Actions

    PubMed Central

    Magnus, Eva; Knudtsen, Margunn Skjei; Wist, Guri; Weiss, Daniel; Lillefjell, Monica

    2016-01-01

    Aims: The aim of this article is to describe and discuss how the search conference can be used as a method for planning health promotion actions in local communities. Design and methods: The article draws on experiences with using the method for an innovative project in health promotion in three Norwegian municipalities. The method is described both in general and how it was specifically adopted for the project. Results and conclusions: The search conference as a method was used to develop evidence-based health promotion action plans. With its use of both bottom-up and top-down approaches, this method is a relevant strategy for involving a community in the planning stages of health promotion actions in line with political expectations of participation, ownership, and evidence-based initiatives. Significance for public health: This article describes and discusses how the search conference can be used as a method for planning knowledge-based health promotion actions in local communities. It describes the sequences of the conference and shows how these were adapted when planning and prioritizing health promotion actions in three Norwegian municipalities. Its significance lies in showing how central elements in the planning of health promotion actions, such as participation and involvement as well as evidence, were fundamental to how the conference was carried out. The article goes on to discuss how the method functions as both a top-down and a bottom-up strategy, and in what ways working evidence-based can conflict with a bottom-up strategy. The experiences described can serve as guidance when planning knowledge-based health promotion actions in communities. PMID:27747199

  15. Comparison of Instream and Laboratory Methods of Measuring Sediment Oxygen Demand

    USGS Publications Warehouse

    Hall, Dennis C.; Berkas, Wayne R.

    1988-01-01

    Sediment oxygen demand (SOD) was determined at three sites in a gravel-bottomed central Missouri stream by: (1) two variations of an instream method, and (2) a laboratory method. SOD generally was greatest by the instream methods, which are considered more accurate, and least by the laboratory method. Disturbing stream sediment did not significantly decrease SOD by the instream method. Temperature ranges of up to 12 degree Celsius had no significant effect on the SOD. In the gravel-bottomed stream, the placement of chambers was critical to obtain reliable measurements. SOD rates were dependent on the method; therefore, care should be taken in comparing SOD data obtained by different methods. There is a need for a carefully researched standardized method for SOD determinations.
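The chamber measurements compared above reduce to a simple areal flux calculation: the linear rate of dissolved-oxygen decline inside a sealed chamber, scaled by the chamber's volume-to-area ratio. A minimal sketch of that arithmetic (all readings, the chamber volume, and the sediment area are made-up values for illustration, not data from this study):

```python
# Dissolved-oxygen readings (mg/L) in a sealed benthic chamber over time (h).
hours = [0, 1, 2, 3, 4]
do    = [8.0, 7.4, 6.9, 6.3, 5.8]

# Linear DO decline rate (mg/L per hour) via ordinary least squares.
n = len(hours)
mean_t = sum(hours) / n
mean_d = sum(do) / n
slope = (sum((t - mean_t) * (d - mean_d) for t, d in zip(hours, do))
         / sum((t - mean_t) ** 2 for t in hours))

V = 0.065   # chamber volume, m^3 (assumed)
A = 0.27    # sediment area covered by the chamber, m^2 (assumed)

# Areal oxygen demand: 1 mg/L == 1 g/m^3, so
# (g m^-3 h^-1) * m^3 / m^2 * 24 h/day = g O2 m^-2 day^-1.
sod = -slope * V / A * 24
print(f"SOD = {sod:.2f} g O2 m^-2 d^-1")
```

The same flux arithmetic applies whether the chamber sits in the stream or holds a core in the laboratory; what differs between the methods is how well the enclosed sediment represents in-place conditions.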

  16. Where to start? Bottom-up attention improves working memory by determining encoding order.

    PubMed

    Ravizza, Susan M; Uitvlugt, Mitchell G; Hazeltine, Eliot

    2016-12-01

    The present study aimed to characterize the mechanism by which working memory is enhanced for items that capture attention because of their novelty or saliency-that is, via bottom-up attention. The first experiment replicated previous research by corroborating that bottom-up attention directed to an item is sufficient for enhancing working memory and, moreover, generalized the effect to the domain of verbal working memory. The subsequent 3 experiments sought to determine how bottom-up attention affects working memory. We considered 2 hypotheses: (1) Bottom-up attention enhances the encoded representation of the stimulus, similar to how voluntary attention functions, or (2) It affects the order of encoding by shifting priority onto the attended stimulus. By manipulating how stimuli were presented (simultaneous/sequential display) and whether the cue predicted the tested items, we found evidence that bottom-up attention improves working memory performance via the order of encoding hypothesis. This finding was observed across change detection and free recall paradigms. In contrast, voluntary attention improved working memory regardless of encoding order and showed greater effects on working memory. We conclude that when multiple information sources compete, bottom-up attention prioritizes the location at which encoding should begin. When encoding order is set, bottom-up attention has little or no benefit to working memory. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  17. Interactions of Top-Down and Bottom-Up Mechanisms in Human Visual Cortex

    PubMed Central

    McMains, Stephanie; Kastner, Sabine

    2011-01-01

    Multiple stimuli present in the visual field at the same time compete for neural representation by mutually suppressing their evoked activity throughout visual cortex, providing a neural correlate for the limited processing capacity of the visual system. Competitive interactions among stimuli can be counteracted by top-down, goal-directed mechanisms such as attention, and by bottom-up, stimulus-driven mechanisms. Because these two processes cooperate in everyday life to bias processing toward behaviorally relevant or particularly salient stimuli, it has proven difficult to study interactions between top-down and bottom-up mechanisms. Here, we used an experimental paradigm in which we first isolated the effects of a bottom-up influence on neural competition by parametrically varying the degree of perceptual grouping in displays that were not attended. Second, we probed the effects of directed attention on the competitive interactions induced with the parametric design. We found that the amount of attentional modulation varied linearly with the degree of competition left unresolved by bottom-up processes, such that attentional modulation was greatest when neural competition was little influenced by bottom-up mechanisms and smallest when competition was strongly influenced by bottom-up mechanisms. These findings suggest that the strength of attentional modulation in the visual system is constrained by the degree to which competitive interactions have been resolved by bottom-up processes related to the segmentation of scenes into candidate objects. PMID:21228167

  18. Magnet Assisted Composite Manufacturing: A Flexible New Technique for Achieving High Consolidation Pressure in Vacuum Bag/Lay-Up Processes.

    PubMed

    Pishvar, Maya; Amirkhosravi, Mehrad; Altan, M Cengiz

    2018-05-17

    This work demonstrates a protocol to improve the quality of composite laminates fabricated by wet lay-up vacuum bag processes using the recently developed magnet assisted composite manufacturing (MACM) technique. In this technique, permanent magnets are utilized to apply a sufficiently high consolidation pressure during the curing stage. To enhance the intensity of the magnetic field, and thus, to increase the magnetic compaction pressure, the magnets are placed on a magnetic top plate. First, the entire procedure of preparing the composite lay-up on a magnetic bottom steel plate using the conventional wet lay-up vacuum bag process is described. Second, placement of a set of Neodymium-Iron-Boron permanent magnets, arranged in alternating polarity, on the vacuum bag is illustrated. Next, the experimental procedures to measure the magnetic compaction pressure and volume fractions of the composite constituents are presented. Finally, methods used to characterize microstructure and mechanical properties of composite laminates are discussed in detail. The results prove the effectiveness of the MACM method in improving the quality of wet lay-up vacuum bag laminates. This method does not require large capital investment for tooling or equipment and can also be used to consolidate geometrically complex composite parts by placing the magnets on a matching top mold positioned on the vacuum bag.

  19. Utilization of waste materials, non-refined materials, and renewable energy in in situ remediation and their sustainability benefits.

    PubMed

    Favara, Paul; Gamlin, Jeff

    2017-12-15

    In the ramp-up to integrating sustainability into remediation, a key industry focus area has been to reduce the environmental footprint of treatment processes. The typical approach to integrating sustainability into remediation projects has been a top-down approach, which involves developing technology options and then applying sustainability thinking to the technology, after it has been conceptualized. A bottom-up approach allows for systems thinking to be included in remedy selection and could potentially result in new or different technologies being considered. When using a bottom-up approach, there is room to consider the utilization of waste materials, non-refined materials, and renewable energy in remediation technology-all of which generally have a smaller footprint than processed materials and traditional forms of energy. By integrating more systems thinking into remediation projects, practitioners can think beyond the traditional technologies typically used and how technologies are deployed. To compare top-down and bottom-up thinking, a traditional technology that is considered very sustainable-enhanced in situ bioremediation-is compared to a successful, but infrequently deployed technology-subgrade biogeochemical reactors. Life Cycle Assessment is used for the evaluation and shows the footprint of the subgrade biogeochemical reactor to be lower in all seven impact categories evaluated, sometimes to a significant degree. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Top-down and bottom-up characterization of nitrated birch pollen allergen Bet v 1a with CZE hyphenated to an Orbitrap mass spectrometer.

    PubMed

    Gusenkov, Sergey; Stutz, Hanno

    2018-02-01

    Tyrosine (Tyr) residues of the major pollen allergen of birch Betula verrucosa, Bet v 1a, were nitrated by peroxynitrite. This modification enhances the allergenicity. Modified tyrosines were identified by analyzing intact allergen variants in combination with top-down and bottom-up approaches. Therefore, a laboratory-built sheath-liquid assisted ESI interface was applied for hyphenation of CE to an Orbitrap mass spectrometer to localize individual nitration sites. The major focus was on identification of primary nitration sites. The top-down approach unambiguously identified Tyr 5 as the most prominent modification site. Fragments from the allergen core and the C-terminal part carried up to three potential nitration sites, respectively. Thus, a bottom-up approach with tryptic digest was used as a complementary strategy which allowed for the unambiguous localization of nitration sites within the respective peptides. Nitration propensity for individual Tyr residues was addressed by comparison of MS signals of nitrated peptides relative to all cognates of homolog primary sequence. Combined data identified surface exposed Tyr 5 and Tyr 66 as major nitration sites followed by less accessible Tyr 158 whereas Tyr 81, 83 and 150 possess a lower nitration tendency and are apparently modified in variants with higher nitration levels. © 2018 The Authors. Electrophoresis published by Wiley-VCH Verlag GmbH & Co. KGaA.

  1. From lists of behaviour change techniques (BCTs) to structured hierarchies: comparison of two methods of developing a hierarchy of BCTs.

    PubMed

    Cane, James; Richardson, Michelle; Johnston, Marie; Ladha, Ruhina; Michie, Susan

    2015-02-01

    Behaviour change technique (BCT) Taxonomy v1 is a hierarchically grouped, consensus-based taxonomy of 93 BCTs for reporting intervention content. To enhance the use and understanding of BCTs, the aims of the present study were to (1) quantitatively examine the 'bottom-up' hierarchical structure of Taxonomy v1, (2) identify whether BCTs can be reliably mapped to theoretical domains using a 'top-down' theoretically driven approach, and (3) identify any overlap between the 'bottom-up' and 'top-down' groupings. The 'bottom-up' structure was examined for higher-order groupings using a dendrogram derived from hierarchical cluster analysis. For the theory-based 'top-down' structure, 18 experts sorted BCTs into 14 theoretical domains. Discriminant Content Validity was used to identify groupings, and chi-square tests and Pearson's residuals were used to examine the overlap between groupings. Behaviour change techniques relating to 'Reward and Punishment' and 'Cues and Cue Responses' were perceived as markedly different to other BCTs. Fifty-nine of the BCTs were reliably allocated to 12 of the 14 theoretical domains; 47 were significant and 12 were of borderline significance. Thirty-four of 208 'bottom-up' × 'top-down' pairings showed greater overlap than expected by chance. However, only six combinations achieved satisfactory evidence of similarity. The moderate overlap between the groupings indicates some tendency to implicitly conceptualize BCTs in terms of the same theoretical domains. Understanding the nature of the overlap will aid the conceptualization of BCTs in terms of theory and application. Further research into different methods of developing a hierarchical taxonomic structure of BCTs for international, interdisciplinary work is now required. Statement of contribution What is already known on this subject? Behaviour change interventions are effective in improving health care and health outcomes. 
The 'active' components of these interventions are behaviour change techniques and over 93 have been identified. Taxonomies of behaviour change techniques require structure to enable potential applications. What does this study add? This study identifies groups of BCTs to aid the recall of BCTs for intervention coding and design. It compares two methods of grouping--'bottom-up' and theory-based 'top-down'--and finds a moderate overlap. Building on identified BCT groups, it examines relationships between theoretical domains and BCTs. © 2014 The British Psychological Society.
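The 'bottom-up' grouping described above rests on standard agglomerative clustering: a pairwise dissimilarity matrix over techniques is built, a dendrogram is derived, and cutting it yields higher-order groups. A minimal sketch with SciPy, using an invented six-technique dissimilarity matrix rather than the study's 93-BCT data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

# Hypothetical pairwise dissimilarities between six techniques
# (0 = always sorted together, 1 = never sorted together).
labels = ["reward", "punishment", "cue", "prompt", "goal", "plan"]
D = np.array([
    [0.0, 0.1, 0.8, 0.8, 0.9, 0.9],
    [0.1, 0.0, 0.8, 0.8, 0.9, 0.9],
    [0.8, 0.8, 0.0, 0.2, 0.7, 0.7],
    [0.8, 0.8, 0.2, 0.0, 0.7, 0.7],
    [0.9, 0.9, 0.7, 0.7, 0.0, 0.1],
    [0.9, 0.9, 0.7, 0.7, 0.1, 0.0],
])

# Average-linkage agglomerative clustering on the condensed matrix.
Z = linkage(squareform(D), method="average")

# Cut the dendrogram into three higher-order groups.
groups = fcluster(Z, t=3, criterion="maxclust")
for name, g in sorted(zip(labels, groups), key=lambda x: x[1]):
    print(g, name)
```

Techniques that sorters rarely separate ("reward"/"punishment", "cue"/"prompt") end up in the same cut-level group, mirroring how the study's dendrogram surfaced 'Reward and Punishment' and 'Cues and Cue Responses' as distinct clusters.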

  2. Getting to the Bottom of L2 Listening Instruction: Making a Case for Bottom-Up Activities

    ERIC Educational Resources Information Center

    Siegel, Joseph; Siegel, Aki

    2015-01-01

    This paper argues for the incorporation of bottom-up activities for English as a foreign language (EFL) listening. It discusses theoretical concepts and pedagogic options for addressing bottom-up aural processing in the EFL classroom as well as how and why teachers may wish to include such activities in lessons. This discussion is augmented by a…

  3. Care and Maintenance.

    ERIC Educational Resources Information Center

    Hampton, Carol D.; Hampton, Carolyn H.

    1980-01-01

    Described is a method for bringing the sea into the classroom by setting up a saltwater aquarium. Included is selection of an aquarium, filtering systems, water (whether natural salt or synthetic sea salts), bottom materials, setting up an aquarium, system stabilization, stocking an aquarium, and maintenance of the aquarium. (DS)

  4. Method and device for tensile testing of cable bundles

    DOEpatents

    Robertson, Lawrence M; Ardelean, Emil V; Goodding, James C; Babuska, Vit

    2012-10-16

    A standard tensile test device is improved to accurately measure the mechanical properties of stranded cables, ropes, and other composite structures wherein a witness is attached to the top and bottom mounting blocks holding the cable under test. The witness is comprised of two parts: a top and a bottom rod of similar diameter with the bottom rod having a smaller diameter stem on its upper end and the top rod having a hollow opening in its lower end into which the stem fits forming a witness joint. A small gap is present between the top rod and the larger diameter portion of the bottom rod. A standard extensometer is attached to the top and bottom rods of the witness spanning this small witness gap. When a force is applied to separate the mounting blocks, the gap in the witness expands the same length that the entire test specimen is stretched.

  5. Top-Down and Bottom-Up Approaches in Engineering 1 T Phase Molybdenum Disulfide (MoS2 ): Towards Highly Catalytically Active Materials.

    PubMed

    Chua, Chun Kiang; Loo, Adeline Huiling; Pumera, Martin

    2016-09-26

    The metallic 1 T phase of MoS2 has been widely identified to be responsible for the improved performances of MoS2 in applications including hydrogen evolution reactions and electrochemical supercapacitors. To this aim, various synthetic methods have been reported to obtain 1 T phase-rich MoS2 . Here, the aim is to evaluate the efficiencies of the bottom-up (hydrothermal reaction) and top-down (chemical exfoliation) approaches in producing 1 T phase MoS2 . It is established in this study that the 1 T phase MoS2 produced through the bottom-up approach contains a high proportion of 1 T phase and demonstrates excellent electrochemical and electrical properties. Its performance in the hydrogen evolution reaction and electrochemical supercapacitors also surpassed that of 1 T phase MoS2 produced through a top-down approach. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. HCFC-142b emissions in China: An inventory for 2000 to 2050 based on bottom-up and top-down methods

    NASA Astrophysics Data System (ADS)

    Han, Jiarui; Li, Li; Su, Shenshen; Hu, Jianxin; Wu, Jing; Wu, Yusheng; Fang, Xuekun

    2014-05-01

    1-Chloro-1,1-difluoroethane (HCFC-142b) is both an ozone-depleting substance included in the Montreal Protocol on Substances that Deplete the Ozone Layer (Montreal Protocol) and a potent greenhouse gas with a high global warming potential. As one of the major HCFC-142b consuming and producing countries in the world, China's control actions will contribute to both mitigating climate change and protecting the ozone layer. Estimating China's HCFC-142b emissions is a crucial step toward understanding its emission status, drawing up a phase-out plan and evaluating mitigation effects. Both bottom-up and top-down methods were adopted in this research to estimate HCFC-142b emissions from China, and the results were compared to test the effectiveness of the two methods and validate the inventory's reliability. First, a national bottom-up emission inventory of HCFC-142b for China during 2000-2012 was established based on the 2006 IPCC Guidelines for National Greenhouse Gas Inventories and the Montreal Protocol, showing that, in contrast to the downward trend revealed by existing results, HCFC-142b emissions kept increasing from 0.1 kt/yr in 2000 to a peak of 14.4 kt/yr in 2012. Meanwhile, a top-down emission estimate was developed using the interspecies correlation method. By correlating atmospheric mixing-ratio data of HCFC-142b and the reference substance HCFC-22 sampled from four representative cities (Beijing, Hangzhou, Lanzhou and Guangzhou, for northern, eastern, western and southern China, respectively), China's HCFC-142b emission in 2012 was calculated to be 16.24 (13.90-18.58) kt, equivalent to 1.06 kt ODP and 37 Tg CO2-eq, accounting for 9.78% (ODP) of total HCFC emissions in China or 30.5% of global HCFC-142b emissions. This result was 12.7% higher than the bottom-up inventory; possible explanations are discussed. The consistency of the two results lends credit to the effectiveness of the methods and the reliability of the results. 
    Finally, future HCFC-142b emissions were projected to 2050. Emissions might experience a continuous increase from 14.9 kt/yr to 97.2 kt/yr under a business-as-usual (BAU) scenario, while a 90% reduction would be obtained by fulfilling the Montreal Protocol, namely a cumulative mitigation of 1578 kt from 2013 to 2050, equal to 103 kt ODP and 3504 Tg CO2 emissions. Therefore, China will contribute tremendously to worldwide ozone protection and global warming mitigation by successfully phasing out HCFC-142b according to the Montreal Protocol schedule.
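The interspecies correlation method used for the top-down estimate scales a tracer's known emission by the observed enhancement ratio of the target species, converted from a molar to a mass ratio. A hedged sketch of that calculation (the mixing-ratio enhancements and the HCFC-22 emission figure below are invented for demonstration; only the molar masses are physical constants):

```python
import numpy as np

# Illustrative above-background mixing-ratio enhancements (ppt) from
# paired air samples; values are made up for demonstration only.
d_hcfc22   = np.array([12.0, 25.0, 40.0, 8.0, 33.0, 18.0])  # tracer
d_hcfc142b = np.array([1.4, 2.9, 4.4, 0.9, 3.8, 2.1])       # target

# Molar enhancement ratio: least-squares slope through the origin.
slope = np.sum(d_hcfc22 * d_hcfc142b) / np.sum(d_hcfc22 ** 2)

M_HCFC22, M_HCFC142B = 86.47, 100.49   # g/mol (CHClF2, CH3CClF2)
E_HCFC22 = 140.0                        # assumed tracer emission, kt/yr

# Scale the tracer emission by the molar slope and the molar-mass ratio.
E_HCFC142B = E_HCFC22 * slope * (M_HCFC142B / M_HCFC22)
print(f"estimated HCFC-142b emission: {E_HCFC142B:.1f} kt/yr")
```

The method's accuracy therefore hinges on two assumptions the abstract implies: that the tracer's emission is well known, and that the two species are co-emitted so their enhancements correlate tightly.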

  7. Diamond anvil cells using boron-doped diamond electrodes covered with undoped diamond insulating layer

    NASA Astrophysics Data System (ADS)

    Matsumoto, Ryo; Yamashita, Aichi; Hara, Hiroshi; Irifune, Tetsuo; Adachi, Shintaro; Takeya, Hiroyuki; Takano, Yoshihiko

    2018-05-01

    Diamond anvil cells using boron-doped metallic diamond electrodes covered with undoped diamond insulating layers have been developed for electrical transport measurements under high pressure. These designed diamonds were grown on a bottom diamond anvil via a nanofabrication process combining microwave plasma-assisted chemical vapor deposition and electron beam lithography. The resistance measurements of a high-quality FeSe superconducting single crystal under high pressure were successfully demonstrated by simply placing the sample and gasket directly on the bottom diamond anvil. The superconducting transition temperature of the FeSe single crystal was increased up to 43 K by applying uniaxial-like pressure.

  8. Impact of spatial proxies on the representation of bottom-up emission inventories: A satellite-based analysis

    NASA Astrophysics Data System (ADS)

    Geng, Guannan; Zhang, Qiang; Martin, Randall V.; Lin, Jintai; Huo, Hong; Zheng, Bo; Wang, Siwen; He, Kebin

    2017-03-01

    Spatial proxies used in bottom-up emission inventories to derive the spatial distributions of emissions are usually empirical and involve additional levels of uncertainty. Although uncertainties in current emission inventories have been discussed extensively, uncertainties resulting from improper spatial proxies have rarely been evaluated. In this work, we investigate the impact of spatial proxies on the representation of gridded emissions by comparing six gridded NOx emission datasets over China developed from the same magnitude of emissions and different spatial proxies. GEOS-Chem-modeled tropospheric NO2 vertical columns simulated from different gridded emission inventories are compared with satellite-based columns. The results show that differences between modeled and satellite-based NO2 vertical columns are sensitive to the spatial proxies used in the gridded emission inventories. The total population density is less suitable for allocating NOx emissions than nighttime light data because population density tends to allocate more emissions to rural areas. Determining the exact locations of large emission sources could significantly strengthen the correlation between modeled and observed NO2 vertical columns. Using vehicle population and an updated road network for the on-road transport sector could substantially enhance urban emissions and improve the model performance. When further applying industrial gross domestic product (IGDP) values for the industrial sector, modeled NO2 vertical columns could better capture pollution hotspots in urban areas and exhibit the best performance of the six cases compared to satellite-based NO2 vertical columns (slope = 1.01 and R2 = 0.85). This analysis provides a framework for information from satellite observations to inform bottom-up inventory development. In the future, more effort should be devoted to the representation of spatial proxies to improve spatial patterns in bottom-up emission inventories.
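Proxy-based gridding as discussed here amounts to spreading a known regional total across grid cells in proportion to the chosen proxy, which is why the proxy choice alone reshapes the spatial pattern while conserving the total. A toy illustration (all values invented):

```python
import numpy as np

# A toy 3x3 region: the total NOx emission (kt/yr) is known; a gridded
# inventory distributes it in proportion to a chosen spatial proxy.
E_total = 90.0

population = np.array([[ 5.,  2.,  1.],
                       [20., 50., 10.],
                       [ 3.,  8.,  1.]])   # proxy A: population (a.u.)
nightlight = np.array([[ 1.,  0.,  0.],
                       [ 4., 30.,  2.],
                       [ 0.,  2.,  0.]])   # proxy B: nighttime light (a.u.)

def allocate(total, proxy):
    """Spread a regional total over cells as total * proxy / sum(proxy)."""
    return total * proxy / proxy.sum()

E_pop   = allocate(E_total, population)
E_light = allocate(E_total, nightlight)

# Both grids conserve the regional total, but the nightlight proxy
# concentrates more of it in the urban core cell (1, 1).
print(E_pop[1, 1], E_light[1, 1])
```

Because population is nonzero even in rural cells, the population-weighted grid smears emissions outward, the same tendency the study observes when comparing modeled columns against satellite observations.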

  9. Bottom-up electrochemical preparation of solid-state carbon nanodots directly from nitriles/ionic liquids using carbon-free electrodes and the applications in specific ferric ion detection and cell imaging

    NASA Astrophysics Data System (ADS)

    Niu, Fushuang; Xu, Yuanhong; Liu, Mengli; Sun, Jing; Guo, Pengran; Liu, Jingquan

    2016-03-01

    Carbon nanodots (C-dots), a new type of potential alternative to conventional semiconductor quantum dots, have attracted considerable attention for various applications including bio-chemical sensing and cell imaging, due to their chemical inertness, low toxicity and flexible functionalization. Various methods, including electrochemical (EC) methods, have been reported for the synthesis of C-dots; however, complex procedures and/or carbon source-containing electrodes are often required. Herein, solid-state C-dots were simply prepared by bottom-up EC carbonization of nitriles (e.g. acetonitrile) in the presence of an ionic liquid [e.g. 1-butyl-3-methylimidazolium hexafluorophosphate (BMIMPF6)], using carbon-free electrodes. Due to the positive charges of BMIM+ on the C-dots, the final products presented in a precipitate form on the cathode, and the unreacted nitriles and BMIMPF6 could be easily removed by simple vacuum filtration. The as-prepared solid-state C-dots can be well dispersed in an aqueous medium with excellent photoluminescence properties. The average size of the C-dots was found to be 3.02 ± 0.12 nm as evidenced by transmission electron microscopy. Other techniques such as UV-vis spectroscopy, fluorescence spectroscopy, X-ray photoelectron spectroscopy and atomic force microscopy were applied to characterize the C-dots and to analyze their possible generation mechanism. These C-dots have been successfully applied in efficient cell imaging and specific ferric ion detection.

  10. The analysis of bottom forming process for hybrid heating device

    NASA Astrophysics Data System (ADS)

    Bałon, Paweł; Świątoniowski, Andrzej; Kiełbasa, Bartłomiej

    2017-10-01

    In this paper the authors present an unusual method for bottom forming applicable to various industrial purposes, including the manufacture of water heaters and pressure equipment. This method allows the bottom of a given piece of stainless steel to be fabricated into a pre-determined shape conforming to the DIN standard, which specifies the most advantageous dimensions for the bottom cross section in terms of working pressure loading. The authors verified the method numerically and experimentally, producing a tool designed to form bottoms of the specified geometry. Many problems are encountered during the design and production of such parts, especially excessive sheet wrinkling over a large area of the part. The experiment showed that designing such elements without experience and numerical analysis would result in highly wrinkled parts, a defect that would make them impossible to assemble with the cylindrical part. Many tool shops employ a method for drawing elements with a spherical surface that involves additional spinning, stamping, and grading operations, which greatly increases the cost of production. The authors present and compare two forming methods for spherical and parabolic objects, and experimentally confirm the validity of the sheet-reversing method with adequate pressure force. The applied method produces parts in a single drawing operation followed by laser or water cutting to obtain a round blank. This reduces tooling costs by requiring just one tool, which can be placed on any hydraulic press with a minimum force of 2 000 kN.

  11. Bottom-up modeling approach for the quantitative estimation of parameters in pathogen-host interactions

    PubMed Central

    Lehnert, Teresa; Timme, Sandra; Pollmächer, Johannes; Hünniger, Kerstin; Kurzai, Oliver; Figge, Marc Thilo

    2015-01-01

    Opportunistic fungal pathogens can cause bloodstream infection and severe sepsis upon entering the blood stream of the host. The early immune response in human blood comprises the elimination of pathogens by antimicrobial peptides and innate immune cells, such as neutrophils or monocytes. Mathematical modeling is a predictive method to examine these complex processes and to quantify the dynamics of pathogen-host interactions. Since model parameters are often not directly accessible from experiment, their estimation is required by calibrating model predictions with experimental data. Depending on the complexity of the mathematical model, parameter estimation can be associated with excessively high computational costs in terms of run time and memory. We apply a strategy for reliable parameter estimation where different modeling approaches with increasing complexity are used that build on one another. This bottom-up modeling approach is applied to an experimental human whole-blood infection assay for Candida albicans. Aiming for the quantification of the relative impact of different routes of the immune response against this human-pathogenic fungus, we start from a non-spatial state-based model (SBM), because this level of model complexity allows estimating a priori unknown transition rates between various system states by the global optimization method simulated annealing. Building on the non-spatial SBM, an agent-based model (ABM) is implemented that incorporates the migration of interacting cells in three-dimensional space. The ABM takes advantage of estimated parameters from the non-spatial SBM, leading to a decreased dimensionality of the parameter space. This space can be scanned using a local optimization approach, i.e., least-squares error estimation based on an adaptive regular grid search, to predict cell migration parameters that are not accessible in experiment. 
In the future, spatio-temporal simulations of whole-blood samples may enable timely stratification of sepsis patients by distinguishing hyper-inflammatory from paralytic phases in immune dysregulation. PMID:26150807

  12. Bottom-up modeling approach for the quantitative estimation of parameters in pathogen-host interactions.

    PubMed

    Lehnert, Teresa; Timme, Sandra; Pollmächer, Johannes; Hünniger, Kerstin; Kurzai, Oliver; Figge, Marc Thilo

    2015-01-01

Opportunistic fungal pathogens can cause bloodstream infection and severe sepsis upon entering the host's bloodstream. The early immune response in human blood comprises the elimination of pathogens by antimicrobial peptides and innate immune cells, such as neutrophils or monocytes. Mathematical modeling is a predictive method for examining these complex processes and quantifying the dynamics of pathogen-host interactions. Since model parameters are often not directly accessible from experiment, they must be estimated by calibrating model predictions against experimental data. Depending on the complexity of the mathematical model, parameter estimation can incur excessively high computational costs in terms of run time and memory. We apply a strategy for reliable parameter estimation in which modeling approaches of increasing complexity build on one another. This bottom-up modeling approach is applied to an experimental human whole-blood infection assay for Candida albicans. Aiming to quantify the relative impact of different routes of the immune response against this human-pathogenic fungus, we start from a non-spatial state-based model (SBM), because this level of model complexity allows a priori unknown transition rates between system states to be estimated with the global optimization method simulated annealing. Building on the non-spatial SBM, an agent-based model (ABM) is implemented that incorporates the migration of interacting cells in three-dimensional space. The ABM takes advantage of parameters estimated with the non-spatial SBM, reducing the dimensionality of the parameter space. This reduced space can be scanned using a local optimization approach, i.e., least-squares error estimation based on an adaptive regular grid search, to predict cell migration parameters that are not accessible in experiment. 
In the future, spatio-temporal simulations of whole-blood samples may enable timely stratification of sepsis patients by distinguishing hyper-inflammatory from paralytic phases in immune dysregulation.
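The simulated-annealing step of the bottom-up strategy can be sketched in a few lines of Python. Everything below is illustrative rather than the authors' code: a single-rate killing model stands in for the state-based model, the survival data are synthetic, and the cooling schedule and step size are arbitrary choices.

```python
import math
import random

def model(rate, times):
    # One-step killing kinetics: fraction of pathogens still alive at time t.
    # A deliberately minimal stand-in for a state-based model's prediction.
    return [math.exp(-rate * t) for t in times]

def sse(rate, times, data):
    # Sum-of-squares distance between model prediction and measurements.
    return sum((m - d) ** 2 for m, d in zip(model(rate, times), data))

def anneal(times, data, steps=20000, t0=1.0, seed=0):
    """Estimate the transition rate by simulated annealing."""
    rng = random.Random(seed)
    rate, e = 1.0, sse(1.0, times, data)
    best, e_best = rate, e
    for k in range(steps):
        temp = t0 * (1 - k / steps) + 1e-9       # linear cooling schedule
        cand = abs(rate + rng.gauss(0, 0.1))     # local move, keep rate >= 0
        e_cand = sse(cand, times, data)
        # Metropolis criterion: always accept improvements, sometimes accept
        # worse moves while the temperature is still high.
        if e_cand < e or rng.random() < math.exp((e - e_cand) / temp):
            rate, e = cand, e_cand
            if e < e_best:
                best, e_best = rate, e
    return best

# Synthetic "whole-blood assay" survival data with a true rate of 0.5 per hour.
times = [0, 1, 2, 4, 6]
data = [math.exp(-0.5 * t) for t in times]
estimated = anneal(times, data)
```

In the paper's setting the state space and rate vector are much larger, but the accept/reject loop is the same idea; the subsequent grid search over the ABM's remaining parameters replaces the stochastic proposal with an exhaustive scan.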

  13. A bottom-up approach to assess verbal therapeutic techniques. Development of the Psychodynamic Interventions List (PIL)

    PubMed Central

    Gumz, Antje; Neubauer, Karolin; Horstkotte, Julia Katharina; Geyer, Michael; Löwe, Bernd; Murray, Alexandra M.; Kästner, Denise

    2017-01-01

Objective Knowing which specific verbal techniques “good” therapists use in their daily work is important for training and evaluation purposes. In order to systematize what is being practiced in the field, our aim was to empirically identify verbal techniques applied in psychodynamic sessions and to differentiate them according to their basic semantic features using a bottom-up, qualitative approach. Method (mixed-methods design): In a comprehensive qualitative study, types of techniques were identified at the level of utterances based on transcribed psychodynamic therapy sessions using Qualitative Content Analysis (4211 utterances). The definitions of the identified categories were successively refined and modified until saturation was achieved. In a subsequent quantitative study, inter-rater reliability was assessed both at the level of utterances (n = 8717) and at the session level (n = 38). The convergent validity of the categories was investigated by analyzing associations with the Interpretive and Supportive Technique Scale (ISTS). Results The inductive approach resulted in a classification system with 37 categories (Psychodynamic Interventions List, PIL). According to their semantic content, the categories can be allocated to three dimensions: form (24 categories), thematic content (9) and temporal focus (4). Most categories showed good or excellent inter-rater reliability, and the expected associations with the ISTS were predominantly confirmed. The rare use of the residual category “Other” suggests that the identified categories might comprehensively describe the breadth of applied techniques. Conclusions The atheoretical orientation and the clear focus on overt linguistic features should enable the PIL to be used without intensive training or prior theoretical knowledge. The PIL can be used to investigate the links between verbal techniques derived from practice and micro-outcomes (at the session level) as well as the overall therapeutic outcomes. 
This approach might enable us to determine to what extent the outcome of therapy is due to unintended or non-theoretically relevant techniques. PMID:28837582

  14. Utilizing BlueJ to Teach Polymorphism in an Advanced Object-Oriented Programming Course

    ERIC Educational Resources Information Center

    Alkazemi, Basem Y.; Grami, Grami M.

    2012-01-01

Teaching Polymorphism can be best implemented by using a combination of bottom-up and top-down approaches. However, from our observation and students' self-reporting, the former seems to be predominant in the Saudi context. We try to investigate whether applying a more balanced approach in teaching the comprehensive concept of Polymorphism…

  15. Multiple constraint analysis of regional land-surface carbon flux

    Treesearch

    D.P. Turner; M. Göckede; B.E. Law; W.D. Ritts; W.B. Cohen; Z. Yang; T. Hudiburg; R. Kennedy; M. Duane

    2011-01-01

We applied and compared bottom-up (process model-based) and top-down (atmospheric inversion-based) scaling approaches to evaluate the spatial and temporal patterns of net ecosystem production (NEP) over a 2.5 × 10⁵ km² area (the state of Oregon) in the western United States. Both approaches indicated a carbon sink over this...

  16. The Three Stages of Coding and Decoding in Listening Courses of College Japanese Specialty

    ERIC Educational Resources Information Center

    Yang, Fang

    2008-01-01

Research papers on the teaching of listening published in recent years focus mainly on the theoretical significance of decoding for the training of listening comprehension ability. Although in many research papers the bottom-up approach and top-down approach, information processing mode theory, are applied to illustrate decoding and to emphasize the…

  17. Changing Educational Traditions with the Change Laboratory

    ERIC Educational Resources Information Center

    Botha, Louis Royce

    2017-01-01

    This paper outlines the use of a form of research intervention known as the Change Laboratory to illustrate how the processes of organisational change initiated at a secondary school can be applied to develop tools and practices to analyse and potentially re-make educational traditions in a bottom-up manner. In this regard it is shown how a…

  18. Clustering Heart Rate Dynamics Is Associated with β-Adrenergic Receptor Polymorphisms: Analysis by Information-Based Similarity Index

    PubMed Central

    Yang, Albert C.; Tsai, Shih-Jen; Hong, Chen-Jee; Wang, Cynthia; Chen, Tai-Jui; Liou, Ying-Jay; Peng, Chung-Kang

    2011-01-01

    Background Genetic polymorphisms in the gene encoding the β-adrenergic receptors (β-AR) have a pivotal role in the functions of the autonomic nervous system. Using heart rate variability (HRV) as an indicator of autonomic function, we present a bottom-up genotype–phenotype analysis to investigate the association between β-AR gene polymorphisms and heart rate dynamics. Methods A total of 221 healthy Han Chinese adults (59 males and 162 females, aged 33.6±10.8 years, range 19 to 63 years) were recruited and genotyped for three common β-AR polymorphisms: β1-AR Ser49Gly, β2-AR Arg16Gly and β2-AR Gln27Glu. Each subject underwent two hours of electrocardiogram monitoring at rest. We applied an information-based similarity (IBS) index to measure the pairwise dissimilarity of heart rate dynamics among study subjects. Results With the aid of agglomerative hierarchical cluster analysis, we categorized subjects into major clusters, which were found to have significantly different distributions of β2-AR Arg16Gly genotype. Furthermore, the non-randomness index, a nonlinear HRV measure derived from the IBS method, was significantly lower in Arg16 homozygotes than in Gly16 carriers. The non-randomness index was negatively correlated with parasympathetic-related HRV variables and positively correlated with those HRV indices reflecting a sympathovagal shift toward sympathetic activity. Conclusions We demonstrate a bottom-up categorization approach combining the IBS method and hierarchical cluster analysis to detect subgroups of subjects with HRV phenotypes associated with β-AR polymorphisms. Our results provide evidence that β2-AR polymorphisms are significantly associated with the acceleration/deceleration pattern of heart rate oscillation, reflecting the underlying mode of autonomic nervous system control. PMID:21573230
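A toy sketch of the information-based similarity (IBS) idea may make the categorization step concrete. All values below are hypothetical, and the published index is computed on long ECG recordings with a specific word length and normalization; here heart-rate dynamics are symbolized as binary words, the words are ranked by frequency, and the dissimilarity of two series is an entropy-weighted difference of their rank orders.

```python
import math
from collections import Counter
from itertools import product

def symbolize(series, m=3):
    # Map an RR-interval series to overlapping m-bit words:
    # each bit is 1 where the series increases, 0 otherwise.
    bits = [1 if b > a else 0 for a, b in zip(series, series[1:])]
    return [tuple(bits[i:i + m]) for i in range(len(bits) - m + 1)]

def rank_order(words, m=3):
    # Rank all 2**m possible words by occurrence frequency
    # (ties broken deterministically by the word itself).
    counts = Counter(words)
    ordered = sorted(product((0, 1), repeat=m), key=lambda w: (-counts[w], w))
    return {w: r for r, w in enumerate(ordered)}, counts

def ibs(series_a, series_b, m=3):
    """Simplified IBS dissimilarity: entropy-weighted, normalized rank-order
    difference of m-bit word frequencies. 0 = identical dynamics."""
    ranks_a, counts_a = rank_order(symbolize(series_a, m), m)
    ranks_b, counts_b = rank_order(symbolize(series_b, m), m)
    tot_a, tot_b = sum(counts_a.values()), sum(counts_b.values())
    num = den = 0.0
    for w in product((0, 1), repeat=m):
        pa, pb = counts_a[w] / tot_a, counts_b[w] / tot_b
        ha = -pa * math.log(pa) if pa else 0.0   # Shannon information weight
        hb = -pb * math.log(pb) if pb else 0.0
        num += abs(ranks_a[w] - ranks_b[w]) * (ha + hb)
        den += ha + hb
    return num / ((2 ** m - 1) * den) if den else 0.0

# Hypothetical RR intervals (ms); a series is maximally similar to itself.
x = [800, 810, 790, 805, 820, 795, 788, 812, 801, 799, 815, 790]
d_self = ibs(x, x)
d_other = ibs(x, list(reversed(x)))
```

Pairwise dissimilarities computed this way form the distance matrix that agglomerative hierarchical clustering operates on.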

  19. Spreadsheet for designing valid least-squares calibrations: A tutorial.

    PubMed

    Bettencourt da Silva, Ricardo J N

    2016-02-01

Instrumental methods of analysis are used to define the price of goods, the compliance of products with a regulation, or the outcome of fundamental or applied research. These methods can only play their role properly if the reported information is objective and its quality is fit for the intended use. If measurement results are reported with an adequately small measurement uncertainty, both of these goals are achieved. The evaluation of the measurement uncertainty can be performed by the bottom-up approach, which involves a detailed description of the measurement process, or by a pragmatic top-down approach that quantifies major uncertainty components from global performance data. The bottom-up approach is used less frequently because of the need to master the quantification of the individual components responsible for random and systematic effects that affect measurement results. This work presents a tutorial that can be easily used by non-experts for the accurate evaluation of the measurement uncertainty of instrumental methods of analysis calibrated using least-squares regressions. The tutorial includes the definition of the calibration interval, the assessment of instrumental response homoscedasticity, the definition of the calibrator preparation procedure required for least-squares regression model application, the assessment of instrumental response linearity and the evaluation of measurement uncertainty. The developed measurement model is only applicable in calibration ranges where signal precision is constant. An MS-Excel file is made available to allow easy application of the tutorial. This tool can be useful for cases where top-down approaches cannot produce results with adequately low measurement uncertainty. An example of the application of this tool to the determination of nitrate in water by ion chromatography is presented. Copyright © 2015 Elsevier B.V. All rights reserved.
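The core of such a tutorial can be condensed into a short script. This is a generic sketch of the classical least-squares calibration formulas, not the spreadsheet itself; the calibrator concentrations and instrument responses are invented, and homoscedastic signals are assumed, as the unweighted regression model requires.

```python
import math

def linear_calibration(x, y):
    """Ordinary least-squares calibration line y = a + b*x."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / sxx
    a = ybar - b * xbar
    # Residual standard deviation, n - 2 degrees of freedom.
    s_yx = math.sqrt(sum((yi - a - b * xi) ** 2
                         for xi, yi in zip(x, y)) / (n - 2))
    return a, b, s_yx, xbar, sxx, n

def interpolate(cal, y0_replicates):
    """Concentration of an unknown and its standard uncertainty from the
    classical s_x0 formula for least-squares calibrations."""
    a, b, s_yx, xbar, sxx, n = cal
    m = len(y0_replicates)
    y0 = sum(y0_replicates) / m
    x0 = (y0 - a) / b
    u_x0 = (s_yx / abs(b)) * math.sqrt(1 / m + 1 / n + (x0 - xbar) ** 2 / sxx)
    return x0, u_x0

# Hypothetical nitrate calibrators (mg/L) and instrument responses.
conc = [0.5, 1.0, 2.0, 4.0, 8.0]
resp = [0.052, 0.101, 0.199, 0.402, 0.798]
cal = linear_calibration(conc, resp)
x0, u_x0 = interpolate(cal, [0.305, 0.298, 0.301])   # three replicate signals
```

The `1/m + 1/n` structure makes the practical levers visible: replicating the unknown's signal and widening the calibration both shrink the interpolated uncertainty.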

  20. Self-organization of maze-like structures via guided wrinkling.

    PubMed

    Bae, Hyung Jong; Bae, Sangwook; Yoon, Jinsik; Park, Cheolheon; Kim, Kibeom; Kwon, Sunghoon; Park, Wook

    2017-06-01

Sophisticated three-dimensional (3D) structures found in nature are self-organized by bottom-up natural processes. To artificially construct these complex systems, various bottom-up fabrication methods, designed to transform 2D structures into 3D structures, have been developed as alternatives to conventional top-down lithography processes. We present a different self-organization approach, in which we construct microstructures that are periodic and ordered but have a random architecture, like mazes. For this purpose, we transformed planar surfaces using wrinkling to directly use randomly generated ridges as maze walls. Highly regular maze structures, consisting of several tessellations with customized designs, were fabricated by precisely controlling wrinkling with the ridge-guiding structure, analogous to the creases in origami. The method presented here could have widespread applications in various material systems with multiple length scales.

  1. Evaluation of bottom-up and downscaled emission inventories for Paris and consequences for estimating urban air pollution increments

    NASA Astrophysics Data System (ADS)

    Timmermans, R.; Denier van der Gon, H.; Segers, A.; Honore, C.; Perrussel, O.; Builtjes, P.; Schaap, M.

    2012-04-01

Since a major part of the Earth's population lives in cities, it is of great importance to correctly characterise the air pollution levels over these urban areas. Many studies in the past have been dedicated to this subject and have determined so-called urban increments: the impact of large cities on air pollution levels. These increments are usually determined with models driven by so-called downscaled emission inventories, in which official country total emissions are gridded using information on, for example, population density and the location of industries and roads. The question is how accurate such downscaled inventories are over cities or large urban areas. Within the EU FP7 MEGAPOLI project, a new emission inventory has been produced that includes refined local emission data for two European megacities (Paris, London) and two urban conglomerations (the Po valley, Italy, and the Rhine-Ruhr region, Germany) based on a bottom-up approach. The inventory has comparable national totals but remarkable differences at the city scale. Such a bottom-up inventory is thought to be more accurate as it contains local knowledge. Within this study we compared modelled nitrogen dioxide (NO2) and particulate matter (PM) concentrations from the LOTOS-EUROS chemistry transport model driven by a conventional downscaled emission inventory (TNO-MACC inventory) with concentrations from the same model driven by the new MEGAPOLI bottom-up emission inventory, focusing on the Paris region. Model predictions for Paris improve significantly with the MEGAPOLI inventory. Both the emissions and the simulated average PM concentrations over urban sites in Paris are much lower due to the different spatial distribution of the anthropogenic emissions. 
The difference for the nearby rural stations is small, implying that the urban increment for PM simulated with the bottom-up emission inventory is also much smaller than with the downscaled emission inventory. Urban increments for PM calculated with downscaled emissions, as is common practice, might therefore be overestimated. This finding is likely to apply to other European megacities as well.
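The contrast between the two inventory types can be illustrated with a deliberately crude sketch. The cells, proxy weights, and emission numbers below are all invented; they serve only to show why proxy-based downscaling tends to concentrate emissions, and hence the urban increment, in the city cell.

```python
def downscale(national_total, proxy):
    """Proxy-based downscaling: allocate a national emission total to grid
    cells in proportion to a spatial proxy such as population density."""
    total = sum(proxy.values())
    return {cell: national_total * w / total for cell, w in proxy.items()}

# Hypothetical 3-cell domain: one urban core and two rural cells.
population = {"urban": 8.0e6, "rural_a": 1.0e6, "rural_b": 1.0e6}
national_pm = 100.0  # kt/yr of primary PM, illustrative only

top_down = downscale(national_pm, population)

# A bottom-up inventory built from local activity data can attribute part of
# the "urban" total elsewhere (e.g. to industry outside the core) while still
# conserving the national total, lowering the simulated urban increment.
bottom_up = {"urban": 65.0, "rural_a": 18.0, "rural_b": 17.0}

def urban_increment(inv):
    # Urban increment proxy: urban cell minus the mean rural background.
    return inv["urban"] - (inv["rural_a"] + inv["rural_b"]) / 2

inc_td = urban_increment(top_down)
inc_bu = urban_increment(bottom_up)
```

Both inventories sum to the same national total; only the spatial allocation, and therefore the apparent urban increment, differs.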

  2. Developing a Cognitive Training Strategy for First-Episode Schizophrenia: Integrating Bottom-Up and Top-Down Approaches

    PubMed Central

    Nuechterlein, Keith H.; Ventura, Joseph; Subotnik, Kenneth L.; Hayata, Jacqueline N.; Medalia, Alice; Bell, Morris D.

    2014-01-01

    It is clear that people with schizophrenia typically have cognitive problems in multiple domains as part of their illness. The cognitive deficits are among the main contributors to limitations in their everyday functioning, including their work recovery. Cognitive remediation has been applied successfully to help people with long-term, persistent schizophrenia to improve their cognitive functioning, but it is only beginning to be applied with individuals who have recently had a first episode of psychosis. Several different approaches to cognitive training have been developed. Some approaches emphasize extensive systematic practice with lower-level cognitive processes and building toward higher-level processes (“bottom-up”), while others emphasize greater focus on high-level cognitive processes that normally integrate and organize lower-level processes (“top-down”). Each approach has advantages and disadvantages for a disorder like schizophrenia, with its multiple levels of cognitive dysfunction. In addition, approaches to cognitive remediation differ in the extent to which they systematically facilitate transfer of learning to everyday functioning. We describe in this article the cognitive training approach that was developed for a UCLA study of people with a recent first episode of schizophrenia, a group that may benefit greatly from early intervention that focuses on cognition and recovery of work functioning. This approach integrated bottom-up and top-down computerized cognitive training and incorporated an additional weekly group session to bridge between computerized training and application to everyday work and school functioning. PMID:25489275

  3. Bottom-up fabrication of paper-based microchips by blade coating of cellulose microfibers on a patterned surface.

    PubMed

    Gao, Bingbing; Liu, Hong; Gu, Zhongze

    2014-12-23

We report a method for the bottom-up fabrication of paper-based capillary microchips by the blade coating of cellulose microfibers on a patterned surface. The fabrication process is similar to the paper-making process, in which an aqueous suspension of cellulose microfibers is used as the starting material and is blade-coated onto a polypropylene substrate patterned using an inkjet printer. After water evaporation, the cellulose microfibers form a porous, hydrophilic, paper-like pattern that wicks aqueous solution by capillary action. This method enables simple, fast, inexpensive fabrication of paper-based capillary channels with both width and height down to about 10 μm. Using this method, we fabricated a capillary microfluidic chip for the colorimetric detection of glucose and total protein; the assay requires only 0.30 μL of sample, 240 times less than for paper devices fabricated using photolithography.

  4. A numerical study of circulation driven by mixing over a submarine bank

    NASA Astrophysics Data System (ADS)

    Cummins, Patrick F.; Foreman, Michael G. G.

    1998-04-01

A primitive equation model is applied to study the spin-up of a linearly stratified, rotating fluid over an isolated topographic bank. The model has vertical eddy mixing coefficients that decay away from the bottom over a specified e-folding scale. No external flows are imposed, and a circulation develops due solely to diffusion over the sea bed. Vertical mixing, coupled with the condition of zero diffusive flux of heat through the sea floor, leads to a distortion of isothermal surfaces near the bottom. The associated radial pressure gradients drive a radial-overturning circulation with upslope flow just above the bottom and downslope flows at greater height. Coriolis forces on the radial flows accelerate a vertically sheared azimuthal (alongslope) circulation. Near the bottom the azimuthal motion is cyclonic (upwelling favourable), while outside the boundary layer, the motion is anticyclonic. Sensitivity experiments show that this pattern is robust and maintained even with constant mixing coefficients. Attention is given to the driving mechanism for the depth-averaged azimuthal motion. An analysis of the relative angular momentum balance determines that the torque associated with bottom stresses drives the anticyclonic depth-averaged flow. In terms of vorticity, the anticyclonic vortex over the bank arises due to the curl of bottom stress divided by the depth. A parameter sensitivity study indicates that the depth-averaged flow is relatively insensitive to variations in the bottom drag coefficient.

  5. Study on an undershot cross-flow water turbine

    NASA Astrophysics Data System (ADS)

    Nishi, Yasuyuki; Inagaki, Terumi; Li, Yanrong; Omiya, Ryota; Fukutomi, Junichiro

    2014-06-01

This study aims to develop a water turbine suitable for ultra-low heads in open channels, with the end goal being the effective utilization of untapped hydroelectric energy in agricultural water channels. We performed tests by applying a cross-flow runner to an open channel as an undershot water turbine while attempting to simplify the structure and eliminate the casing. We experimentally investigated the flow fields and performance of this undershot cross-flow water turbine at a constant flow rate. In addition, we compared existing undershot water turbines with our undershot cross-flow water turbine after attaching a bottom plate to the runner. From the results, we were able to clarify the following. Although the effective head for cross-flow runners with no bottom plate was lower than that of existing runners equipped with a bottom plate, the power output is greater in the high rotational speed range because of the high turbine efficiency. Also, the runner with no bottom plate differed from runners that had a bottom plate in that no water was wound up by the blades or retained between the blades, and the former received twice the flow due to the flow-through effect. As a result, the turbine efficiency was greater for runners with no bottom plate over the full rotational speed range compared with that found in runners that had a bottom plate.

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kokkoris, George; Boudouvis, Andreas G.; Gogolides, Evangelos

An integrated framework for the neutral flux calculation inside trenches and holes during plasma etching is described, and a comparison between the two types of structure in a number of applications is presented. First, a detailed and functional set of equations for the neutral and ion flux calculations inside long trenches and holes with cylindrical symmetry is explicitly formulated. This set is based on early works [T. S. Cale and G. B. Raupp, J. Vac. Sci. Technol. B 8, 1242 (1990); V. K. Singh et al., J. Vac. Sci. Technol. B 10, 1091 (1992)], and includes new equations for the case of holes with cylindrical symmetry. Second, a method for the solution of the respective numerical task, i.e., one or a set of linear or nonlinear integral equations, is described. This method includes a coupling algorithm with a surface chemistry model and resolves the singularity problem of the integral equations. Third, the fluxes inside trenches and holes are compared. The flux from reemission is the major portion of the local flux at the bottom of both types of structure. The framework is applied in SiO₂ etching by fluorocarbon plasmas to predict the increased intensity of reactive ion etching lag in SiO₂ holes compared to trenches. It is also applied in deep Si etching: by calculating the flux of F atoms at the bottom of very high aspect ratio (up to 150) Si trenches and holes during the gas chopping process, the aspect ratio at which the flux of F atoms is eliminated and etching practically stops is estimated.
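The structure of such a flux calculation (though not the actual view-factor kernel, which the paper derives from the trench and hole geometry) can be sketched as a fixed-point iteration on the discretized reemission balance. The three-segment kernel and direct-flux values below are invented solely to make the iteration runnable.

```python
def total_flux(direct, kernel, sticking, iters=200):
    """Fixed-point iteration for the discretized reemission balance
    flux[i] = direct[i] + (1 - s) * sum_j kernel[i][j] * flux[j],
    i.e. local flux = direct flux + flux reemitted from other surfaces."""
    n = len(direct)
    flux = direct[:]
    for _ in range(iters):
        flux = [direct[i] + (1 - sticking) *
                sum(kernel[i][j] * flux[j] for j in range(n))
                for i in range(n)]
    return flux

# Toy 3-segment trench (upper sidewall, lower sidewall, bottom). The direct
# flux decreases with depth; kernel[i][j] is an invented fraction of the flux
# reemitted from segment j that lands on segment i (the rest escapes).
direct = [1.0, 0.3, 0.05]
kernel = [[0.0, 0.3, 0.2],
          [0.3, 0.0, 0.5],
          [0.2, 0.4, 0.0]]
flux = total_flux(direct, kernel, sticking=0.1)
reemitted_share_at_bottom = (flux[2] - direct[2]) / flux[2]
```

With a low sticking coefficient the iteration reproduces the qualitative result quoted above: at the trench bottom, most of the local flux arrives by reemission rather than directly.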

  7. Bottom-up or top-down: unit cost estimation of tuberculosis diagnostic tests in India.

    PubMed

    Rupert, S; Vassall, A; Raizada, N; Khaparde, S D; Boehme, C; Salhotra, V S; Sachdeva, K S; Nair, S A; Hoog, A H Van't

    2017-04-01

Our objective was to obtain unit costs of diagnostic tests for tuberculosis (TB) and drug-resistant TB. Of 18 sites that participated in an implementation study of the Xpert® MTB/RIF assay in India, we selected five microscopy centres and two reference laboratories, purposely chosen to capture regional variations and different laboratory types. Both bottom-up and top-down methods were used to estimate unit costs. At the microscopy centres, mean bottom-up unit costs were respectively US$0.83 (range US$0.60-US$1.10) and US$12.29 (US$11.61-US$12.89) for sputum smear microscopy and Xpert. At the reference laboratories, mean unit costs were US$1.69 for the decontamination procedure, US$9.83 for a solid culture, US$11.06 for a liquid culture, US$29.88 for a drug susceptibility test, and US$18.18 for a line-probe assay. Top-down mean unit cost estimates were higher for all tests; for sputum smear microscopy and Xpert these increased to US$1.51 and US$13.58, respectively. The difference between bottom-up and top-down estimates was greatest for tests performed at the reference laboratories. These unit costs for TB diagnostics can be used to estimate resource requirements and cost-effectiveness in India, taking into account geographical location, laboratory type and capacity utilisation.
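The two costing approaches reduce to a simple contrast that can be sketched directly. The input quantities and prices below are invented, chosen only to land near the reported order of magnitude, and the cost categories are illustrative rather than the study's actual ingredient list.

```python
def bottom_up_unit_cost(ingredients):
    """Bottom-up: sum the resources actually measured per test,
    each given as (quantity, unit price)."""
    return sum(qty * price for qty, price in ingredients)

def top_down_unit_cost(total_expenditure, tests_performed):
    """Top-down: divide total laboratory expenditure by test throughput,
    so idle capacity and shared overheads inflate the unit cost."""
    return total_expenditure / tests_performed

# Hypothetical Xpert MTB/RIF inputs, (quantity, unit price in US$).
xpert_inputs = [
    (1, 9.98),   # cartridge
    (0.1, 8.0),  # staff time: hours per test x hourly cost
    (1, 1.5),    # equipment depreciation share per test
]
bu = bottom_up_unit_cost(xpert_inputs)
td = top_down_unit_cost(27_160.0, 2_000)  # annual spend / tests run
```

The gap `td - bu` is exactly the quantity the abstract highlights: expenditure that top-down accounting spreads across tests but that bottom-up ingredient tracing does not attribute to any single test.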

  8. Bottom-up Attention Orienting in Young Children with Autism

    ERIC Educational Resources Information Center

    Amso, Dima; Haas, Sara; Tenenbaum, Elena; Markant, Julie; Sheinkopf, Stephen J.

    2014-01-01

    We examined the impact of simultaneous bottom-up visual influences and meaningful social stimuli on attention orienting in young children with autism spectrum disorders (ASDs). Relative to typically-developing age and sex matched participants, children with ASDs were more influenced by bottom-up visual scene information regardless of whether…

  9. Bottom-up vs. top-down effects on terrestrial insect herbivores: a meta-analysis.

    PubMed

    Vidal, Mayra C; Murphy, Shannon M

    2018-01-01

    Primary consumers are under strong selection from resource ('bottom-up') and consumer ('top-down') controls, but the relative importance of these selective forces is unknown. We performed a meta-analysis to compare the strength of top-down and bottom-up forces on consumer fitness, considering multiple predictors that can modulate these effects: diet breadth, feeding guild, habitat/environment, type of bottom-up effects, type of top-down effects and how consumer fitness effects are measured. We focused our analyses on the most diverse group of primary consumers, herbivorous insects, and found that in general top-down forces were stronger than bottom-up forces. Notably, chewing, sucking and gall-making herbivores were more affected by top-down than bottom-up forces, top-down forces were stronger than bottom-up in both natural and controlled (cultivated) environments, and parasitoids and predators had equally strong top-down effects on insect herbivores. Future studies should broaden the scope of focal consumers, particularly in understudied terrestrial systems, guilds, taxonomic groups and top-down controls (e.g. pathogens), and test for more complex indirect community interactions. Our results demonstrate the surprising strength of forces exerted by natural enemies on herbivorous insects, and thus the necessity of using a tri-trophic approach when studying insect-plant interactions. © 2017 John Wiley & Sons Ltd/CNRS.
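A minimal sketch of the meta-analytic machinery behind such a comparison, using the standard log response ratio with inverse-variance pooling. All study values below are invented; the published analysis uses many more studies and moderator variables.

```python
import math

def lnrr(mean_trt, mean_ctl):
    # Log response ratio: herbivore fitness with the force present (treatment)
    # relative to the control; negative values mean the force reduces fitness.
    return math.log(mean_trt / mean_ctl)

def lnrr_var(sd_t, n_t, mean_t, sd_c, n_c, mean_c):
    # Delta-method sampling variance of the log response ratio.
    return sd_t ** 2 / (n_t * mean_t ** 2) + sd_c ** 2 / (n_c * mean_c ** 2)

def pooled_effect(effects, variances):
    """Fixed-effect meta-analytic mean via inverse-variance weighting."""
    weights = [1.0 / v for v in variances]
    mean = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    return mean, se

# Hypothetical studies: (fitness with force, fitness without); sd=0.1, n=10.
top_down_studies = [(0.4, 1.0), (0.5, 1.0), (0.6, 1.0)]
bottom_up_studies = [(0.8, 1.0), (0.7, 1.0), (0.9, 1.0)]

def pool(studies):
    effects = [lnrr(t, c) for t, c in studies]
    variances = [lnrr_var(0.1, 10, t, 0.1, 10, c) for t, c in studies]
    return pooled_effect(effects, variances)

td_mean, td_se = pool(top_down_studies)
bu_mean, bu_se = pool(bottom_up_studies)
```

A more negative pooled effect for the top-down studies corresponds to the abstract's conclusion that natural enemies exert the stronger force on herbivore fitness.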

  10. Recovery of a top predator mediates negative eutrophic effects on seagrass

    USGS Publications Warehouse

    Hughes, Brent B.; Eby, Ron; Van Dyke, Eric; Tinker, M. Tim; Marks, Corina I.; Johnson, Kenneth S.; Wasson, Kerstin

    2013-01-01

    A fundamental goal of the study of ecology is to determine the drivers of habitat-forming vegetation, with much emphasis given to the relative importance to vegetation of “bottom-up” forces such as the role of nutrients and “top-down” forces such as the influence of herbivores and their predators. For coastal vegetation (e.g., kelp, seagrass, marsh, and mangroves) it has been well demonstrated that alterations to bottom-up forcing can cause major disturbances leading to loss of dominant vegetation. One such process is anthropogenic nutrient loading, which can lead to major changes in the abundance and species composition of primary producers, ultimately affecting important ecosystem services. In contrast, much less is known about the relative importance of apex predators on coastal vegetated ecosystems because most top predator populations have been depleted or lost completely. Here we provide evidence that an unusual four-level trophic cascade applies in one such system, whereby a top predator mitigates the bottom-up influences of nutrient loading. In a study of seagrass beds in an estuarine ecosystem exposed to extreme nutrient loading, we use a combination of a 50-y time series analysis, spatial comparisons, and mesocosm and field experiments to demonstrate that sea otters (Enhydra lutris) promote the growth and expansion of eelgrass (Zostera marina) through a trophic cascade, counteracting the negative effects of agriculturally induced nutrient loading. Our results add to a small but growing body of literature illustrating that significant interactions between bottom-up and top-down forces occur, in this case with consequences for the conservation of valued ecosystem services provided by seagrass.

  11. Recovery of a top predator mediates negative eutrophic effects on seagrass

    PubMed Central

    Hughes, Brent B.; Eby, Ron; Van Dyke, Eric; Tinker, M. Tim; Marks, Corina I.; Johnson, Kenneth S.; Wasson, Kerstin

    2013-01-01

    A fundamental goal of the study of ecology is to determine the drivers of habitat-forming vegetation, with much emphasis given to the relative importance to vegetation of “bottom-up” forces such as the role of nutrients and “top-down” forces such as the influence of herbivores and their predators. For coastal vegetation (e.g., kelp, seagrass, marsh, and mangroves) it has been well demonstrated that alterations to bottom-up forcing can cause major disturbances leading to loss of dominant vegetation. One such process is anthropogenic nutrient loading, which can lead to major changes in the abundance and species composition of primary producers, ultimately affecting important ecosystem services. In contrast, much less is known about the relative importance of apex predators on coastal vegetated ecosystems because most top predator populations have been depleted or lost completely. Here we provide evidence that an unusual four-level trophic cascade applies in one such system, whereby a top predator mitigates the bottom-up influences of nutrient loading. In a study of seagrass beds in an estuarine ecosystem exposed to extreme nutrient loading, we use a combination of a 50-y time series analysis, spatial comparisons, and mesocosm and field experiments to demonstrate that sea otters (Enhydra lutris) promote the growth and expansion of eelgrass (Zostera marina) through a trophic cascade, counteracting the negative effects of agriculturally induced nutrient loading. Our results add to a small but growing body of literature illustrating that significant interactions between bottom-up and top-down forces occur, in this case with consequences for the conservation of valued ecosystem services provided by seagrass. PMID:23983266

  12. The influence of the waterjet propulsion system on the ships' energy consumption and emissions inventories.

    PubMed

    Durán-Grados, Vanesa; Mejías, Javier; Musina, Liliya; Moreno-Gutiérrez, Juan

    2018-08-01

In this study we consider the problems associated with calculating ships' energy and emission inventories. Various related uncertainties have been described in many similar studies published in the last decade covering Europe, the USA and Canada. However, none of them has taken into account the performance of ships' propulsion systems. On the one hand, when a ship uses its propellers, there is no unanimous agreement on the equations used to calculate the main engines' load factor and, on the other, the performance of waterjet propulsion systems (for which this variable depends on the speed of the ship) has not been taken into account in any previous studies. This paper proposes that the efficiency of the propulsion system should be included as a new parameter in the equation that defines the actual power delivered by a ship's main engines, as applied to calculate energy consumption and emissions in maritime transport. To highlight the influence of the propulsion system on calculated energy consumption and emissions, the bottom-up method has been applied using data from eight fast ferries operating across the Strait of Gibraltar over the course of one year. This study shows that the uncertainty about the efficiency of the propulsion system should be added as one more uncertainty in the energy and emission inventories for maritime transport as currently prepared. After comparing four methods for this calculation, the authors propose a new method for eight cases. For the calculation of the main engines' fuel oil consumption, differences of up to 22% between methods were obtained at low loads. Copyright © 2018 Elsevier B.V. All rights reserved.
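The proposal, folding propulsion-system efficiency into the delivered-power term of the bottom-up emission equation, can be sketched with invented numbers. The efficiencies, power, and emission factor below are illustrative assumptions, not values measured in the study.

```python
def brake_power(effective_power_kw, propulsive_efficiency):
    """Brake power the main engines must deliver for a given effective
    (towing) power once propulsion-system efficiency is included."""
    return effective_power_kw / propulsive_efficiency

def bottom_up_emissions_t(power_kw, load_factor, hours, ef_g_per_kwh):
    """Activity-based (bottom-up) estimate E = P x LF x t x EF, in tonnes."""
    return power_kw * load_factor * hours * ef_g_per_kwh / 1e6

# Hypothetical fast ferry: waterjet efficiency varies with ship speed, so the
# same effective power implies different engine loads and emissions.
p_effective = 10_000.0                         # kW to overcome hull resistance
p_off_design = brake_power(p_effective, 0.55)  # waterjet efficiency, low speed
p_design = brake_power(p_effective, 0.68)      # efficiency near design speed
nox_off_design = bottom_up_emissions_t(p_off_design, 1.0, 1.0, 12.0)
nox_design = bottom_up_emissions_t(p_design, 1.0, 1.0, 12.0)
```

Ignoring the speed dependence of the waterjet's efficiency collapses the two cases onto one engine load, which is precisely the uncertainty the study argues should be reported in current inventories.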

  13. Improved uncertainty quantification in nondestructive assay for nonproliferation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burr, Tom; Croft, Stephen; Jarman, Ken

    2016-12-01

    This paper illustrates methods to improve uncertainty quantification (UQ) for non-destructive assay (NDA) measurements used in nuclear nonproliferation. First, it is shown that current bottom-up UQ applied to calibration data is not always adequate, for three main reasons: (1) Because there are errors in both the predictors and the response, calibration involves a ratio of random quantities, and calibration data sets in NDA usually consist of only a modest number of samples (3–10); therefore, asymptotic approximations involving quantities needed for UQ such as means and variances are often not sufficiently accurate; (2) Common practice overlooks that calibration implies a partitioning of total error into random and systematic error, and (3) In many NDA applications, test items exhibit non-negligible departures in physical properties from calibration items, so model-based adjustments are used, but item-specific bias remains in some data. Therefore, improved bottom-up UQ using calibration data should predict the typical magnitude of item-specific bias, and the suggestion is to do so by including sources of item-specific bias in synthetic calibration data that is generated using a combination of modeling and real calibration data. Second, for measurements of the same nuclear material item by both the facility operator and international inspectors, current empirical (top-down) UQ is described for estimating operator and inspector systematic and random error variance components. A Bayesian alternative is introduced that easily accommodates constraints on variance components, and is more robust than current top-down methods to the underlying measurement error distributions.
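    The "top-down" operator/inspector variance decomposition mentioned at the end of the abstract can be sketched with the classical Grubbs-type estimator, a simplified stand-in for the paper's methods; the data below are simulated, not real assay data.

```python
import numpy as np

def grubbs_variance_components(o, i):
    """Top-down (empirical) UQ: Grubbs-type estimator of the random-error
    variances of two methods measuring the same items.
    Model: o_k = mu_k + e_ok, i_k = mu_k + e_ik, with independent errors,
    so cov(o, i) estimates the item-to-item variance var(mu)."""
    o, i = np.asarray(o, float), np.asarray(i, float)
    c = np.cov(o, i)           # 2x2 sample covariance matrix
    item_var = c[0, 1]         # cov(o, i) estimates var(mu)
    return c[0, 0] - item_var, c[1, 1] - item_var   # (sigma2_o, sigma2_i)

rng = np.random.default_rng(0)
n = 200_000
mu = rng.normal(100.0, 5.0, n)                # true item values
operator = mu + rng.normal(0.0, 1.0, n)       # operator random error, sd 1.0
inspector = mu + rng.normal(0.0, 2.0, n)      # inspector random error, sd 2.0
s2_o, s2_i = grubbs_variance_components(operator, inspector)
print(round(s2_o, 2), round(s2_i, 2))         # near the true 1.0 and 4.0
```

    The Bayesian alternative in the paper additionally constrains the variance components to be non-negative, which this simple moment estimator cannot guarantee for small samples.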

  14. FOREWORD: Focus on Novel Nanoelectromechanical 3D Structures: Fabrication and Properties

    NASA Astrophysics Data System (ADS)

    Yamada, Shooji; Yamaguchi, Hiroshi; Ishihara, Sunao

    2009-06-01

    Microelectromechanical systems (MEMS) are widely used small electromechanical systems made of micrometre-sized components. Presently, we are witnessing a transition from MEMS to nanoelectromechanical systems (NEMS), which comprise devices integrating electrical and mechanical functionality on the nanoscale and offer new exciting applications. Similarly to MEMS, NEMS typically include a central transistor-like nanoelectronic unit for data processing, as well as mechanical actuators, pumps, and motors; and they may combine with physical, biological and chemical sensors. In the transition from MEMS to NEMS, component sizes need to be reduced. Therefore, many fabrication methods previously developed for MEMS are unsuitable for the production of high-precision NEMS components. The key challenge in NEMS is therefore the development of new methods for routine and reproducible nanofabrication. Two complementary types of method for NEMS fabrication are available: 'top-down' and 'bottom-up'. The top-down approach uses traditional lithography technologies, whereas bottom-up techniques include molecular self-organization, self-assembly and nanodeposition. The NT2008 conference, held at Ishikawa High-Tech Conference Center, Ishikawa, Japan, between 23-25 October 2008, focused on novel NEMS fabricated from new materials and on process technologies. The topics included compound semiconductors, small mechanical structures, nanostructures for micro-fluid and bio-sensors, bio-hybrid micro-machines, as well as their design and simulation. This focus issue compiles seven articles selected from 13 submitted manuscripts. The articles by Prinz et al and Kehrbusch et al introduce the frontiers of the top-down production of various operational NEMS devices, and Kometani et al present an example of the bottom-up approach, namely ion-beam induced deposition of MEMS and NEMS. The remaining articles report novel technologies for biological sensors. 
Taira et al have used manganese nanoparticles to improve the chemical analysis of biological samples by laser desorption/ionization mass spectrometry. Matsumoto et al have prepared sugar microarrays via click chemistry and have applied this to the detection and characterization of proteins. Yoshimura et al have expanded the single-nucleotide polymorphism typing method to differentiate genes from various food crops, such as indica and japonica rice. Finally, Takashi et al have designed a nanoparticle-based strip sensor, which can be used for rapid evaluation of the psychological condition of animals and humans. We hope that this focus issue will help readers to understand, from a materials science viewpoint, different aspects of frontier research into NEMS.

  15. A bottom-up route to enhance thermoelectric figures of merit in graphene nanoribbons

    PubMed Central

    Sevinçli, Hâldun; Sevik, Cem; Çağın, Tahir; Cuniberti, Gianaurelio

    2013-01-01

    We propose a hybrid nano-structuring scheme for tailoring the thermal and thermoelectric transport properties of graphene nanoribbons. Geometrical structuring and isotope cluster engineering are the elements that constitute the proposed scheme. Using first-principles based force constants and Hamiltonians, we show that the thermal conductance of graphene nanoribbons can be reduced by 98.8% at room temperature and the thermoelectric figure of merit, ZT, can be as high as 3.25 at T = 800 K. The proposed scheme relies on a recently developed bottom-up fabrication method, which has been proven feasible for synthesizing graphene nanoribbons with atomic precision. PMID:23390578
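    For context, the figure of merit quoted in the abstract is ZT = S²σT/κ, so (electronic heat conduction aside) ZT scales inversely with the lattice thermal conductivity. A minimal sketch with invented transport values, not the paper's:

```python
def figure_of_merit(seebeck, sigma, kappa, temp):
    """Thermoelectric figure of merit ZT = S^2 * sigma * T / kappa
    (S in V/K, sigma in S/m, kappa in W/(m.K), T in K)."""
    return seebeck**2 * sigma * temp / kappa

s, cond, t = 200e-6, 1e5, 800.0       # invented S, sigma, T
kappa_pristine = 10.0                 # invented lattice thermal conductivity
kappa_structured = kappa_pristine * (1 - 0.988)   # 98.8% phonon suppression
print(figure_of_merit(s, cond, kappa_pristine, t),
      figure_of_merit(s, cond, kappa_structured, t))
```

    In practice the electronic contribution to κ survives the phonon engineering, which is one reason reported ZT values stay finite rather than growing in direct proportion to the conductance reduction.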

  16. In vitro biodegradation testing of Mg-alloy EZK400 and manufacturing of implant prototypes using PM (powder metallurgy) methods.

    PubMed

    Wolff, M; Luczak, M; Schaper, J G; Wiese, B; Dahms, M; Ebel, T; Willumeit-Römer, R; Klassen, T

    2018-09-01

    This study focuses on Metal Injection Moulding (MIM) of Mg alloys for biomedical implant applications, and in particular on the sintering step necessary for the consolidation of the finished part. The chosen high-strength EZK400 Mg-alloy powder was sintered on bottom plates of different materials to evaluate the possibility of iron-impurity pick-up during sintering. It can be shown that iron pick-up from the steel bottom plate into the specimen took place, despite the use of a separating boron nitride (BN) barrier layer and the fact that the Mg-Fe phase diagram predicts no significant mutual solubility. As a result of this study, a new bottom plate material was found, a carbon plate, which harms neither the sintering nor the biodegradation performance of the as-sintered material.

  17. A simplified fourwall interference assessment procedure for airfoil data obtained in the Langley 0.3-meter transonic cryogenic tunnel

    NASA Technical Reports Server (NTRS)

    Murthy, A. V.

    1987-01-01

    A simplified fourwall interference assessment method has been described, and a computer program developed to facilitate correction of the airfoil data obtained in the Langley 0.3-m Transonic Cryogenic Tunnel (TCT). The procedure adopted is to first apply a blockage correction due to sidewall boundary-layer effects by various methods. The sidewall boundary-layer corrected data are then used to calculate the top and bottom wall interference effects by the method of Capallier, Chevallier and Bouinol, using the measured wall pressure distribution and the model force coefficients. The interference corrections obtained by the present method have been compared with other methods and found to give good agreement for the experimental data obtained in the TCT with slotted top and bottom walls.

  18. The role of technology in reducing health care costs. Final project report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sill, A.E.; Warren, S.; Dillinger, J.D.

    1997-08-01

    Sandia National Laboratories applied a systems approach to identifying innovative biomedical technologies with the potential to reduce U.S. health care delivery costs while maintaining care quality. This study was conducted by implementing both top-down and bottom-up strategies. The top-down approach used prosperity gaming methodology to identify future health care delivery needs. This effort provided roadmaps for the development and integration of technology to meet perceived care delivery requirements. The bottom-up approach identified and ranked interventional therapies employed in existing care delivery systems for a host of health-related conditions. Economic analysis formed the basis for development of care pathway interaction models for two of the most pervasive, chronic disease/disability conditions: coronary artery disease (CAD) and benign prostatic hypertrophy (BPH). Societal cost-benefit relationships based on these analyses were used to evaluate the effect of emerging technology in these treatment areas. 17 figs., 48 tabs.

  19. Status analysis of keyhole bottom in laser-MAG hybrid welding process.

    PubMed

    Wang, Lin; Gao, Xiangdong; Chen, Ziqin

    2018-01-08

    The keyhole status is a determining factor of weld quality in the laser-metal active gas arc (MAG) hybrid welding process. For a better evaluation of the hybrid welding process, three different penetration welding experiments were conducted in this work: partial penetration, normal penetration (full penetration), and excessive penetration. The instantaneous visual phenomena, including the metallic vapor, spatter, and the keyhole on the bottom surface, were captured by a dual high-speed camera system and used to evaluate the keyhole status. The Fourier transform was applied to the bottom weld pool image to remove the image noise around the keyhole, and the bottom weld pool image was then reconstructed through the inverse Fourier transform. Lastly, the keyhole bottom was extracted from the de-noised bottom weld pool image. By analyzing the visual features of the laser-MAG hybrid welding process, the mechanisms of the closed and opened keyhole bottom were revealed. The results show that the stable opened or closed status of the keyhole bottom is directly affected by the MAG droplet transition in the normal penetration welding process, whereas an unstable opened or closed status of the keyhole bottom appears in excessive penetration welding and partial penetration welding. The analysis method proposed in this paper could be used to monitor keyhole stability in the laser-MAG hybrid welding process.
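    The de-noising step described here (Fourier transform, suppression of high-frequency content, inverse transform) can be sketched generically as an ideal circular low-pass filter; this is a simplified stand-in for whatever filter the authors used, and the "keyhole" image below is synthetic.

```python
import numpy as np

def fft_lowpass(img, keep_frac=0.25):
    """Zero all spatial frequencies outside a central disc, then
    inverse-transform: an ideal circular low-pass de-noising filter."""
    spec = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.ogrid[:h, :w]
    radius = keep_frac * min(h, w) / 2
    mask = (yy - h / 2) ** 2 + (xx - w / 2) ** 2 <= radius**2
    return np.real(np.fft.ifft2(np.fft.ifftshift(spec * mask)))

rng = np.random.default_rng(3)
clean = np.zeros((64, 64))
clean[24:40, 24:40] = 1.0                      # synthetic bright "keyhole"
noisy = clean + 0.3 * rng.standard_normal((64, 64))
denoised = fft_lowpass(noisy)
print(np.abs(noisy - clean).mean(), np.abs(denoised - clean).mean())
```

    The mean absolute error drops substantially after filtering, at the cost of some blurring of the keyhole edges, the usual trade-off before a segmentation step.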

  20. To elute or not to elute in immunocapture bottom-up LC-MS.

    PubMed

    Levernæs, Maren Christin Stillesby; Broughton, Marianne Nordlund; Reubsaet, Léon; Halvorsen, Trine Grønhaug

    2017-06-15

    Immunocapture-based bottom-up LC-MS is a promising technique for the quantification of low-abundance proteins. Magnetic immunocapture beads provide efficient enrichment from complex samples through the highly specific interaction between the target protein and its antibody. In this article, we have performed the first thorough comparison of digestion of proteins while bound to antibody-coated beads versus digestion after elution from the beads. Two previously validated immunocapture-based MS methods for the quantification of pro-gastrin releasing peptide (ProGRP) and human chorionic gonadotropin (hCG) were used as model systems. The tryptic peptide generation was shown to be protein dependent and influenced by protein folding and accessibility towards trypsin both on-beads and in the eluate. The elution of proteins bound to the beads was also shown to be incomplete. In addition, the on-beads digestion suffered from non-specific binding of the trypsin-generated peptides. A combination of on-beads digestion and elution may be applied to improve both the quantitative yield (peak area of the signature peptides) and qualitative yield (number of missed cleavages, total number of identified peptides, coverage, signal intensity and number of zero-missed-cleavage peptides) of the target proteins. The quantitative yield of signature peptides was shown to be reproducible in all procedures tested. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Fast and robust reconstruction for fluorescence molecular tomography via a sparsity adaptive subspace pursuit method.

    PubMed

    Ye, Jinzuo; Chi, Chongwei; Xue, Zhenwen; Wu, Ping; An, Yu; Xu, Han; Zhang, Shuang; Tian, Jie

    2014-02-01

    Fluorescence molecular tomography (FMT), as a promising imaging modality, can three-dimensionally locate the specific tumor position in small animals. However, effective and robust reconstruction of the fluorescent probe distribution in animals remains challenging. In this paper, we present a novel method based on sparsity adaptive subspace pursuit (SASP) for FMT reconstruction. Several innovative strategies, including subspace projection, a bottom-up sparsity adaptive approach, and a backtracking technique, are associated with the SASP method, which guarantees the accuracy, efficiency, and robustness of FMT reconstruction. Three numerical experiments based on a mouse-mimicking heterogeneous phantom have been performed to validate the feasibility of the SASP method. The results show that the proposed SASP method can achieve satisfactory source localization with a bias of less than 1 mm; the method is much faster than mainstream reconstruction methods; and the approach is robust even under quite ill-posed conditions. Furthermore, we have applied this method to an in vivo mouse model, and the results demonstrate the feasibility of practical FMT application with the SASP method.
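    SASP itself is not specified in the abstract, but its backtracking flavor follows the subspace pursuit family of greedy sparse solvers. Below is a minimal fixed-sparsity subspace pursuit (not the authors' adaptive variant) on a synthetic sparse-recovery problem, purely to illustrate the expand/fit/prune loop:

```python
import numpy as np

def subspace_pursuit(A, y, k, n_iter=20):
    """Fixed-sparsity subspace pursuit with backtracking: expand the
    candidate support, least-squares fit, then prune back to k atoms."""
    n = A.shape[1]
    support = np.argsort(np.abs(A.T @ y))[-k:]          # initial support guess
    x = np.zeros(n)
    for _ in range(n_iter):
        coef = np.linalg.lstsq(A[:, support], y, rcond=None)[0]
        r = y - A[:, support] @ coef                    # current residual
        cand = np.union1d(support, np.argsort(np.abs(A.T @ r))[-k:])
        coef = np.linalg.lstsq(A[:, cand], y, rcond=None)[0]
        support = cand[np.argsort(np.abs(coef))[-k:]]   # backtracking prune
        x = np.zeros(n)
        x[support] = np.linalg.lstsq(A[:, support], y, rcond=None)[0]
        if np.linalg.norm(y - A @ x) < 1e-12 * np.linalg.norm(y):
            break
    return x

rng = np.random.default_rng(1)
m, n, k = 30, 60, 3
A = rng.standard_normal((m, n)) / np.sqrt(m)   # random sensing matrix
x_true = np.zeros(n)
x_true[[5, 17, 42]] = [1.5, -2.0, 0.8]         # 3-sparse ground truth
x_hat = subspace_pursuit(A, A @ x_true, k)
print(np.linalg.norm(x_hat - x_true))
```

    The "sparsity adaptive" part of SASP additionally estimates k from the data rather than fixing it in advance.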

  2. Alternate Perspectives on Concept Internalization: Learning Top Down Vs. Learning Bottom Up.

    ERIC Educational Resources Information Center

    Pines, A. Leon

    This paper outlines two alternate ways in which concepts are acquired, known as "top down" and "bottom up". "Bottom up" refers to learning the members of a category and then extracting their similarities or differences, the rule or criterial attributes used to make the categorization. In the "top down"…

  3. Exploration into the Effects of the Schema-Based Instruction: A Bottom-Up Approach

    ERIC Educational Resources Information Center

    Fujii, Kazuma

    2016-01-01

    The purpose of this paper is to explore the effective use of the core schema-based instruction (SBI) in a classroom setting. The core schema is a schematic representation of the common underlying meaning of a given lexical item, and was first proposed on the basis of the cognitive linguistic perspectives by the Japanese applied linguists Tanaka,…

  4. A Stakeholder-Based Approach to Leadership Development Training: The Case of Medical Education in Canada

    ERIC Educational Resources Information Center

    Bharwani, Aleem; Kline, Theresa; Patterson, Margaret

    2017-01-01

    This paper reports the use of a stakeholder-based, bottom-up approach to determining leadership training needs and designing leadership training programs which contrasts with the top-down policy that is often applied. The context is a Canadian medical school. Leadership training in medicine is in its infancy. Discussed and outlined in this study…

  5. Combining "Bottom-up" and "Top-down" Approaches to Assess the Impact of Food and Gastric pH on Pictilisib (GDC-0941) Pharmacokinetics.

    PubMed

    Lu, Tong; Fraczkiewicz, Grazyna; Salphati, Laurent; Budha, Nageshwar; Dalziel, Gena; Smelick, Gillian S; Morrissey, Kari M; Davis, John D; Jin, Jin Y; Ware, Joseph A

    2017-11-01

    Pictilisib, a weakly basic compound, is an orally administered, potent, and selective pan-inhibitor of phosphatidylinositol 3-kinases for oncology indications. To investigate the significance of high-fat food and gastric pH on pictilisib pharmacokinetics (PK) and enable label recommendations, a dedicated clinical study was conducted in healthy volunteers, whereby both top-down (population PK, PopPK) and bottom-up (physiologically based PK, PBPK) approaches were applied to enhance confidence of recommendation and facilitate the clinical development through scenario simulations. The PopPK model identified food (for the absorption rate constant, Ka) and proton pump inhibitors (PPI; for the relative bioavailability, Frel, and Ka) as significant covariates. Food and PPI also impacted the variability of Frel. The PBPK model accounted for the supersaturation tendency of pictilisib, and gastric emptying physiology successfully predicted the food and PPI effect on pictilisib absorption. Our research highlights the importance of applying both quantitative approaches to address critical drug development questions. © 2017 The Authors CPT: Pharmacometrics & Systems Pharmacology published by Wiley Periodicals, Inc. on behalf of American Society for Clinical Pharmacology and Therapeutics.
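    The kind of covariate effect the PopPK model describes, food slowing the absorption rate constant Ka, can be sketched with a textbook one-compartment oral-absorption (Bateman) model. All parameter values below are hypothetical, not pictilisib's:

```python
import numpy as np

def conc_oral(t, dose, ka, ke, v, f_rel=1.0):
    """One-compartment model with first-order absorption (Bateman
    equation); assumes ka != ke (no flip-flop handling)."""
    return (f_rel * dose * ka) / (v * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

t = np.linspace(0.01, 24.0, 2400)                # hours post-dose
dose, ke, v = 100.0, 0.1, 50.0                   # hypothetical mg, 1/h, L
fasted = conc_oral(t, dose, ka=1.2, ke=ke, v=v)  # fast absorption
fed = conc_oral(t, dose, ka=0.4, ke=ke, v=v)     # food slows ka (covariate effect)
print(f"tmax fasted {t[fasted.argmax()]:.2f} h vs fed {t[fed.argmax()]:.2f} h; "
      f"Cmax fasted {fasted.max():.2f} vs fed {fed.max():.2f} mg/L")
```

    Slowing Ka delays tmax and lowers Cmax while (for equal Frel) leaving total exposure unchanged, which is the qualitative pattern a food effect on absorption rate produces.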

  6. Bridging the Gap between the Nanometer-Scale Bottom-Up and Micrometer-Scale Top-Down Approaches for Site-Defined InP/InAs Nanowires.

    PubMed

    Zhang, Guoqiang; Rainville, Christophe; Salmon, Adrian; Takiguchi, Masato; Tateno, Kouta; Gotoh, Hideki

    2015-11-24

    This work presents a method that bridges the gap between the nanometer-scale bottom-up and micrometer-scale top-down approaches for site-defined nanostructures, which has long been a significant challenge for applications that require low-cost and high-throughput manufacturing processes. We realized the bridging by controlling the seed indium nanoparticle position through a self-assembly process. Site-defined InP nanowires were then grown from the indium-nanoparticle array in the vapor-liquid-solid mode through a "seed and grow" process. The nanometer-scale indium particles do not always occupy the same locations within the micrometer-scale open window of an InP exposed substrate due to the scale difference. We developed a technique for aligning the nanometer-scale indium particles on the same side of the micrometer-scale window by structuring the surface of a misoriented InP (111)B substrate. Finally, we demonstrated that the developed method can be used to grow a uniform InP/InAs axial-heterostructure nanowire array. The ability to form a heterostructure nanowire array with this method makes it possible to tune the emission wavelength over a wide range by employing the quantum confinement effect and thus expand the application of this technology to optoelectronic devices. Successfully pairing a controllable bottom-up growth technique with a top-down substrate preparation technique greatly improves the potential for the mass-production and widespread adoption of this technology.

  7. Shallow water bathymetry correction using sea bottom classification with multispectral satellite imagery

    NASA Astrophysics Data System (ADS)

    Kazama, Yoriko; Yamamoto, Tomonori

    2017-10-01

    Bathymetry in shallow water, especially water shallower than 15 m, is important for environmental monitoring and national defense. Because the depth of shallow water is changed by sediment deposition and ocean waves, periodic monitoring of near-shore areas is needed, and satellite images are well suited for wide-area, repeated monitoring at sea. Sea-bottom terrain models derived from remote sensing data have been developed; these methods are based on the radiative transfer model of solar irradiance, which is affected by the atmosphere, the water column, and the sea bottom. We applied this general depth-extraction method to imagery from the WorldView-2 satellite, which has very fine spatial resolution (50 cm/pixel) and eight bands at visible to near-infrared wavelengths. High-spatial-resolution satellite images make it possible to derive detailed terrain models of coral reef and rock areas, which offer important information for amphibious landing. In addition, the WorldView-2 sensor has a band near the ultraviolet wavelengths that is transmitted through water. On the other hand, previous studies showed that the estimation error of satellite-derived depth is related to sea-bottom materials such as sand, coral reef, sea algae, and rocks. Therefore, in this study, we focused on sea-bottom materials and tried to improve the depth-estimation accuracy. First, we classified the sea-bottom materials with the SVM method, using depth data acquired by multi-beam sonar as supervised data. Then correction values in the depth-estimation equation were calculated by applying the classification results. As a result, the classification accuracy of sea-bottom materials was 93%, and the depth-estimation error using the correction based on the classification result was within 1.2 m.
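    The paper's idea, a per-bottom-class correction of a radiative-transfer-based depth model, can be sketched with a Lyzenga-type linearization. In the synthetic example below (invented reflectances and attenuation; the real study used SVM classification and multi-beam sonar truth), fitting one regression per bottom class removes the bias that a single global fit leaves behind:

```python
import numpy as np

rng = np.random.default_rng(2)
k_att, L_inf = 0.08, 0.02                 # invented attenuation, deep-water signal
refl = {"sand": 0.30, "algae": 0.08}      # invented bottom reflectances
z = rng.uniform(1.0, 15.0, 400)           # true depths, m
cls = rng.choice(["sand", "algae"], 400)  # bottom class of each pixel
L = L_inf + np.array([refl[c] for c in cls]) * np.exp(-2 * k_att * z)

x = np.log(L - L_inf)                     # Lyzenga-type linearized predictor

def fit_depth(xs, zs):
    """Least-squares line z = a*x + b; returns the prediction function."""
    a, b = np.polyfit(xs, zs, 1)
    return lambda q: a * q + b

global_err = np.abs(fit_depth(x, z)(x) - z).max()        # one fit for all bottoms
per_class_err = max(
    np.abs(fit_depth(x[cls == c], z[cls == c])(x[cls == c]) - z[cls == c]).max()
    for c in ("sand", "algae"))                          # one fit per bottom class
print(global_err, per_class_err)
```

    In this noiseless simulation the per-class fits are exact while the single global fit is off by several metres, because the two bottom types shift the intercept of the linearized model.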

  8. Spatial bottom-up controls on fire likelihood vary across western North America

    Treesearch

    Sean A. Parks; Marc-Andre Parisien; Carol Miller

    2012-01-01

    The unique nature of landscapes has challenged our ability to make generalizations about the effects of bottom-up controls on fire regimes. For four geographically distinct fire-prone landscapes in western North America, we used a consistent simulation approach to quantify the influence of three key bottom-up factors, ignitions, fuels, and topography, on spatial...

  9. Effects of the bottom boundary condition in numerical investigations of dense water cascading on a slope

    NASA Astrophysics Data System (ADS)

    Berntsen, Jarle; Alendal, Guttorm; Avlesen, Helge; Thiem, Øyvind

    2018-05-01

    The flow of dense water along continental slopes is considered. There is a large literature on the topic based on observations and laboratory experiments. In addition, there are many analytical and numerical studies of dense water flows. In particular, there is a sequence of numerical investigations using the dynamics of overflow mixing and entrainment (DOME) setup. In these papers, the sensitivity of the solutions to numerical parameters such as grid size and numerical viscosity coefficients and to the choices of methods and models is investigated. In earlier DOME studies, three different bottom boundary conditions and a range of vertical grid sizes are applied. In other parts of the literature on numerical studies of oceanic gravity currents, there are statements that appear to contradict choices made on bottom boundary conditions in some of the DOME papers. In the present study, we therefore address the effects of the bottom boundary condition and vertical resolution in numerical investigations of dense water cascading on a slope. The main finding of the present paper is that it is feasible to capture the bottom Ekman layer dynamics adequately and cost efficiently by using a terrain-following model system using a quadratic drag law with a drag coefficient computed to give near-bottom velocity profiles in agreement with the logarithmic law of the wall. Many studies of dense water flows are performed with a quadratic bottom drag law and a constant drag coefficient. It is shown that when using this bottom boundary condition, Ekman drainage will not be adequately represented. In other studies of gravity flow, a no-slip bottom boundary condition is applied. With no-slip and a very fine resolution near the seabed, the solutions are essentially equal to the solutions obtained with a quadratic drag law and a drag coefficient computed to produce velocity profiles matching the logarithmic law of the wall. 
However, with coarser resolution near the seabed, there may be a substantial artificial blocking effect when using no-slip.
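    The drag-law choice recommended here can be made concrete: matching the quadratic law to the logarithmic law of the wall gives Cd = [κ / ln(z_b/z_0)]², with κ ≈ 0.4 the von Kármán constant, z_b the height of the lowest velocity point above the bed, and z_0 the bottom roughness length. A minimal sketch with illustrative values:

```python
import math

def drag_coefficient(z_b: float, z_0: float, kappa: float = 0.4) -> float:
    """Quadratic-law drag coefficient chosen so that the modeled velocity
    at height z_b above the bed matches the logarithmic law of the wall."""
    return (kappa / math.log(z_b / z_0)) ** 2

# Bottom grid point 1 m above the bed, roughness length 1 mm (illustrative):
print(f"C_d = {drag_coefficient(1.0, 0.001):.5f}")
# Unlike a fixed constant (e.g. 0.0025), C_d now adapts to the grid spacing:
print(drag_coefficient(0.25, 0.001) > drag_coefficient(1.0, 0.001))
```

    Tying Cd to the height of the lowest grid point is what lets a terrain-following model capture Ekman drainage at moderate cost, the paper's main finding.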

  10. Top-Down Beta Enhances Bottom-Up Gamma

    PubMed Central

    Thompson, William H.

    2017-01-01

    Several recent studies have demonstrated that the bottom-up signaling of a visual stimulus is subserved by interareal gamma-band synchronization, whereas top-down influences are mediated by alpha-beta band synchronization. These processes may implement top-down control of stimulus processing if top-down and bottom-up mediating rhythms are coupled via cross-frequency interaction. To test this possibility, we investigated Granger-causal influences among awake macaque primary visual area V1, higher visual area V4, and parietal control area 7a during attentional task performance. Top-down 7a-to-V1 beta-band influences enhanced visually driven V1-to-V4 gamma-band influences. This enhancement was spatially specific and largest when beta-band activity preceded gamma-band activity by ∼0.1 s, suggesting a causal effect of top-down processes on bottom-up processes. We propose that this cross-frequency interaction mechanistically subserves the attentional control of stimulus selection. SIGNIFICANCE STATEMENT Contemporary research indicates that the alpha-beta frequency band underlies top-down control, whereas the gamma-band mediates bottom-up stimulus processing. This arrangement inspires an attractive hypothesis, which posits that top-down beta-band influences directly modulate bottom-up gamma band influences via cross-frequency interaction. We evaluate this hypothesis by determining that beta-band top-down influences from parietal area 7a to visual area V1 are correlated with bottom-up gamma frequency influences from V1 to area V4, in a spatially specific manner, and that this correlation is maximal when top-down activity precedes bottom-up activity. These results show that for top-down processes such as spatial attention, elevated top-down beta-band influences directly enhance feedforward stimulus-induced gamma-band processing, leading to enhancement of the selected stimulus. PMID:28592697
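    The lag result, top-down beta activity best predicting bottom-up gamma roughly 0.1 s later, is conceptually a lagged-correlation analysis (the study itself used Granger causality). A synthetic sketch with two envelope signals, one a delayed copy of the other, recovers the built-in 0.1 s lead:

```python
import numpy as np

rng = np.random.default_rng(4)
fs, n, true_lag = 1000, 20_000, 100        # 1 kHz sampling; lead of 100 samples
beta_env = np.abs(np.convolve(rng.standard_normal(n), np.ones(200) / 200, "same"))
noise = 0.5 * beta_env.std() * rng.standard_normal(n)
gamma_env = np.roll(beta_env, true_lag) + noise   # gamma follows beta by 0.1 s

def lagged_corr(x, y, lags):
    """Correlation of x(t) with y(t + lag) for each non-negative lag."""
    return [np.corrcoef(x[: -l or None], y[l:])[0, 1] for l in lags]

lags = list(range(0, 301, 10))
corrs = lagged_corr(beta_env, gamma_env, lags)
best = lags[int(np.argmax(corrs))]
print(f"gamma envelope best explained by beta {best / fs:.2f} s earlier")
```

    Lagged correlation only shows temporal precedence; the paper's Granger-causal analysis additionally conditions on each signal's own history.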

  11. A variable resolution right TIN approach for gridded oceanographic data

    NASA Astrophysics Data System (ADS)

    Marks, David; Elmore, Paul; Blain, Cheryl Ann; Bourgeois, Brian; Petry, Frederick; Ferrini, Vicki

    2017-12-01

    Many oceanographic applications require multi-resolution representation of gridded data, such as bathymetric data. Although triangular irregular networks (TINs) allow for variable resolution, they do not provide a gridded structure. Right TINs (RTINs) are compatible with a gridded structure. We explored the use of two approaches for RTINs, termed top-down and bottom-up implementations. We illustrate why the latter is most appropriate for gridded data and describe how the data can be thinned with this technique. While both the top-down and bottom-up approaches accurately preserve the surface morphology of any given region, the top-down method of vertex placement can fail to match the actual vertex locations of the underlying grid in many instances, resulting in obscured topology/bathymetry. Finally, we describe the use of the bottom-up approach and data thinning in two applications. The first is to provide thinned, variable-resolution bathymetry data for tests of storm surge and inundation modeling, in particular hurricane Katrina. Secondly, we consider the use of the approach for an application to an oceanographic data grid of 3-D ocean temperature.

  12. Application of satellite observations for timely updates to global anthropogenic NOx emission inventories

    NASA Astrophysics Data System (ADS)

    Lamsal, L. N.; Martin, R. V.; Padmanabhan, A.; van Donkelaar, A.; Zhang, Q.; Sioris, C. E.; Chance, K.; Kurosu, T. P.; Newchurch, M. J.

    2011-03-01

    Anthropogenic emissions of nitrogen oxides (NOx) can change rapidly due to economic growth or control measures. Bottom-up emissions estimated using source-specific emission factors and activity statistics require years to compile and can become quickly outdated. We present a method to use satellite observations of tropospheric NO2 columns to estimate changes in NOx emissions. We use tropospheric NO2 columns retrieved from the SCIAMACHY satellite instrument for 2003-2009, the response of tropospheric NO2 columns to changes in NOx emissions determined from a global chemical transport model (GEOS-Chem), and the bottom-up anthropogenic NOx emissions for 2006 to hindcast and forecast the inventories. We evaluate our approach by comparing bottom-up and hindcast emissions for 2003. The two inventories agree within 6.0% globally and within 8.9% at the regional scale with consistent trends in western Europe, North America, and East Asia. We go on to forecast emissions for 2009. During 2006-2009, anthropogenic NOx emissions over land increase by 9.2% globally and by 18.8% from East Asia. North American emissions decrease by 5.7%.
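    The hindcast/forecast step described here amounts to scaling the 2006 bottom-up inventory by the observed relative change in NO2 columns, weighted by a model-derived sensitivity. A one-line sketch, with the functional form following the abstract's description and all numbers hypothetical:

```python
def update_emissions(e_base, col_base, col_new, beta):
    """Scale a bottom-up inventory by the relative change in the satellite
    NO2 column; beta is the model-derived (GEOS-Chem, in the paper)
    sensitivity relating relative column changes to relative emission
    changes, treated here as a given input."""
    return e_base * (1.0 + beta * (col_new - col_base) / col_base)

# Hypothetical region: 2006 inventory of 100 units, columns (molec/cm^2)
# up 20% by 2009, sensitivity beta of 0.8:
print(update_emissions(100.0, 1.0e15, 1.2e15, beta=0.8))
```

    The sensitivity beta absorbs the nonlinear NOx chemistry: a 20% column increase does not necessarily imply a 20% emission increase, which is why the model-derived response is needed.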

  13. Paradigm shift from self-assembly to commanded assembly of functional materials: recent examples in porphyrin/fullerene supramolecular systems

    NASA Astrophysics Data System (ADS)

    Li, Mao; Ishihara, Shinsuke; Ji, Qingmin; Akada, Misaho; Hill, Jonathan P.; Ariga, Katsuhiko

    2012-10-01

    Current nanotechnology based on top-down nanofabrication may encounter a variety of drawbacks in the near future so that development of alternative methods, including the so-called bottom-up approach, has attracted considerable attention. However, the bottom-up strategy, which often relies on spontaneous self-assembly, might be inefficient in the development of the requisite functional materials and systems. Therefore, assembly processes controlled by external stimuli might be a plausible strategy for the development of bottom-up nanotechnology. In this review, we demonstrate a paradigm shift from self-assembly to commanded assembly by describing several examples of assemblies of typical functional molecules, i.e. porphyrins and fullerenes. In the first section, we describe recent progress in the design and study of self-assembled and co-assembled supramolecular architectures of porphyrins and fullerenes. Then, we show examples of assembly induced by external stimuli. We emphasize the paradigm shift from self-assembly to commanded assembly by describing the recently developed electrochemical-coupling layer-by-layer (ECC-LbL) methodology.

  14. Mass production of bulk artificial nacre with excellent mechanical properties.

    PubMed

    Gao, Huai-Ling; Chen, Si-Ming; Mao, Li-Bo; Song, Zhao-Qiang; Yao, Hong-Bin; Cölfen, Helmut; Luo, Xi-Sheng; Zhang, Fu; Pan, Zhao; Meng, Yu-Feng; Ni, Yong; Yu, Shu-Hong

    2017-08-18

    Various methods have been exploited to replicate nacre features into artificial structural materials with impressive structural and mechanical similarity. However, it is still very challenging to produce nacre-mimetics in three-dimensional bulk form, especially for further scale-up. Herein, we demonstrate that large-sized, three-dimensional bulk artificial nacre with comprehensive mimicry of the hierarchical structures and the toughening mechanisms of natural nacre can be facilely fabricated via a bottom-up assembly process based on laminating pre-fabricated two-dimensional nacre-mimetic films. By optimizing the hierarchical architecture from molecular level to macroscopic level, the mechanical performance of the artificial nacre is superior to that of natural nacre and many engineering materials. This bottom-up strategy has no size restriction or fundamental barrier for further scale-up, and can be easily extended to other material systems, opening an avenue for mass production of high-performance bulk nacre-mimetic structural materials in an efficient and cost-effective way for practical applications. Artificial materials that replicate the mechanical properties of nacre represent important structural materials, but are difficult to produce in bulk. Here, the authors exploit the bottom-up assembly of 2D nacre-mimetic films to fabricate 3D bulk artificial nacre with an optimized architecture and excellent mechanical properties.

  15. Progress in Top-Down Proteomics and the Analysis of Proteoforms

    PubMed Central

    Toby, Timothy K.; Fornelli, Luca; Kelleher, Neil L.

    2017-01-01

    From a molecular perspective, enactors of function in biology are intact proteins that can be variably modified at the genetic, transcriptional, or post-translational level. Over the past 30 years, mass spectrometry (MS) has become a powerful method for the analysis of proteomes. Prevailing bottom-up proteomics operates at the level of the peptide, leading to issues with protein inference, connectivity, and incomplete sequence/modification information. Top-down proteomics (TDP), alternatively, applies MS at the proteoform level to analyze intact proteins with diverse sources of intramolecular complexity preserved during analysis. Fortunately, advances in prefractionation workflows, MS instrumentation, and dissociation methods for whole-protein ions have helped TDP emerge as an accessible and potentially disruptive modality with increasingly translational value. In this review, we discuss technical and conceptual advances in TDP, along with the growing power of proteoform-resolved measurements in clinical and translational research. PMID:27306313

  16. On the representability problem and the physical meaning of coarse-grained models

    NASA Astrophysics Data System (ADS)

    Wagner, Jacob W.; Dama, James F.; Durumeric, Aleksander E. P.; Voth, Gregory A.

    2016-07-01

    In coarse-grained (CG) models where certain fine-grained (FG, i.e., atomistic resolution) observables are not directly represented, one can nonetheless identify indirect CG observables that capture the FG observable's dependence on the CG coordinates. Often in these cases it appears that a CG observable can be defined by analogy to an all-atom or FG observable, but the similarity is misleading and significantly undermines the interpretation of both bottom-up and top-down CG models. Such problems emerge especially clearly in the framework of systematic bottom-up CG modeling, where a direct and transparent correspondence between FG and CG variables establishes precise conditions for consistency between CG observables and the underlying FG models. Here we present and investigate these representability challenges and illustrate them, within the bottom-up conceptual framework, for several simple analytically tractable polymer models. The examples focus in particular on the observables of configurational internal energy, entropy, and pressure, which have been at the root of controversy in the CG literature, and also discuss observables that would seem to be entirely missing in the CG representation but can nonetheless be correlated with CG behavior. Though we investigate these problems in the framework of systematic coarse-graining, the lessons also apply to top-down CG modeling, with crucial implications for simulation at constant pressure and surface tension and for the interpretation of structural and thermodynamic correlations in comparisons to experiment.
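    The bottom-up consistency condition invoked above can be made concrete with a toy sketch: the CG value of an FG observable is its conditional average over all FG configurations that map to the same CG configuration. The one-dimensional data and the binning map below are purely hypothetical.

```python
from collections import defaultdict

def cg_observable(fg_samples, mapping):
    """fg_samples: iterable of (fg_config, A_value) pairs.
    mapping: function from an FG configuration to its CG configuration.
    Returns {CG config: conditional average of the FG observable A}."""
    sums, counts = defaultdict(float), defaultdict(int)
    for x, a in fg_samples:
        r = mapping(x)
        sums[r] += a
        counts[r] += 1
    return {r: sums[r] / counts[r] for r in sums}

# Two FG "configurations" fall into each CG bin; the CG observable is
# the average of the FG observable over each bin.
samples = [(0.1, 2.0), (0.4, 4.0), (1.2, 6.0), (1.8, 8.0)]
cg = cg_observable(samples, mapping=lambda x: int(x))
print(cg)  # {0: 3.0, 1: 7.0}
```

A CG observable defined any other way (e.g. by naive analogy to the FG formula) need not equal this conditional average, which is the representability problem in miniature.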

  17. Study of droplet flow in a T-shape microchannel with bottom wall fluctuation

    NASA Astrophysics Data System (ADS)

    Pang, Yan; Wang, Xiang; Liu, Zhaomiao

    2018-03-01

    Droplet generation in a T-shape microchannel, with a main channel width of 50 μm, side channel width of 25 μm, and height of 50 μm, is simulated to study the effects of forced fluctuation of the bottom wall. The periodic fluctuations are applied to the part of the main channel near the junction of the T-shape microchannel. The effects of the bottom wall's shape, fluctuation period, and amplitude on droplet generation are examined in this study. In the simulations, the average droplet size is only slightly affected by the fluctuations but significantly affected by the fixed shape of the deformed bottom wall, while the range of droplet sizes is expanded by the fluctuations under most conditions. Droplet sizes vary in a periodic pattern of small amplitude over time when the fluctuation is forced on the bottom wall near the T-junction, while the droplet emergence frequency is unchanged by the fluctuation. The droplet velocity is altered by the bottom wall motion, especially for shorter periods and larger amplitudes. When the fluctuation period is similar to the droplet emergence period, the droplet size is as stable as in the non-fluctuation case after a development stage at the beginning of the flow, while the droplet velocity is varied by the moving wall by up to 80% of the average velocity under the conditions of this investigation.

  18. Direction with Discretion: Reading Recovery as an Example of Balancing Top-Down Policy and Bottom-Up Decision-Making.

    ERIC Educational Resources Information Center

    Scharer, Patricia L.; Zajano, Nancy C.

    Educational policy analysts have recognized the need for an educational policy that combines the merits of "top-down" mandates with "bottom-up" teacher discretion. This paper describes the Reading Recovery program as an example of an educational program that balances top-down direction and bottom-up discretion by: (1) providing an overall…

  19. A bottom-up method to develop pollution abatement cost curves for coal-fired utility boilers

    EPA Science Inventory

    This paper illustrates a new method to create supply curves for pollution abatement using boiler-level data that explicitly accounts for technology costs and performance. The Coal Utility Environmental Cost (CUECost) model is used to estimate retrofit costs for five different NO...

  20. Disseminating effective clinician communication techniques: Engaging clinicians to want to learn how to engage patients.

    PubMed

    Pollak, Kathryn I; Back, Anthony L; Tulsky, James A

    2017-10-01

    Patient-clinician communication that promotes patient engagement enhances health care quality. Yet, disseminating effective communication interventions to practicing clinicians remains challenging. Current methods do not have large and sustainable effects. In this paper, we argue that both top-down approaches (mandated by institutions) should be coupled with bottom-up approaches that address clinician motivation, confidence, and barriers. We need to engage clinicians in the same way we ask them to engage patients - strategically and with empathy. We discuss potentially innovative strategies to integrate top-down and bottom-up approaches in ways that fit clinicians' busy schedules and can inform policy. Copyright © 2017. Published by Elsevier B.V.

  1. Airborne Quantification of Methane Emissions in the San Francisco Bay Area of California

    NASA Astrophysics Data System (ADS)

    Guha, A.; Newman, S.; Martien, P. T.; Young, A.; Hilken, H.; Faloona, I. C.; Conley, S.

    2017-12-01

    The Bay Area Air Quality Management District, the San Francisco Bay Area's air quality regulatory agency, has set a goal to reduce the region's greenhouse gas (GHG) emissions 80% below 1990 levels by 2050, consistent with the State of California's climate protection goal. The Air District maintains a regional GHG emissions inventory that includes emissions estimates and projections which influence the agency's programs and regulatory activities. The Air District is currently working to better characterize methane emissions in the GHG inventory through source-specific measurements, to resolve differences between top-down regional estimates (Fairley and Fischer, 2015; Jeong et al., 2016) and the bottom-up inventory. The Air District funded and participated in a study in Fall 2016 to quantify methane emissions from a variety of sources from an instrumented Mooney aircraft. This study included 40 hours of cylindrical vertical profile flights that combined methane and wind measurements to derive mass emission rates. Simultaneous measurements of ethane provided source-apportionment between fossil-based and biological methane sources. The facilities sampled included all five refineries in the region, five landfills, two dairy farms and three wastewater treatment plants. The calculated mass emission rates were compared to bottom-up rates generated by the Air District and to those from facility reports to the US EPA as part of the mandatory GHG reporting program. Carbon dioxide emission rates from refineries are found to be similar to bottom-up estimates for all sources, supporting the efficacy of the airborne measurement methodology. However, methane emission estimates from the airborne method showed significant differences for some source categories. 
For example, methane emission estimates based on airborne measurements were up to an order of magnitude higher for refineries, and up to five times higher for landfills, than bottom-up estimates, suggesting significant underestimation in the inventories and self-reported values. Future measurements over the same facilities will reveal whether there are seasonal and process-dependent trends in emissions. This will provide a basis for rule making and for designing mitigation and control actions.
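    The core of the airborne mass-balance calculation can be sketched as a discrete flux integral of the methane enhancement through the wall of the flight cylinder. This is a generic sketch, not the study's processing code: it assumes the enhancements have already been converted from mole fraction to mass concentration, and all numbers are illustrative.

```python
def mass_balance_emission_rate(enhancements, normal_winds, segment_areas):
    """Discrete flux integral Q = sum(dc_i * u_i * A_i) over wall segments.
    enhancements: CH4 enhancement over background per segment (kg/m^3).
    normal_winds: wind component normal to the wall, outward positive (m/s).
    segment_areas: area of each wall segment (m^2).
    Returns the facility emission rate in kg/s."""
    return sum(dc * u * a
               for dc, u, a in zip(enhancements, normal_winds, segment_areas))

# Hypothetical four-segment cylinder wall
q = mass_balance_emission_rate(
    enhancements=[2e-7, 1e-7, 0.0, 5e-8],   # kg CH4 per m^3
    normal_winds=[4.0, 3.5, 3.0, 4.2],      # m/s
    segment_areas=[5e4, 5e4, 5e4, 5e4],     # m^2
)
print(q)  # emission rate in kg/s
```

The closed flight path makes the inflow/outflow bookkeeping implicit: segments on the upwind side carry near-zero enhancement and contribute little to the sum.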

  2. Task Decomposition in Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald Laurids; Joe, Jeffrey Clark

    2014-06-01

    In the probabilistic safety assessments (PSAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human-factors-driven approaches would tend to look first for opportunities for human error in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question remains central as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PSAs tend to be top-down, defined as a subset of the PSA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) are more likely to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  3. Characterization of a Louisiana Bay Bottom

    NASA Astrophysics Data System (ADS)

    Freeman, A. M.; Roberts, H. H.

    2016-02-01

    This study correlates side-scan sonar and CHIRP water bottom-subbottom acoustic amplitudes with cone penetrometer data to expand the limited understanding of the geotechnical properties of sediments in coastal Louisiana's bays. Standardized analysis procedures were developed to characterize the bay bottom and shallow subsurface of Sister Lake. The CHIRP subbottom acoustic data provide relative amplitude information regarding reflection horizons of the bay bottom and shallow subsurface. An amplitude analysis technique was designed to identify different reflectance regions within the lake from the CHIRP subbottom profile data. This amplitude reflectivity analysis provides insight into the relative hardness of the bay bottom and shallow subsurface, useful for identifying areas of erosion versus deposition from storms, as well as areas suitable for cultch plants for state oyster seed grounds or other restoration projects. Side-scan and CHIRP amplitude reflectivity results are compared to penetrometer data that quantify the geotechnical properties of surface and near-surface sediments. Initial results indicate distinct penetrometer signatures that characterize different substrate areas, including soft bottom, storm-deposited silt-rich sediments, oyster cultch, and natural oyster reef. Although amplitude analysis of high-resolution acoustic data does not directly quantify the geotechnical properties of bottom sediments, our analysis indicates a close relationship. The procedures developed in this study can be applied in other dynamic coastal environments, "calibrating" the use of synoptic acoustic methods for large-scale water bottom characterization.

  4. Bottom Topographic Changes of Poyang Lake During Past Decade Using Multi-temporal Satellite Images

    NASA Astrophysics Data System (ADS)

    Zhang, S.

    2015-12-01

    Poyang Lake, a well-known international wetland on the Ramsar Convention List, is the largest freshwater lake in China. It plays a crucial ecological role in flood storage and biological diversity. Poyang Lake faces increasingly serious water crises, including seasonal dry-up, decreased wetland area, and water resource shortage, all of which are closely related to progressive bottom topographic changes over recent years. A time series of bottom topography would contribute to our understanding of the lake's evolution during the past several decades. However, commonly used methods for mapping bottom topography fail to provide frequent, high-quality bathymetric updates for Poyang Lake because of weather and accessibility constraints. These deficiencies have limited our ability to characterize bottom topographic changes and to understand the lake's erosion and deposition trends. To fill the gap, we construct a decadal bottom topography of Poyang Lake from a time series of 146 medium-resolution satellite images using the Waterline Method. We find that Poyang Lake eroded at a rate of -14.4 cm/yr from 2000 to 2010. The erosion trend is attributed to the impacts of human activities, especially the operation of the Three Gorges Dam, sand excavation, and the implementation of water conservancy projects. A decadal, quantitative record of the bottom topography of Poyang Lake may provide a foundation for modeling the lake's evolutionary processes and assist both researchers and local policymakers in ecological management, wetland protection, and lake navigation safety.
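    A lake-wide erosion rate such as the -14.4 cm/yr reported above is, at heart, the least-squares slope of a bed-elevation time series. A minimal sketch, with hypothetical elevations constructed to decline at exactly that rate:

```python
def linear_rate(years, elevations):
    """Ordinary least-squares slope of elevation vs. time:
    the mean elevation change per year (same units as elevations)."""
    n = len(years)
    my = sum(years) / n
    me = sum(elevations) / n
    num = sum((y - my) * (e - me) for y, e in zip(years, elevations))
    den = sum((y - my) ** 2 for y in years)
    return num / den

# Hypothetical waterline-derived bed elevations (cm, relative to a datum)
years = list(range(2000, 2011))
elev = [-14.4 * (y - 2000) for y in years]
rate = linear_rate(years, elev)
print(rate)  # about -14.4 cm/yr
```

In practice this fit would be applied per pixel (or per region) of the reconstructed topography, with real elevations carrying noise from the waterline extraction.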

  5. Mapping the Binding Interface of VEGF and a Monoclonal Antibody Fab-1 Fragment with Fast Photochemical Oxidation of Proteins (FPOP) and Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Zhang, Ying; Wecksler, Aaron T.; Molina, Patricia; Deperalta, Galahad; Gross, Michael L.

    2017-05-01

    We previously analyzed the Fab-1:VEGF (vascular endothelial growth factor) system described in this work, with both native top-down mass spectrometry and bottom-up mass spectrometry (carboxyl-group or GEE footprinting) techniques. This work continues bottom-up mass spectrometry analysis using a fast photochemical oxidation of proteins (FPOP) platform to map the solution binding interface of VEGF and a fragment antigen binding region of an antibody (Fab-1). In this study, we use FPOP to compare the changes in solvent accessibility by quantitating the extent of oxidative modification in the unbound versus bound states. Determining the changes in solvent accessibility enables the inference of the protein binding sites (epitope and paratopes) and a comparison to the previously published Fab-1:VEGF crystal structure, adding to the top-down and bottom-up data. Using this method, we investigated peptide-level and residue-level changes in solvent accessibility between the unbound proteins and bound complex. Mapping these data onto the Fab-1:VEGF crystal structure enabled successful characterization of both the binding region and regions of remote conformation changes. These data, coupled with our previous higher order structure (HOS) studies, demonstrate the value of a comprehensive toolbox of methods for identifying the putative epitopes and paratopes for biotherapeutic antibodies.
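    At the peptide level, comparing the unbound and bound states reduces to comparing extents of oxidative modification; a drop upon binding implies reduced solvent accessibility and a candidate binding site. A minimal sketch with hypothetical peak intensities (the real workflow quantifies chromatographic peak areas across many peptides and residues):

```python
def extent_modified(oxidized_intensity, unmodified_intensity):
    """Fraction of total signal carrying the oxidative modification."""
    return oxidized_intensity / (oxidized_intensity + unmodified_intensity)

# Hypothetical intensities for one peptide, unbound vs. in the complex
free = extent_modified(3.0e6, 7.0e6)    # oxidized fraction when unbound
bound = extent_modified(1.0e6, 9.0e6)   # oxidized fraction in the complex
protection = free - bound
print(protection)  # positive -> protected upon binding (putative epitope/paratope)
```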

  6. Bottom-Up or Top-Down: English as a Foreign Language Vocabulary Instruction for Chinese University Students

    ERIC Educational Resources Information Center

    Moskovsky, Christo; Jiang, Guowu; Libert, Alan; Fagan, Seamus

    2015-01-01

    Whereas there has been some research on the role of bottom-up and top-down processing in the learning of a second or foreign language, very little attention has been given to bottom-up and top-down instructional approaches to language teaching. The research reported here used a quasi-experimental design to assess the relative effectiveness of two…

  7. Method for encapsulating hazardous wastes using a staged mold

    DOEpatents

    Unger, Samuel L.; Telles, Rodney W.; Lubowitz, Hyman R.

    1989-01-01

    A staged mold and method for stabilizing hazardous wastes for final disposal by molding an agglomerate of the hazardous wastes and encapsulating the agglomerate. Three stages are employed in the process. In the first stage, a first mold body is positioned on a first mold base, a mixture of the hazardous wastes and a thermosetting plastic is loaded into the mold, the mixture is mechanically compressed, heat is applied to cure the mixture to form a rigid agglomerate, and the first mold body is removed leaving the agglomerate sitting on the first mold base. In the second stage, a clamshell second mold body is positioned around the agglomerate and the first mold base, a powdered thermoplastic resin is poured on top of the agglomerate and in the gap between the sides of the agglomerate and the second mold body, the thermoplastic is compressed, heat is applied to melt the thermoplastic, and the plastic is cooled jacketing the agglomerate on the top and sides. In the third stage, the mold with the jacketed agglomerate is inverted, the first mold base is removed exposing the former bottom of the agglomerate, powdered thermoplastic is poured over the former bottom, the first mold base is replaced to compress the thermoplastic, heat is applied to melt the new thermoplastic and the top part of the jacket on the sides, the plastic is cooled jacketing the bottom and fusing with the jacketing on the sides to complete the seamless encapsulation of the agglomerate.

  8. A kinetic flux vector splitting scheme for shallow water equations incorporating variable bottom topography and horizontal temperature gradients.

    PubMed

    Saleem, M Rehan; Ashraf, Waqas; Zia, Saqib; Ali, Ishtiaq; Qamar, Shamsul

    2018-01-01

    This paper is concerned with the derivation of a well-balanced kinetic scheme to approximate a shallow flow model incorporating non-flat bottom topography and horizontal temperature gradients. The model equations considered, also called the Ripa system, are the non-homogeneous shallow water equations augmented with temperature gradients and non-uniform bottom topography. Due to the presence of temperature gradient terms, the steady state at rest is of primary interest from the physical point of view. However, capturing this steady state is a challenging task for the applied numerical methods. The proposed well-balanced kinetic flux vector splitting (KFVS) scheme is non-oscillatory and second-order accurate. The second-order accuracy of the scheme is obtained by considering a MUSCL-type initial reconstruction and a Runge-Kutta time stepping method. The scheme is applied to solve the model equations in one and two space dimensions. Several numerical case studies are carried out to validate the proposed numerical algorithm. The numerical results obtained are compared with those of the staggered central NT scheme. The results are also in good agreement with recently published results in the literature, verifying the potential, efficiency, accuracy and robustness of the suggested numerical scheme.
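    The MUSCL-type reconstruction that yields second-order, non-oscillatory behavior can be sketched generically as minmod-limited linear interface states (a textbook sketch, not the authors' implementation):

```python
def minmod(a, b):
    """Slope limiter: zero at extrema, otherwise the smaller-magnitude slope."""
    if a * b <= 0.0:
        return 0.0
    return min(a, b) if a > 0.0 else max(a, b)

def muscl_interface_states(u):
    """Given cell averages u[0..n-1], return the limited left/right states
    at each interior interface i+1/2 (for i = 1 .. n-3)."""
    left, right = [], []
    for i in range(1, len(u) - 2):
        s_i = minmod(u[i] - u[i - 1], u[i + 1] - u[i])
        s_ip1 = minmod(u[i + 1] - u[i], u[i + 2] - u[i + 1])
        left.append(u[i] + 0.5 * s_i)          # extrapolated from cell i
        right.append(u[i + 1] - 0.5 * s_ip1)   # extrapolated from cell i+1
    return left, right

# On smooth linear data the reconstruction is exact at the interfaces
L, R = muscl_interface_states([0.0, 1.0, 2.0, 3.0, 4.0])
print(L, R)  # [1.5, 2.5] [1.5, 2.5]
```

These interface states feed the kinetic flux evaluation; well-balancing additionally requires that the bottom-topography and temperature source terms be discretized so the lake-at-rest state is preserved exactly.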

  9. A kinetic flux vector splitting scheme for shallow water equations incorporating variable bottom topography and horizontal temperature gradients

    PubMed Central

    2018-01-01

    This paper is concerned with the derivation of a well-balanced kinetic scheme to approximate a shallow flow model incorporating non-flat bottom topography and horizontal temperature gradients. The model equations considered, also called the Ripa system, are the non-homogeneous shallow water equations augmented with temperature gradients and non-uniform bottom topography. Due to the presence of temperature gradient terms, the steady state at rest is of primary interest from the physical point of view. However, capturing this steady state is a challenging task for the applied numerical methods. The proposed well-balanced kinetic flux vector splitting (KFVS) scheme is non-oscillatory and second-order accurate. The second-order accuracy of the scheme is obtained by considering a MUSCL-type initial reconstruction and a Runge-Kutta time stepping method. The scheme is applied to solve the model equations in one and two space dimensions. Several numerical case studies are carried out to validate the proposed numerical algorithm. The numerical results obtained are compared with those of the staggered central NT scheme. The results are also in good agreement with recently published results in the literature, verifying the potential, efficiency, accuracy and robustness of the suggested numerical scheme. PMID:29851978

  10. A fusion of top-down and bottom-up modeling techniques to constrain regional scale carbon budgets

    NASA Astrophysics Data System (ADS)

    Goeckede, M.; Turner, D. P.; Michalak, A. M.; Vickers, D.; Law, B. E.

    2009-12-01

    The effort to constrain regional scale carbon budgets benefits from assimilating as many high quality data sources as possible in order to reduce uncertainties. Two of the most common approaches used in this field, bottom-up and top-down techniques, both have their strengths and weaknesses, and partly build on very different sources of information to train, drive, and validate the models. Within the context of the ORCA2 project, we follow both bottom-up and top-down modeling strategies with the ultimate objective of reconciling their surface flux estimates. The ORCA2 top-down component builds on a coupled WRF-STILT transport module that resolves the footprint function of a CO2 concentration measurement in high temporal and spatial resolution. Datasets involved in the current setup comprise GDAS meteorology, remote sensing products, VULCAN fossil fuel inventories, boundary conditions from CarbonTracker, and high-accuracy time series of atmospheric CO2 concentrations. Surface fluxes of CO2 are normally provided through a simple diagnostic model which is optimized against atmospheric observations. For the present study, we replaced the simple model with fluxes generated by an advanced bottom-up process model, Biome-BGC, which uses state-of-the-art algorithms to resolve plant-physiological processes, and 'grow' a biosphere based on biogeochemical conditions and climate history. This approach provides a more realistic description of biomass and nutrient pools than is the case for the simple model. The process model ingests various remote sensing data sources as well as high-resolution reanalysis meteorology, and can be trained against biometric inventories and eddy-covariance data. 
Linking the bottom-up flux fields to the atmospheric CO2 concentrations through the transport module allows us to evaluate the spatial representativeness of the BGC flux fields, and in that way assimilates more of the available information than either of the individual modeling techniques alone. Bayesian inversion is then applied to assign scaling factors that align the surface fluxes with the CO2 time series. Our project demonstrates how bottom-up and top-down techniques can be reconciled to arrive at a more robust and balanced spatial carbon budget. We will show how to evaluate existing flux products against regionally representative atmospheric observations, i.e. how well the underlying model assumptions represent processes on the regional scale. Adapting process model parameterization sets for, e.g., sub-regions, disturbance regimes, or land cover classes, in order to optimize the agreement between surface fluxes and atmospheric observations, can lead to improved understanding of the underlying flux mechanisms and reduce uncertainties in the regional carbon budgets.
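    The Bayesian inversion step can be sketched in its simplest scalar form: a Gaussian prior on a single flux scaling factor is updated by concentration observations weighted by their footprint sensitivities. The real inversion is multivariate with full covariances; all numbers here are hypothetical.

```python
def posterior_scaling(obs, footprints, s_prior, q, r):
    """Conjugate Gaussian update for one scaling factor s, assuming
    z_i = h_i * s + noise, noise ~ N(0, r), prior s ~ N(s_prior, q).
    Returns the posterior mean of s."""
    precision = 1.0 / q + sum(h * h / r for h in footprints)
    numerator = s_prior / q + sum(h * z / r for h, z in zip(footprints, obs))
    return numerator / precision

# Hypothetical case: observations generated by a true factor of 1.5,
# prior mean 1.0 -> the posterior is pulled toward 1.5
s = posterior_scaling(obs=[3.0, 1.5, 4.5], footprints=[2.0, 1.0, 3.0],
                      s_prior=1.0, q=1.0, r=0.01)
print(s)  # close to 1.5
```

With a tight observation error r the posterior approaches the footprint-weighted least-squares estimate; with a tight prior variance q it stays near the bottom-up value, which is exactly the top-down/bottom-up balancing the abstract describes.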

  11. Solid-Phase Extraction Strategies to Surmount Body Fluid Sample Complexity in High-Throughput Mass Spectrometry-Based Proteomics

    PubMed Central

    Bladergroen, Marco R.; van der Burgt, Yuri E. M.

    2015-01-01

    For large-scale and standardized applications in mass spectrometry- (MS-) based proteomics, automation of each step is essential. Here we present high-throughput sample preparation solutions for balancing the speed of current MS acquisitions and the time needed for analytical workup of body fluids. The discussed workflows reduce body fluid sample complexity and apply to both bottom-up proteomics experiments and top-down protein characterization approaches. Various sample preparation methods that involve solid-phase extraction (SPE), including affinity enrichment strategies, have been automated. The peptide and protein fractions obtained can be mass analyzed by direct infusion into an electrospray ionization (ESI) source or by means of matrix-assisted laser desorption ionization (MALDI), without further need of time-consuming liquid chromatography (LC) separations. PMID:25692071

  12. Text line extraction in free style document

    NASA Astrophysics Data System (ADS)

    Shen, Xiaolu; Liu, Changsong; Ding, Xiaoqing; Zou, Yanming

    2009-01-01

    This paper addresses text line extraction in free style documents, such as business cards, envelopes, and posters. In a free style document, global properties such as character size and line direction can hardly be inferred, which poses a serious limitation for traditional layout analysis. The 'line' is the most prominent and highest-level structure in our bottom-up method. First, we apply a novel intensity function based on gradient information to locate text areas, where gradients within a window have large magnitude and varied directions, and split such areas into text pieces. We then build a probability model of lines consisting of text pieces via statistics on training data. For an input image, we group text pieces into lines using a simulated annealing algorithm with a cost function based on the probability model.
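    The gradient-based intensity cue, a window scoring high when its gradients are both strong and varied in direction, might be sketched as below. The bin count, magnitude threshold, and score form are illustrative assumptions, not the authors' exact function.

```python
import math

def text_intensity(gradients, n_bins=8, mag_threshold=1.0):
    """Score a window by mean gradient magnitude times the number of
    occupied orientation bins. gradients: (gx, gy) per pixel."""
    mags, bins = [], set()
    for gx, gy in gradients:
        m = math.hypot(gx, gy)
        mags.append(m)
        if m >= mag_threshold:
            angle = math.atan2(gy, gx) % math.pi   # orientation in [0, pi)
            bins.add(int(angle / math.pi * n_bins))
    return (sum(mags) / len(mags)) * len(bins)

# Text-like strokes (several directions) outscore a plain straight edge
text_like = text_intensity([(3, 0), (0, 3), (2, 2), (-3, 1)])
edge_like = text_intensity([(3, 0), (3, 0), (3, 0), (3, 0)])
print(text_like > edge_like)  # True
```

Sliding such a window over the image highlights character strokes while suppressing long straight contours such as ruled lines or card borders.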

  13. Spatially-resolved aircraft-based quantification of methane emissions from the Fayetteville Shale Gas Play

    NASA Astrophysics Data System (ADS)

    Schwietzke, S.; Petron, G.; Conley, S. A.; Karion, A.; Tans, P. P.; Wolter, S.; King, C. W.; White, A. B.; Coleman, T.; Bianco, L.; Schnell, R. C.

    2016-12-01

    Confidence in basin-scale oil and gas industry related methane (CH4) emission estimates hinges on an in-depth understanding, objective evaluation, and continued improvement of both top-down (e.g. aircraft measurement based) and bottom-up (e.g. emission inventories using facility- and/or component-level measurements) approaches. Systematic discrepancies between CH4 emission estimates from the two approaches in the literature have highlighted research gaps. This paper is part of a more comprehensive study to expand and improve this reconciliation effort for a US dry shale gas play. This presentation will focus on refinements of the aircraft mass balance method that reduce potential biases in both the data and the methodology. The refinements include (i) an in-depth exploration of the definition of upwind conditions and their impact on calculated downwind CH4 enhancements and total CH4 emissions, (ii) taking into account small but non-zero vertical and horizontal wind gradients in the boundary layer, and (iii) characterizing the spatial distribution of CH4 emissions in the study area using aircraft measurements. For the first time to our knowledge, we apply the aircraft mass balance method to calculate spatially resolved total CH4 emissions for 10 km x 60 km sub-regions within the study area. We identify higher-emitting sub-regions and localize repeating emission patterns as well as differences between days. The increased resolution of the top-down calculation will for the first time allow an in-depth comparison with a spatially and temporally resolved bottom-up emission estimate based on measurements, concurrent activity data and other data sources.

  14. The meaning of functional trait composition of food webs for ecosystem functioning.

    PubMed

    Gravel, Dominique; Albouy, Camille; Thuiller, Wilfried

    2016-05-19

    There is growing interest in using trait-based approaches to characterize the functional structure of animal communities. Quantitative methods have been derived mostly for plant ecology, but it is now common to characterize the functional composition of various systems such as soils, coral reefs, pelagic food webs or terrestrial vertebrate communities. With the ever-increasing availability of distribution and trait data, a quantitative method to represent the different roles of animals in a community promises to reveal generalities that will facilitate cross-system comparisons. There is, however, currently no theory relating the functional composition of food webs to their dynamics and properties. The intuitive interpretation that more functional diversity leads to higher resource exploitation and better ecosystem functioning was brought from plant ecology and does not apply readily to food webs. Here we appraise whether there are interpretable metrics to describe the functional composition of food webs that could foster a better understanding of their structure and functioning. We first distinguish the various roles that traits play in food web topology, resource extraction (bottom-up effects), trophic regulation (top-down effects), and the ability to keep energy and materials within the community. We then discuss positive effects of functional trait diversity on food webs, such as niche construction and bottom-up effects. We follow with a discussion of the negative effects of functional diversity, such as enhanced competition (both exploitation and apparent) and top-down control. Our review reveals that most of our current understanding of the impact of functional trait diversity on food web properties and functioning comes from an over-simplistic representation of network structure with well-defined levels. We therefore conclude with propositions for new research avenues for both theoreticians and empiricists. © 2016 The Author(s).

  15. The meaning of functional trait composition of food webs for ecosystem functioning

    PubMed Central

    Albouy, Camille

    2016-01-01

    There is growing interest in using trait-based approaches to characterize the functional structure of animal communities. Quantitative methods have been derived mostly for plant ecology, but it is now common to characterize the functional composition of various systems such as soils, coral reefs, pelagic food webs or terrestrial vertebrate communities. With the ever-increasing availability of distribution and trait data, a quantitative method to represent the different roles of animals in a community promises to reveal generalities that will facilitate cross-system comparisons. There is, however, currently no theory relating the functional composition of food webs to their dynamics and properties. The intuitive interpretation that more functional diversity leads to higher resource exploitation and better ecosystem functioning was brought from plant ecology and does not apply readily to food webs. Here we appraise whether there are interpretable metrics to describe the functional composition of food webs that could foster a better understanding of their structure and functioning. We first distinguish the various roles that traits play in food web topology, resource extraction (bottom-up effects), trophic regulation (top-down effects), and the ability to keep energy and materials within the community. We then discuss positive effects of functional trait diversity on food webs, such as niche construction and bottom-up effects. We follow with a discussion of the negative effects of functional diversity, such as enhanced competition (both exploitation and apparent) and top-down control. Our review reveals that most of our current understanding of the impact of functional trait diversity on food web properties and functioning comes from an over-simplistic representation of network structure with well-defined levels. We therefore conclude with propositions for new research avenues for both theoreticians and empiricists. PMID:27114571

  16. Bottom-up synthetic biology: modular design for making artificial platelets

    NASA Astrophysics Data System (ADS)

    Majumder, Sagardip; Liu, Allen P.

    2018-01-01

    Engineering artificial cells to mimic one or more fundamental cell biological functions is an emerging area of synthetic biology. Reconstituting functional modules from biological components in vitro is challenging, yet it lies at the heart of bottom-up synthetic biology. Here we describe the concept of building artificial platelets using bottom-up synthetic biology and the four functional modules that together could enable such an ambitious effort.

  17. Change Levers for Unifying Top-Down and Bottom-Up Approaches to the Adoption and Diffusion of e-Learning in Higher Education

    ERIC Educational Resources Information Center

    Singh, Gurmak; Hardaker, Glenn

    2017-01-01

    Using Giddens' theory of structuration as a theoretical framework, this paper outlines how five prominent United Kingdom universities aimed to integrate top-down and bottom-up approaches to the adoption and diffusion of e-learning. The aim of this paper is to examine the major challenges that arise from the convergence of bottom-up perspectives…

  18. New, national bottom-up estimate for tree-based biological ...

    EPA Pesticide Factsheets

    Nitrogen is a limiting nutrient in many ecosystems, but is also a chief pollutant from human activity. Quantifying human impacts on the nitrogen cycle and investigating natural ecosystem nitrogen cycling both require an understanding of the magnitude of nitrogen inputs from biological nitrogen fixation (BNF). A bottom-up approach to estimating BNF, scaling rates up from measurements to broader scales, is attractive because it is rooted in actual BNF measurements. However, bottom-up approaches have been hindered by scaling difficulties, and a recent top-down approach suggested that the previous bottom-up estimate was much too large. Here, we used a bottom-up approach for tree-based BNF, overcoming scaling difficulties with the systematic, immense (>70,000 N-fixing trees) Forest Inventory and Analysis (FIA) database. We employed two approaches to estimate species-specific BNF rates: published ecosystem-scale rates (kg N ha-1 yr-1) and published estimates of the percent of N derived from the atmosphere (%Ndfa) combined with FIA-derived growth rates. Species-specific rates can vary for a variety of reasons, so for each approach we examined how different assumptions influenced our results. Specifically, we allowed BNF rates to vary with stand age, N-fixer density, and canopy position (since N-fixation is known to require substantial light). Our estimates from this bottom-up technique are several orders of magnitude lower than previous estimates, indicating…
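    The second estimation approach described in the abstract amounts to multiplying a species' growth-derived nitrogen accretion by the fraction of nitrogen derived from the atmosphere. A minimal sketch of that calculation (function name and units are illustrative, not taken from the paper; the FIA growth-to-nitrogen conversion step is not shown):

    ```python
    def bnf_rate(n_accretion_kg_ha_yr, pct_ndfa):
        """Species-level BNF estimate (kg N ha-1 yr-1): nitrogen accumulated
        through growth multiplied by the percent of N derived from the
        atmosphere (%Ndfa). A hedged sketch of the %Ndfa-based approach
        described in the abstract."""
        return n_accretion_kg_ha_yr * pct_ndfa / 100.0
    ```

    Scaling to a national estimate would then sum such species-specific rates over the area occupied by each N-fixing species in the FIA database.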

  19. Cruise report for P1-13-LA, U.S. Geological Survey gas hydrates research cruise, R/V Pelican April 18 to May 3, 2013, deepwater Gulf of Mexico

    USGS Publications Warehouse

    Haines, Seth S.; Hart, Patrick E.; Ruppel, Carolyn; O'Brien, Thomas; Baldwin, Wayne; White, Jenny; Moore, Eric; Dal Ferro, Peter; Lemmond, Peter

    2014-01-01

    The U.S. Geological Survey led a seismic acquisition cruise in the Gulf of Mexico from April 18 to May 3, 2013, with the objectives of (1) achieving improved imaging and characterization at two established gas hydrate study sites, and (2) refining geophysical methods for gas hydrate characterization in other locations. We conducted this acquisition aboard the R/V Pelican, and used a pair of 105/105-cubic-inch generator/injector air guns to provide seismic energy that we recorded using a 450-meter 72-channel digital hydrophone streamer and 25 multicomponent ocean-bottom seismometers. In the area of lease block Green Canyon 955, we deployed 21 ocean-bottom seismometers and acquired approximately 400 kilometers of high-resolution two-dimensional streamer seismic data in a grid with line spacing as small as 50 meters and along radial lines that provide source offsets up to 10 kilometers and diverse azimuths for the ocean-bottom seismometers. In the area of lease block Walker Ridge 313, we deployed 25 ocean-bottom seismometers and acquired approximately 450 kilometers of streamer seismic data in a grid pattern with line spacing as small as 250 meters and along radial lines that provide source offsets up to 10 kilometers for the ocean-bottom seismometers. The data acquisition effort was conducted safely and met the scientific objectives.

  20. Climate-mediated changes in marine ecosystem regulation during El Niño.

    PubMed

    Lindegren, Martin; Checkley, David M; Koslow, Julian A; Goericke, Ralf; Ohman, Mark D

    2018-02-01

    The degree to which ecosystems are regulated through bottom-up, top-down, or direct physical processes represents a long-standing issue in ecology, with important consequences for resource management and conservation. In marine ecosystems, the role of bottom-up and top-down forcing has been shown to vary over spatio-temporal scales, often linked to highly variable and heterogeneously distributed environmental conditions. Ecosystem dynamics in the Northeast Pacific have been suggested to be predominately bottom-up regulated. However, it remains unknown to what extent top-down regulation occurs, or whether the relative importance of bottom-up and top-down forcing may shift in response to climate change. In this study, we investigate the effects and relative importance of bottom-up, top-down, and physical forcing during changing climate conditions on ecosystem regulation in the Southern California Current System (SCCS) using a generalized food web model. This statistical approach is based on nonlinear threshold models and a long-term data set (~60 years) covering multiple trophic levels from phytoplankton to predatory fish. We found bottom-up control to be the primary mode of ecosystem regulation. However, our results also demonstrate an alternative mode of regulation represented by interacting bottom-up and top-down forcing, analogous to wasp-waist dynamics, but occurring across multiple trophic levels and only during periods of reduced bottom-up forcing (i.e., weak upwelling, low nutrient concentrations, and primary production). The shifts in ecosystem regulation are caused by changes in ocean-atmosphere forcing and triggered by highly variable climate conditions associated with El Niño. Furthermore, we show that biota respond differently to major El Niño events during positive or negative phases of the Pacific Decadal Oscillation (PDO), as well as highlight potential concerns for marine and fisheries management by demonstrating increased sensitivity of pelagic fish to exploitation during El Niño. © 2017 John Wiley & Sons Ltd.

  1. Study on the Variation of Groundwater Level under Time-varying Recharge

    NASA Astrophysics Data System (ADS)

    Wu, Ming-Chang; Hsieh, Ping-Cheng

    2017-04-01

    Hillslopes on the urban fringe have become important areas for soil and water conservation work in recent years. The water table in an aquifer responds to rainfall, geology and topography, which in turn changes the groundwater discharge and water level. At present, water-table information is obtained by installing observation wells; however, because equipping and excavating wells is expensive, we develop a mathematical model instead to simulate groundwater-level variation. In this study, we discuss the change of groundwater level in a sloping unconfined aquifer with an impermeable bottom under time-varying rainfall events. Referring to Child (1971), we employ the Boussinesq equation as the governing equation and, after linearizing it, apply the General Integral Transforms Method (GITM) to analyze the groundwater level. Comparison of the solution with Verhoest & Troch (2000) and Bansal & Das (2010) gives satisfactory results. In summary, we present an alternative approach to solving the linearized Boussinesq equation for the response of the groundwater level in a sloping unconfined aquifer. The analytical results combine the effects of the bottom slope and the time-varying recharge pattern on water-table fluctuations. Given the limitations and difficulty of measuring groundwater levels directly, such a model allows the variation of the groundwater level under any rainfall event to be predicted or simulated in advance.
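    For reference, the linearized Boussinesq equation that this class of sloping-aquifer studies builds on can be sketched as follows (conventional notation in the spirit of Verhoest & Troch (2000) and Bansal & Das (2010); the symbols are standard choices, not copied from this paper):

    ```latex
    % Linearized Boussinesq equation for a sloping unconfined aquifer
    % (a sketch in conventional notation, not the paper's exact formulation):
    %   h(x,t): water-table height above the sloping impermeable base
    %   K: hydraulic conductivity, f: drainable porosity, \theta: bed slope
    %   \bar{D}: constant linearization depth, N(t): time-varying recharge
    \[
      f \frac{\partial h}{\partial t}
        = K \bar{D} \cos\theta \, \frac{\partial^2 h}{\partial x^2}
        + K \sin\theta \, \frac{\partial h}{\partial x}
        + N(t)
    \]
    ```

    The slope enters through both the diffusion-like and advection-like terms, which is why the bottom slope and the recharge pattern jointly shape the water-table response.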

  2. Top-down proteomics for the analysis of proteolytic events - Methods, applications and perspectives.

    PubMed

    Tholey, Andreas; Becker, Alexander

    2017-11-01

    Mass spectrometry based proteomics is an indispensable tool for almost all research areas relevant to the understanding of proteolytic processing, ranging from the identification of substrates, products and cleavage sites up to the analysis of structural features influencing protease activity. The majority of methods for these studies are based on bottom-up proteomics, performing analysis at the peptide level. As this approach is characterized by a number of pitfalls, e.g. loss of molecular information, there is an ongoing effort to establish top-down proteomics, performing both separation and MS analysis at the intact protein level. We briefly introduce major approaches of bottom-up proteomics used in the field of protease research and highlight the shortcomings of these methods. We then discuss the present state of the art of top-down proteomics. Together with a discussion of the known challenges, we show the potential of this approach and present a number of successful applications of top-down proteomics in protease research. This article is part of a Special Issue entitled: Proteolysis as a Regulatory Event in Pathophysiology edited by Stefan Rose-John. Copyright © 2017 Elsevier B.V. All rights reserved.

  3. New-Generation Aluminum Composite with Bottom Ash Industrial Waste

    NASA Astrophysics Data System (ADS)

    Mandal, A. K.; Sinha, O. P.

    2018-02-01

    Industrial waste bottom ash (BA) from a pulverized coal combustion boiler containing hard wear-resistant particles was utilized in this study to form an aluminum composite through a liquid metallurgy route. Composites comprising 5 wt.% and 10 wt.% bottom ash were characterized for their physiochemical, microstructural, mechanical, as well as tribological properties, along with pure aluminum. Scanning electron microscopy (SEM) microstructure revealed uniform distribution of BA particles throughout the matrix of the composite, whereas x-ray diffraction (XRD) analysis confirmed presence of aluminosilicate phase. Addition of 10 wt.% BA improved the Brinell hardness number (BHN) from 13 to 19 and ultimate tensile strength (UTS) from 71 MPa to 87 MPa, whereas ductility was adversely reduced after 5% BA addition. Incorporation of BA particles resulted in reduced dry sliding wear rates examined up to 80 N load compared with aluminum. Hence, such composites having lower cost could be applied as significantly hard, wear-resistant materials in applications in the automotive industry.

  5. Exploring the underlying structure of mental disorders: cross-diagnostic differences and similarities from a network perspective using both a top-down and a bottom-up approach.

    PubMed

    Wigman, J T W; van Os, J; Borsboom, D; Wardenaar, K J; Epskamp, S; Klippel, A; Viechtbauer, W; Myin-Germeys, I; Wichers, M

    2015-08-01

    It has been suggested that the structure of psychopathology is best described as a complex network of components that interact in dynamic ways. The goal of the present paper was to examine the concept of psychopathology from a network perspective, combining complementary top-down and bottom-up approaches using momentary assessment techniques. A pooled Experience Sampling Method (ESM) dataset of three groups (individuals with a diagnosis of depression, psychotic disorder or no diagnosis) was used (pooled N = 599). The top-down approach explored the network structure of mental states across different diagnostic categories. For this purpose, networks of five momentary mental states ('cheerful', 'content', 'down', 'insecure' and 'suspicious') were compared between the three groups. The complementary bottom-up approach used principal component analysis to explore whether empirically derived network structures yield meaningful higher order clusters. Individuals with a clinical diagnosis had more strongly connected moment-to-moment network structures, especially the depressed group. This group also showed more interconnections specifically between positive and negative mental states than the psychotic group. In the bottom-up approach, all possible connections between mental states were clustered into seven main components that together captured the main characteristics of the network dynamics. Our combination of (i) comparing network structure of mental states across three diagnostically different groups and (ii) searching for trans-diagnostic network components across all pooled individuals showed that these two approaches yield different, complementary perspectives in the field of psychopathology. The network paradigm therefore may be useful to map transdiagnostic processes.

  6. Comparative higher-order structure analysis of antibody biosimilars using combined bottom-up and top-down hydrogen-deuterium exchange mass spectrometry.

    PubMed

    Pan, Jingxi; Zhang, Suping; Borchers, Christoph H

    2016-12-01

    Hydrogen/deuterium exchange (HDX) coupled with mass spectrometry (MS) is a powerful technique for higher-order structural characterization of antibodies. Although the peptide-based bottom-up HDX approach and the protein-based top-down HDX approach have complementary advantages, the work done so far on biosimilars has involved only one or the other approach. Herein we have characterized the structures of two bevacizumab (BEV) biosimilars and compared them to the reference BEV using both methods. A sequence coverage of 87% was obtained for the heavy chain and 74% for the light chain in the bottom-up approach. The deuterium incorporation behavior of the peptic peptides from the three BEVs was compared side by side and showed no differences at various HDX time points. Top-down experiments were carried out using subzero-temperature LC-MS, and the deuterium incorporation of the intact light chain and heavy chain was obtained. Top-down ETD was also performed to obtain amino acid-level HDX information that covered 100% of the light chain, but only 50% coverage was possible for the heavy chain. Consistent with the intact subunit level data, no differences were observed in the amino acid level HDX data. All these results indicate that there are no differences between the three BEV samples with respect to their higher-order structures. The peptide level information from the bottom-up approach, and the residue level and intact subunit level information from the top-down approach, were complementary and covered the entire antibody. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. A Daily Analysis of Physical Activity and Satisfaction with Life in Emerging Adults

    PubMed Central

    Maher, Jaclyn P.; Doerksen, Shawna E.; Elavsky, Steriani; Hyde, Amanda L.; Pincus, Aaron L.; Ram, Nilam; Conroy, David E.

    2014-01-01

    Objective: Subjective well-being has well-established positive health consequences. During emerging adulthood, from ages 18 to 25 years, people's global evaluations of their well-being (i.e., satisfaction with life [SWL]) appear to worsen more than at any other time in the adult lifespan, indicating that this population would benefit from strategies to enhance SWL. In these studies, we investigated top-down (i.e., time-invariant, trait-like) and bottom-up (i.e., time-varying, state-like) influences of physical activity (PA) on daily SWL. Methods: Two daily diary studies lasting 8 days (N = 190) and 14 days (N = 63) were conducted with samples of emerging adults enrolled in college to evaluate relations between daily PA and SWL while controlling for established and plausible top-down and bottom-up influences on SWL. Results: In both studies, multilevel models indicated that people reported greater SWL on days when they were more active (a within-person, bottom-up effect). Top-down effects of PA were not significant in either study. These findings were robust when we controlled for competing top-down influences (e.g., sex, personality traits, self-esteem, body mass index, mental health symptoms, fatigue) and bottom-up influences (e.g., daily self-esteem, daily mental health symptoms, daily fatigue). Conclusions: We concluded that SWL was impacted by people's daily PA rather than their trait level of PA over time. These findings extend evidence that PA is a health behavior with important consequences for daily well-being and should be considered when developing national policies to enhance SWL. PMID:23088171
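    Separating top-down (trait-like) from bottom-up (state-like) influences in such diary data conventionally rests on person-mean centering: each person's daily physical activity is split into a time-invariant person mean and daily deviations around it, which enter a multilevel model as separate predictors. A minimal sketch of that decomposition (illustrative only, not the authors' code):

    ```python
    def person_mean_center(daily_scores):
        """Split each person's daily scores into a trait component (the
        person mean, a top-down/between-person predictor) and state
        components (daily deviations, bottom-up/within-person predictors).
        Standard centering step before fitting a multilevel model."""
        result = {}
        for person, days in daily_scores.items():
            trait = sum(days) / len(days)
            result[person] = (trait, [d - trait for d in days])
        return result
    ```

    By construction the deviations sum to zero within each person, so the trait and state predictors are uncorrelated within person and their effects can be estimated separately.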

  8. Correct primary structure assessment and extensive glyco-profiling of cetuximab by a combination of intact, middle-up, middle-down and bottom-up ESI and MALDI mass spectrometry techniques.

    PubMed

    Ayoub, Daniel; Jabs, Wolfgang; Resemann, Anja; Evers, Waltraud; Evans, Catherine; Main, Laura; Baessmann, Carsten; Wagner-Rousset, Elsa; Suckau, Detlev; Beck, Alain

    2013-01-01

    The European Medicines Agency received recently the first marketing authorization application for a biosimilar monoclonal antibody (mAb) and adopted the final guidelines on biosimilar mAbs and Fc-fusion proteins. The agency requires high similarity between biosimilar and reference products for approval. Specifically, the amino acid sequences must be identical. The glycosylation pattern of the antibody is also often considered to be a very important quality attribute due to its strong effect on quality, safety, immunogenicity, pharmacokinetics and potency. Here, we describe a case study of cetuximab, which has been marketed since 2004. Biosimilar versions of the product are now in the pipelines of numerous therapeutic antibody biosimilar developers. We applied a combination of intact, middle-down, middle-up and bottom-up electrospray ionization and matrix assisted laser desorption ionization mass spectrometry techniques to characterize the amino acid sequence and major post-translational modifications of the marketed cetuximab product, with special emphasis on glycosylation. Our results revealed a sequence error in the reported sequence of the light chain in databases and in publications, thus highlighting the potency of mass spectrometry to establish correct antibody sequences. We were also able to achieve a comprehensive identification of cetuximab's glycoforms and glycosylation profile assessment on both Fab and Fc domains. Taken together, the reported approaches and data form a solid framework for the comparability of antibodies and their biosimilar candidates that could be further applied to routine structural assessments of these and other antibody-based products.

  9. Modeling aspects of the surface reconstruction problem

    NASA Astrophysics Data System (ADS)

    Toth, Charles K.; Melykuti, Gabor

    1994-08-01

    The ultimate goal of digital photogrammetry is to automatically produce digital maps which may in turn form the basis of a GIS. Virtually all work in surface reconstruction deals with various kinds of approximations and constraints that are applied. In this paper we extend these concepts in several ways. For one, matching is performed in object space; thus, matching and densification (modeling) are performed in the same reference system. Another extension concerns the solution of the second sub-problem: rather than simply densifying (interpolating) the surface, we propose to model it. This combined top-down and bottom-up approach is performed in scale space, whereby the model is refined until compatibility between the data and expectations is reached. The paper focuses on the modeling aspects of the surface reconstruction problem. Obviously, the top-down and bottom-up model descriptions ought to be in a form which allows the generation and verification of hypotheses. Another crucial question is the degree of a priori scene knowledge necessary to constrain the solution space.

  10. Efficient O(N) integration for all-electron electronic structure calculation using numeric basis functions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Havu, V.; Fritz Haber Institute of the Max Planck Society, Berlin; Blum, V.

    2009-12-01

    We consider the problem of developing O(N) scaling grid-based operations needed in many central operations when performing electronic structure calculations with numeric atom-centered orbitals as basis functions. We outline the overall formulation of localized algorithms, and specifically the creation of localized grid batches. The choice of the grid partitioning scheme plays an important role in the performance and memory consumption of the grid-based operations. Three different top-down partitioning methods are investigated, and compared with formally more rigorous yet much more expensive bottom-up algorithms. We show that a conceptually simple top-down grid partitioning scheme achieves essentially the same efficiency as the more rigorous bottom-up approaches.
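    The idea of top-down creation of localized grid batches can be illustrated with recursive bisection: split the point set along its longest bounding-box axis until every batch falls below a target size. This is a generic sketch of the concept, not the paper's implementation (names and the median-split heuristic are assumptions):

    ```python
    def partition(points, max_batch=8):
        """Top-down partitioning of 3D grid points into spatially compact
        batches: recursively bisect along the bounding-box axis of largest
        extent until each batch has at most max_batch points."""
        if len(points) <= max_batch:
            return [points]
        # pick the axis along which the point cloud is most extended
        axis = max(range(3), key=lambda d: max(p[d] for p in points)
                                           - min(p[d] for p in points))
        pts = sorted(points, key=lambda p: p[axis])
        mid = len(pts) // 2  # median split keeps the two halves balanced
        return partition(pts[:mid], max_batch) + partition(pts[mid:], max_batch)
    ```

    Because each batch is spatially compact, only the basis functions overlapping that region contribute to its grid operations, which is what bounds the per-batch work and yields overall O(N) scaling.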

  11. A method for evaluating the evolution of clogging: application to the Pampulha Campus infiltration system (Brazil).

    PubMed

    Barraud, S; Gonzalez-Merchan, C; Nascimento, N; Moura, P; Silva, A

    2014-01-01

    In order to evaluate the hydraulic performance of stormwater infiltration trenches, a study was undertaken to assess clogging and its distribution between the bottom and the sides. The method used was based on the calibration of the hydraulic resistance event by event according to Bouwer's model and applied to a demonstration trench in Belo-Horizonte monitored in the framework of the European Project Switch. The calibration was performed by minimizing the distance between measured and modeled infiltration flow rates and by using continuous measurements of rainfall, inflow, water temperature and depth in the trench. The study showed that the methodology and particularly Bouwer's model was able to produce satisfactory results. It revealed a significant clogging evolution within a year, with global resistance increasing by a factor of 9. A significant difference between the bottom and the sides was observed; the bottom being more rapidly prone to clogging. Sudden fluctuations of the hydraulic resistance of the bottom were found that could be explained by very high concentrations of total suspended solids from inflows (about 2,000 mg/L). Clogging of the sides evolves over the time but with a very low rate.
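    The event-by-event calibration step can be illustrated with a deliberately simplified version of Bouwer's model in which infiltration flux through the clogged layer is the water depth divided by a hydraulic resistance R. Under that assumption, fitting R to one event's measurements by least squares has a closed form. This is a hedged stand-in for the actual calibration, which distinguishes bottom and side resistances and uses the full model:

    ```python
    def calibrate_resistance(depths, fluxes):
        """Least-squares hydraulic resistance R for the simplified model
        q = h / R. Minimizing sum((q_i - h_i/R)^2) over R yields
        R = sum(h_i^2) / sum(h_i * q_i). Simplified sketch of the
        event-by-event calibration of Bouwer's model."""
        return sum(h * h for h in depths) / sum(h * q for h, q in zip(depths, fluxes))
    ```

    Tracking R event by event then reveals clogging trends, such as the ninefold increase in global resistance reported over the study year.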

  12. Measurement and simulation of top- and bottom-illuminated solar-blind AlGaN metal-semiconductor-metal photodetectors with high external quantum efficiencies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brendel, Moritz, E-mail: moritz.brendel@fbh-berlin.de; Helbling, Markus; Knigge, Andrea

    2015-12-28

    A comprehensive study on top- and bottom-illuminated Al0.5Ga0.5N/AlN metal-semiconductor-metal (MSM) photodetectors having different AlGaN absorber layer thicknesses is presented. The measured external quantum efficiency (EQE) shows pronounced threshold and saturation behavior as a function of applied bias voltage up to 50 V, reaching about 50% for 0.1 μm and 67% for 0.5 μm thick absorber layers under bottom illumination. All experimental findings are in very good accordance with two-dimensional drift-diffusion modeling results. By taking into account macroscopic polarization effects in the hexagonal metal-polar +c-plane AlGaN/AlN heterostructures, new insights into the general device functionality of AlGaN-based MSM photodetectors are obtained. The observed threshold/saturation behavior is caused by a bias-dependent extraction of photoexcited holes from the Al0.5Ga0.5N/AlN interface. While present under bottom illumination for any AlGaN layer thickness, under top illumination this mechanism influences the EQE-bias characteristics only for thin layers.

  13. Simple and Inexpensive Paper-Based Astrocyte Co-culture to Improve Survival of Low-Density Neuronal Networks

    PubMed Central

    Aebersold, Mathias J.; Thompson-Steckel, Greta; Joutang, Adriane; Schneider, Moritz; Burchert, Conrad; Forró, Csaba; Weydert, Serge; Han, Hana; Vörös, János

    2018-01-01

    Bottom-up neuroscience aims to engineer well-defined networks of neurons to investigate the functions of the brain. By reducing the complexity of the brain to achievable target questions, such in vitro bioassays better control experimental variables and can serve as a versatile tool for fundamental and pharmacological research. Astrocytes are a cell type critical to neuronal function, and the addition of astrocytes to neuron cultures can improve the quality of in vitro assays. Here, we present cellulose as an astrocyte culture substrate. Astrocytes cultured on the cellulose fiber matrix thrived and formed a dense 3D network. We devised a novel co-culture platform by suspending the easy-to-handle astrocytic paper cultures above neuronal networks of low densities typically needed for bottom-up neuroscience. There was significant improvement in neuronal viability after 5 days in vitro at densities ranging from 50,000 cells/cm2 down to isolated cells at 1,000 cells/cm2. Cultures exhibited spontaneous spiking even at the very low densities, with a significantly greater spike frequency per cell compared to control mono-cultures. Applying the co-culture platform to an engineered network of neurons on a patterned substrate resulted in significantly improved viability and almost doubled the density of live cells. Lastly, the shape of the cellulose substrate can easily be customized to a wide range of culture vessels, making the platform versatile for different applications that will further enable research in bottom-up neuroscience and drug development. PMID:29535595

  14. On the representability problem and the physical meaning of coarse-grained models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagner, Jacob W.; Dama, James F.; Durumeric, Aleksander E. P.

    2016-07-28

    In coarse-grained (CG) models where certain fine-grained (FG, i.e., atomistic resolution) observables are not directly represented, one can nonetheless identify indirect CG observables that capture the FG observable's dependence on CG coordinates. Often, in these cases it appears that a CG observable can be defined by analogy to an all-atom or FG observable, but the similarity is misleading and significantly undermines the interpretation of both bottom-up and top-down CG models. Such problems emerge especially clearly in the framework of systematic bottom-up CG modeling, where a direct and transparent correspondence between FG and CG variables establishes precise conditions for consistency between CG observables and underlying FG models. Here we present and investigate these representability challenges and illustrate them via the bottom-up conceptual framework for several simple analytically tractable polymer models. The examples place special focus on the observables of configurational internal energy, entropy, and pressure, which have been at the root of controversy in the CG literature, and also discuss observables that would seem to be entirely missing in the CG representation but can nonetheless be correlated with CG behavior. Though we investigate these problems in the framework of systematic coarse-graining, the lessons apply to top-down CG modeling as well, with crucial implications for simulation at constant pressure and surface tension and for the interpretation of structural and thermodynamic correlations for comparison to experiment.

  15. Chemistry and temperature-assisted dehydrogenation of C60H30 molecules on TiO2(110) surfaces

    NASA Astrophysics Data System (ADS)

    Sánchez-Sánchez, Carlos; Martínez, José Ignacio; Lanzilotto, Valeria; Biddau, Giulio; Gómez-Lor, Berta; Pérez, Rubén; Floreano, Luca; López, María Francisca; Martín-Gago, José Ángel

    2013-10-01

    The thermal induced on-surface chemistry of large polycyclic aromatic hydrocarbons (PAHs) deposited on dielectric substrates is very rich and complex. We evidence temperature-assisted (cyclo)dehydrogenation reactions for C60H30 molecules and the subsequent bottom-up formation of assembled nanostructures, such as nanodomes, on the TiO2(110) surface. To this aim we have deposited, under ultra-high vacuum, a submonolayer coverage of C60H30 and studied, by a combination of experimental techniques (STM, XPS and NEXAFS) and theoretical methods, the different chemical on-surface interaction stages induced by the increasing temperature. We show that room temperature adsorbed molecules exhibit a weak interaction and freely diffuse on the surface, as previously reported for other aromatics. Nevertheless, a slight annealing induces a transition from this (meta)stable configuration into chemisorbed molecules. This adsorbate-surface interaction deforms the C60H30 molecular structure and quenches surface diffusion. Higher annealing temperatures lead to partial dehydrogenation, in which the molecule loses some of the hydrogen atoms and LUMO levels spread in the gap inducing a net total energy gain. Further annealing, up to around 750 K, leads to complete dehydrogenation. At these temperatures the fully dehydrogenated molecules link between them in a bottom-up coupling, forming nanodomes or fullerene-like monodisperse species readily on the dielectric surface. This work opens the door to the use of on-surface chemistry to generate new bottom-up tailored structures directly on high-K dielectric surfaces. Electronic supplementary information (ESI) available. See DOI: 10.1039/c3nr03706a

  16. Model Prediction Results for 2007 Ultrasonic Benchmark Problems

    NASA Astrophysics Data System (ADS)

    Kim, Hak-Joon; Song, Sung-Jin

    2008-02-01

    The World Federation of NDE Centers (WFNDEC) has addressed two types of problems for the 2007 ultrasonic benchmark problems: prediction of side-drilled hole responses with 45° and 60° refracted shear waves, and the effects of surface curvature on the ultrasonic responses of flat-bottomed holes. To solve this year's ultrasonic benchmark problems, we applied multi-Gaussian beam models for the calculation of ultrasonic beam fields, and the Kirchhoff approximation and the separation-of-variables method for the calculation of far-field scattering amplitudes of flat-bottomed holes and side-drilled holes, respectively. In this paper, we present comparisons of model predictions to experiments for side-drilled holes and discuss the effect of interface curvature on ultrasonic responses by comparing peak-to-peak amplitudes of flat-bottomed hole responses for different sizes and interface curvatures.

  17. Microgravity Storage Vessels and Conveying-Line Feeders for Cohesive Regolith

    NASA Technical Reports Server (NTRS)

    Walton, Otis R.; Vollmer, Hubert J.

    2013-01-01

    Under microgravity, the usual methods of placing granular solids into, or extracting them from, containers or storage vessels will not function. Alternative methods are required to provide a motive force to move the material. New configurations for microgravity regolith storage vessels that do not resemble terrestrial silos, hoppers, or tanks are proposed. The microgravity-compatible bulk-material storage vessels and exit feed configurations are designed to reliably empty and feed cohesive material to transfer vessels or conveying ducts or lines without gravity. A controllable motive force drives the cohesive material to the exit opening(s), and provides a reliable means to empty storage vessels and/or to feed microgravity conveying lines. The proposed designs will function equally well in vacuum, or inside of pressurized enclosures. Typical terrestrial granular solids handling and storage equipment will not function under microgravity, since almost all such equipment relies on gravity to at least move material to an exit location or to place it in the bottom of a container. Under microgravity, there effectively are no directions of up or down, and in order to effect movement of material, some other motive force must be applied to the material. The proposed storage vessels utilize dynamic centrifugal force to effect movement of regolith whenever material needs to be removed from the storage vessel. During simple storage, no dynamic motion or forces are required. The rotation rate during emptying can be controlled to ensure that material will move to the desired exit opening, even if the material is highly cohesive, or has acquired an electrostatic charge. 
The general concept of this Swirl Action Utilized for Centrifugal Ejection of Regolith (SAUCER) microgravity storage unit/dynamic feeder is to have an effective slot-hopper (based on the converging angles of the top and bottom conical sections of the vessel) with an exit slot around the entire periphery of the SAUCER. The basic shape of such a unit is like two Chinese straw hats (douli), one upside down on the bottom and another on top; or two wok pans, one upright on the bottom and another inverted on top, with a small gap between the upright and inverted pans or hats around the periphery. A stationary outer ring, much like an unmounted bicycle tire, surrounds the gap between the two coaxial, nearly conical pieces, forming the top and bottom of the unit.

  18. Injury risk functions based on population-based finite element model responses: Application to femurs under dynamic three-point bending.

    PubMed

    Park, Gwansik; Forman, Jason; Kim, Taewung; Panzer, Matthew B; Crandall, Jeff R

    2018-02-28

    The goal of this study was to explore a framework for developing injury risk functions (IRFs) in a bottom-up approach based on the responses of parametrically variable finite element (FE) models representing exemplar populations. First, a parametric femur modeling tool was developed and validated using a subject-specific (SS)-FE modeling approach. Second, principal component analysis and regression were used to identify parametric geometric descriptors of the human femur and the distribution of those factors for 3 target occupant sizes (5th, 50th, and 95th percentile males). Third, distributions of material parameters of cortical bone were obtained from the literature for 3 target occupant ages (25, 50, and 75 years) using regression analysis. A Monte Carlo method was then implemented to generate populations of FE models of the femur for the target occupants, using the parametric femur modeling tool. Simulations were conducted with each of these models under 3-point dynamic bending. Finally, model-based IRFs were developed using logistic regression analysis, based on the moment at fracture observed in the FE simulations. In total, 100 femur FE models incorporating the variation in the population of interest were generated, and 500,000 moments at fracture were observed (applying 5,000 ultimate strains to each of the 100 synthesized femur FE models) for each set of target occupant characteristics. Using the proposed framework, model-based IRFs were developed for 3 target male occupant sizes (5th, 50th, and 95th percentiles) and ages (25, 50, and 75 years). The model-based IRF fell within the 95% confidence interval of the test-based IRF over the range of 15 to 70% injury risk. The 95% confidence interval of the developed IRF was almost in line with the mean curve due to the large number of data points. 
The framework proposed in this study would be beneficial for developing IRFs in a bottom-up manner, with the range of variability informed by population-based FE model responses. Specifically, this method mitigates the uncertainties of applying empirical scaling and may improve IRF fidelity when only a limited number of experimental specimens is available.
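
The Monte Carlo plus logistic-regression step described in this record can be sketched in a few lines. Everything below (the lognormal fracture thresholds, the uniform applied moments, sample sizes, and optimizer settings) is an illustrative assumption standing in for the paper's FE simulation output, not the study's actual data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the FE simulation output: each synthetic femur
# has a fracture threshold drawn from a lognormal distribution, and is
# loaded to a random applied moment (all values in N*m).
n = 500
thresholds = rng.lognormal(mean=np.log(400.0), sigma=0.15, size=n)
applied = rng.uniform(200.0, 700.0, size=n)
fractured = (applied >= thresholds).astype(float)

# Fit a logistic injury risk function P(fracture | moment) by gradient
# descent on the negative log-likelihood (moments standardized first).
m_mean, m_std = applied.mean(), applied.std()
X = np.column_stack([np.ones(n), (applied - m_mean) / m_std])
w = np.zeros(2)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= 0.1 * (X.T @ (p - fractured)) / n

def risk(moment):
    """Injury risk predicted by the fitted logistic IRF."""
    z = (moment - m_mean) / m_std
    return 1.0 / (1.0 + np.exp(-(w[0] + w[1] * z)))
```

The fitted curve rises monotonically with bending moment, so quantities such as the moment at 50% risk can be read off directly.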

  19. A comparison of top-down and bottom-up approaches to benthic habitat mapping to inform offshore wind energy development

    NASA Astrophysics Data System (ADS)

    LaFrance, Monique; King, John W.; Oakley, Bryan A.; Pratt, Sheldon

    2014-07-01

    Recent interest in offshore renewable energy within the United States has amplified the need for marine spatial planning to direct management strategies and address competing user demands. To assist this effort in Rhode Island, benthic habitat classification maps were developed for two sites in offshore waters being considered for wind turbine installation. Maps characterizing and representing the distribution and extent of benthic habitats are valuable tools for improving understanding of ecosystem patterns and processes, and promoting scientifically-sound management decisions. This project presented the opportunity to conduct a comparison of the methodologies and resulting map outputs of two classification approaches, “top-down” and “bottom-up” in the two study areas. This comparison was undertaken to improve understanding of mapping methodologies and their applicability, including the bottom-up approach in offshore environments where data density tends to be lower, as well as to provide case studies for scientists and managers to consider for their own areas of interest. Such case studies can offer guidance for future work for assessing methodologies and translating them to other areas. The traditional top-down mapping approach identifies biological community patterns based on communities occurring within geologically defined habitat map units, under the concept that geologic environments contain distinct biological assemblages. Alternatively, the bottom-up approach aims to establish habitat map units centered on biological similarity and then uses statistics to identify relationships with associated environmental parameters and determine habitat boundaries. 
When applied to the two study areas, both mapping approaches produced habitat classes with distinct macrofaunal assemblages and each established statistically strong and significant biotic-abiotic relationships with geologic features, sediment characteristics, water depth, and/or habitat heterogeneity over various spatial scales. The approaches were also able to integrate various data at differing spatial resolutions. The classification outputs exhibited similar results, including the number of habitat classes generated, the number of species defining the classes, the level of distinction of the biological communities, and dominance by tube-building amphipods. These results indicate that both approaches are able to discern a comparable degree of habitat variability and produce cohesive macrofaunal assemblages. The mapping approaches identify broadly similar benthic habitats at the two study sites and methods were able to distinguish the differing levels of heterogeneity between them. The top-down approach to habitat classification was faster and simpler to accomplish with the data available in this study when compared to the bottom-up approach. Additionally, the top-down approach generated full-coverage habitat classes that are clearly delineated and can easily be interpreted by the map user, which is desirable from a management perspective for providing a more complete assessment of the areas of interest. However, a higher level of biological variability was noted in some of the habitat classes created, indicating that the biological communities present in this area are influenced by factors not captured in the broad-scale geological habitat units used in this approach. The bottom-up approach was valuable in its ability to more clearly define macrofaunal assemblages among habitats, discern finer-scale habitat characteristics, and directly assess the degree of macrofaunal assemblage variability captured by the environmental parameters. 
From a user perspective, the map is more complex, which may be perceived as a limitation, though likely reflects natural gradations in habitat structure and likely presents a more ecologically realistic portrayal of the study areas. Though more comprehensive, the bottom-up approach in this study was limited by the reliance on full-coverage data to create full-coverage habitat classes. Such classes could only be developed when sediment data was excluded, since this point-sample dataset could not be interpolated due to high spatial heterogeneity of the study areas. Given a higher density of bottom samples, this issue could be rectified. While the top-down approach was more appropriate for this study, both approaches were found to be suitable for mapping and classifying benthic habitats. In the United States, objectives for mapping and classification for renewable energy development have not been well established. Therefore, at this time, the best-suited approach primarily depends on mapping objectives, resource availability, data quality and coverage, and geographical location, as these factors impact the types of data included, the analyses and modeling that can be performed, and the biotic-abiotic relationships identified.

  20. An improved tree height measurement technique tested on mature southern pines

    Treesearch

    Don C. Bragg

    2008-01-01

    Virtually all techniques for tree height determination follow one of two principles: similar triangles or the tangent method. Most people apply the latter approach, which uses the tangents of the angles to the top and bottom and a true horizontal distance to the subject tree. However, few adjust this method for ground slope, tree lean, crown shape, and crown...

  1. Getting in shape: Reconstructing three-dimensional long-track speed skating kinematics by comparing several body pose reconstruction techniques.

    PubMed

    van der Kruk, E; Schwab, A L; van der Helm, F C T; Veeger, H E J

    2018-03-01

    In gait studies, body pose reconstruction (BPR) techniques have been widely explored, but no previous protocols have been developed for speed skating, and the peculiarities of the skating posture and technique do not automatically allow the transfer of those results to kinematic skating data. The aim of this paper is to determine the best procedure for body pose reconstruction and inverse dynamics in speed skating, and to what extent this choice influences the estimation of joint power. The results show that an eight-body-segment model together with a global optimization method, with a revolute joint in the knee and in the lumbosacral joint while keeping the other joints spherical, is the most realistic model to use for the inverse kinematics in speed skating. To determine joint power, this method should be combined with a least-squares error method for the inverse dynamics. Reporting on the BPR technique and the inverse dynamics method is crucial to enable comparison between studies. Our data showed an underestimation of up to 74% in mean joint power when no optimization procedure was applied for BPR, and an underestimation of up to 31% in mean joint power when a bottom-up inverse dynamics method was chosen instead of a least-squares error approach. Although these results are aimed at speed skating, reporting on the BPR procedure and the inverse dynamics method, together with setting a gold standard, should be common practice in all human movement research to allow comparison between studies. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. A bottom-up approach for controlled deformation of carbon nanotubes through blistering of supporting substrate surface.

    PubMed

    Prudkovskiy, Vladimir; Iacovella, Fabrice; Katin, Konstantin P; Maslov, Mikhail M; Cherkashin, Nikolay

    2018-06-13

    Tuning the band structure, and in particular gap opening, in 1D and 2D materials through their deformation is a promising approach for their application in modern semiconductor devices. However, there is an essential gap between existing laboratory-scale methods applied for the deformation of low-dimensional materials and the needs of large-scale production. In this work, we propose a novel method which is potentially well suited to high-end technological applications: single-walled carbon nanotubes (SWCNTs), first deposited on the flat surface of a supporting wafer pre-implanted with H+ and He+ ions, are deformed in a controlled and repeatable manner over blisters formed after subsequent thermal annealing. By using resonant Raman spectroscopy, we demonstrate that the SWCNTs, clamped by metallic stripes at their ends, are deformed over the blisters to an average tensile strain of 0.15 ± 0.03%, which is in good agreement with the value calculated from the blister dimensions. The principle of the technique may in future be applied to other 1D and 2D materials. © 2018 IOP Publishing Ltd.

  3. Colitis detection on abdominal CT scans by rich feature hierarchies

    NASA Astrophysics Data System (ADS)

    Liu, Jiamin; Lay, Nathan; Wei, Zhuoshi; Lu, Le; Kim, Lauren; Turkbey, Evrim; Summers, Ronald M.

    2016-03-01

    Colitis is inflammation of the colon due to neutropenia, inflammatory bowel disease (such as Crohn disease), infection and immune compromise. Colitis is often associated with thickening of the colon wall. The wall of a colon afflicted with colitis is much thicker than normal. For example, the mean wall thickness in Crohn disease is 11-13 mm compared to the wall of the normal colon that should measure less than 3 mm. Colitis can be debilitating or life threatening, and early detection is essential to initiate proper treatment. In this work, we apply high-capacity convolutional neural networks (CNNs) to bottom-up region proposals to detect potential colitis on CT scans. Our method first generates around 3000 category-independent region proposals for each slice of the input CT scan using selective search. Then, a fixed-length feature vector is extracted from each region proposal using a CNN. Finally, each region proposal is classified and assigned a confidence score with linear SVMs. We applied the detection method to 260 images from 26 CT scans of patients with colitis for evaluation. The detection system can achieve 0.85 sensitivity at 1 false positive per image.
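
The final stage of the pipeline described in this record, scoring each region proposal with a linear SVM over fixed-length feature vectors, can be sketched as follows. The random 64-dimensional "CNN features", cluster parameters, and training hyperparameters are all assumptions for illustration; in the actual system the vectors come from a trained CNN applied to selective-search proposals:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for CNN feature vectors of region proposals:
# two Gaussian clusters (colitis vs. background), 64 dimensions each.
d, n = 64, 400
X = np.vstack([rng.normal(0.5, 1.0, (n // 2, d)),
               rng.normal(-0.5, 1.0, (n // 2, d))])
y = np.hstack([np.ones(n // 2), -np.ones(n // 2)])

# Linear SVM trained by subgradient descent on the L2-regularized
# hinge loss; the resulting score is the proposal's confidence.
w, b, lam, lr = np.zeros(d), 0.0, 1e-3, 0.05
for _ in range(300):
    margins = y * (X @ w + b)
    active = margins < 1.0            # margin-violating samples
    w -= lr * (lam * w - (y[active][:, None] * X[active]).sum(0) / n)
    b += lr * y[active].sum() / n

scores = X @ w + b                    # confidence score per region proposal
accuracy = ((scores > 0) == (y > 0)).mean()
```

Thresholding `scores` then yields the detections; sweeping the threshold traces out the sensitivity/false-positive trade-off reported in the abstract.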

  4. Top-down and bottom-up neurodynamic evidence in patients with tinnitus.

    PubMed

    Hong, Sung Kwang; Park, Sejik; Ahn, Min-Hee; Min, Byoung-Kyong

    2016-12-01

    Although a peripheral auditory (bottom-up) deficit is an essential prerequisite for the generation of tinnitus, central cognitive (top-down) impairment has also been shown to be an inherent neuropathological mechanism. Using an auditory oddball paradigm (for top-down analyses) and a passive listening paradigm (for bottom-up analyses) while recording electroencephalograms (EEGs), we investigated whether top-down or bottom-up components were more critical in the neuropathology of tinnitus, independent of peripheral hearing loss. We observed significantly reduced P300 amplitudes (reflecting fundamental cognitive processes such as attention) and evoked theta power (reflecting top-down regulation in memory systems) for target stimuli at the tinnitus frequency of patients with tinnitus but without hearing loss. The contingent negative variation (reflecting top-down expectation of a subsequent event prior to stimulation) and N100 (reflecting auditory bottom-up selective attention) were different between the healthy and patient groups. Interestingly, when tinnitus patients were divided into two subgroups based on their P300 amplitudes, their P170 and N200 components, and annoyance and distress indices to their tinnitus sound were different. EEG theta-band power and its Granger causal neurodynamic results consistently support a double dissociation of these two groups in both top-down and bottom-up tasks. Directed cortical connectivity corroborates that the tinnitus network involves the anterior cingulate and the parahippocampal areas, where higher-order top-down control is generated. Together, our observations provide neurophysiological and neurodynamic evidence revealing a differential engagement of top-down impairment along with deficits in bottom-up processing in patients with tinnitus but without hearing loss. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.

  5. Multi-scale modelling of elastic moduli of trabecular bone

    PubMed Central

    Hamed, Elham; Jasiuk, Iwona; Yoo, Andrew; Lee, YikHan; Liszka, Tadeusz

    2012-01-01

    We model trabecular bone as a nanocomposite material with hierarchical structure and predict its elastic properties at different structural scales. The analysis involves a bottom-up multi-scale approach, starting with nanoscale (mineralized collagen fibril) and moving up the scales to sub-microscale (single lamella), microscale (single trabecula) and mesoscale (trabecular bone) levels. Continuum micromechanics methods, composite materials laminate theory and finite-element methods are used in the analysis. Good agreement is found between theoretical and experimental results. PMID:22279160
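
As a minimal illustration of the continuum-micromechanics step at the lowest (fibril) scale, the Voigt and Reuss rules of mixtures bracket the stiffness of a mineral-collagen composite. The moduli and volume fraction below are assumed round numbers for illustration, not the paper's inputs:

```python
# Voigt (iso-strain) and Reuss (iso-stress) bounds on the axial modulus
# of a mineralized collagen fibril; moduli in GPa, values assumed for
# illustration (E_mineral ~ hydroxyapatite, E_collagen ~ wet collagen).
E_mineral, E_collagen = 110.0, 1.5
f = 0.42   # assumed mineral volume fraction

E_voigt = f * E_mineral + (1.0 - f) * E_collagen
E_reuss = 1.0 / (f / E_mineral + (1.0 - f) / E_collagen)
```

Any admissible homogenized modulus at this scale lies between the two bounds; the paper's micromechanics estimates refine this bracket before feeding the next scale up.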

  6. What is Bottom-Up and What is Top-Down in Predictive Coding?

    PubMed Central

    Rauss, Karsten; Pourtois, Gilles

    2013-01-01

    Everyone knows what bottom-up is, and how it is different from top-down. At least one is tempted to think so, given that both terms are ubiquitously used, but only rarely defined in the psychology and neuroscience literature. In this review, we highlight the problems and limitations of our current understanding of bottom-up and top-down processes, and we propose a reformulation of this distinction in terms of predictive coding. PMID:23730295

  7. The Human Face of Health News: A Multi-Method Analysis of Sourcing Practices in Health-Related News in Belgian Magazines.

    PubMed

    De Dobbelaer, Rebeca; Van Leuven, Sarah; Raeymaeckers, Karin

    2018-05-01

    Health journalists are central gatekeepers who select, frame, and communicate health news to a broad audience, but the selection and content of health news are also influenced by the sources journalists rely on (Hinnant, Len-Rios, & Oh, 2012). In this paper, we examine whether the traditional elitist sourcing practices (e.g., research institutions, government) are still important in a digitalized news environment where bottom-up non-elite actors (e.g., patients, civil society organizations) can act as producers (Bruns, 2003). Our main goal, therefore, is to detect whether sourcing practices in health journalism can be linked with strategies of empowerment. We use a multi-method approach combining quantitative and qualitative research methods. First, two content analyses were developed to examine health-related news in Belgian magazines (popular weeklies, health magazines, general interest magazines, and women's magazines). The analyses highlight sourcing practices as visible in the texts and give an overview of the different stakeholders represented as sources. In the first wave, the content analysis covered 1047 health-related news items in 19 different Belgian magazines (March-June 2013). In the second wave, a smaller sample of 202 health-related items in 10 magazines was studied for follow-up purposes (February 2015). Second, to contextualize the findings of the quantitative analysis, we interviewed 16 health journalists and editors-in-chief. The results illustrate that journalists consider patients and blogs relevant sources for health news; nonetheless, elitist sourcing practices still prevail at the cost of bottom-up communication. However, the in-depth interviews demonstrate that journalists increasingly consult patients and civil society actors to give health issues a more "human" face. Importantly, the study reveals that this strategy is applied differently by the various types of magazines. 
While popular weeklies and women's magazines give a voice to ordinary citizens to translate complex issues and connect with their audiences, general interest magazines and health magazines prefer elite sources and use ordinary citizen stories as a way of "window dressing."

  8. A photofunctional bottom-up bis(dipyrrinato)zinc(II) complex nanosheet

    PubMed Central

    Sakamoto, Ryota; Hoshiko, Ken; Liu, Qian; Yagi, Toshiki; Nagayama, Tatsuhiro; Kusaka, Shinpei; Tsuchiya, Mizuho; Kitagawa, Yasutaka; Wong, Wai-Yeung; Nishihara, Hiroshi

    2015-01-01

    Two-dimensional polymeric nanosheets have recently gained much attention, particularly top-down nanosheets such as graphene and metal chalcogenides originating from bulk-layered mother materials. Although molecule-based bottom-up nanosheets manufactured directly from molecular components can exhibit greater structural diversity than top-down nanosheets, the bottom-up nanosheets reported thus far lack useful functionalities. Here we show the design and synthesis of a bottom-up nanosheet featuring a photoactive bis(dipyrrinato)zinc(II) complex motif. A liquid/liquid interfacial synthesis between a three-way dipyrrin ligand and zinc(II) ions results in a multi-layer nanosheet, whereas an air/liquid interfacial reaction produces a single-layer or few-layer nanosheet with domain sizes of >10 μm on one side. The bis(dipyrrinato)zinc(II) metal complex nanosheet is easy to deposit on various substrates using the Langmuir–Schäfer process. The nanosheet deposited on a transparent SnO2 electrode functions as a photoanode in a photoelectric conversion system, and is thus the first photofunctional bottom-up nanosheet. PMID:25831973

  9. Vulnerability Assessment of Water Supply Systems: Status, Gaps and Opportunities

    NASA Astrophysics Data System (ADS)

    Wheater, H. S.

    2015-12-01

    Conventional frameworks for assessing the impacts of climate change on water resource systems use cascades of climate and hydrological models to provide 'top-down' projections of future water availability, but these are subject to high uncertainty and are model and scenario-specific. Hence there has been recent interest in 'bottom-up' frameworks, which aim to evaluate system vulnerability to change in the context of possible future climate and/or hydrological conditions. Such vulnerability assessments are generic, and can be combined with updated information from top-down assessments as they become available. While some vulnerability methods use hydrological models to estimate water availability, fully bottom-up schemes have recently been proposed that directly map system vulnerability as a function of feasible changes in water supply characteristics. These use stochastic algorithms, based on reconstruction or reshuffling methods, by which multiple water supply realizations can be generated under feasible ranges of change in water supply conditions. The paper reports recent successes, and points to areas of future improvement. Advances in stochastic modeling and optimization can address some technical limitations in flow reconstruction, while various data mining and system identification techniques can provide possibilities to better condition realizations for consistency with top-down scenarios. Finally, we show that probabilistic and Bayesian frameworks together can provide a potential basis to combine information obtained from fully bottom-up analyses with projections available from climate and/or hydrological models in a fully integrated risk assessment framework for deep uncertainty.
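
The fully bottom-up idea described in this record, mapping system vulnerability directly as a function of feasible changes in supply characteristics, can be sketched with a simple reshuffling scheme. The baseline flow record, demand level, and grid of feasible changes below are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical baseline: 50 years of annual inflows (hm^3) to a supply system.
baseline = rng.gamma(shape=8.0, scale=50.0, size=50)
demand = 350.0  # fixed annual demand, an assumed figure

def reliability(mean_factor, cv_factor, n_real=200):
    """Fraction of years demand is met, averaged over bootstrap-reshuffled
    realizations rescaled to a feasible (mean, variability) change."""
    mu = baseline.mean()
    rel = 0.0
    for _ in range(n_real):
        sample = rng.choice(baseline, size=baseline.size, replace=True)
        # rescale spread around the mean to impose the variability change,
        # then shift the mean itself
        scenario = (sample - mu) * cv_factor + mu * mean_factor
        rel += (scenario >= demand).mean()
    return rel / n_real

# Map reliability over a grid of feasible changes: the vulnerability surface.
grid = [(m, c) for m in (0.8, 1.0, 1.2) for c in (0.8, 1.0, 1.2)]
surface = {mc: reliability(*mc) for mc in grid}
```

The resulting surface is generic: any top-down climate projection can later be located on it to read off the implied reliability, without rerunning the analysis.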

  10. A reversed-phase capillary ultra-performance liquid chromatography-mass spectrometry (UPLC-MS) method for comprehensive top-down/bottom-up lipid profiling

    PubMed Central

    Gao, Xiaoli; Zhang, Qibin; Meng, Da; Issac, Giorgis; Zhao, Rui; Fillmore, Thomas L.; Chu, Rosey K.; Zhou, Jianying; Tang, Keqi; Hu, Zeping; Moore, Ronald J.; Smith, Richard D.; Katze, Michael G.; Metz, Thomas O.

    2012-01-01

    Lipidomics is a critical part of metabolomics and aims to study all the lipids within a living system. We present here the development and evaluation of a sensitive capillary UPLC-MS method for comprehensive top-down/bottom-up lipid profiling. Three different stationary phases were evaluated in terms of peak capacity, linearity, reproducibility, and limit of quantification (LOQ) using a mixture of lipid standards representative of the lipidome. The relative standard deviations of the retention times and peak abundances of the lipid standards were 0.29% and 7.7%, respectively, when using the optimized method. The linearity was acceptable at >0.99 over 3 orders of magnitude, and the LOQs were sub-fmol. To demonstrate the performance of the method in the analysis of complex samples, we analyzed lipids extracted from a human cell line, rat plasma, and a model human skin tissue, identifying 446, 444, and 370 unique lipids, respectively. Overall, the method provided either higher coverage of the lipidome, greater measurement sensitivity, or both, when compared to other approaches of global, untargeted lipid profiling based on chromatography coupled with MS. PMID:22354571

  11. Capture and detection of T7 bacteriophages on a nanostructured interface.

    PubMed

    Han, Jin-Hee; Wang, Min S; Das, Jayanti; Sudheendra, L; Vonasek, Erica; Nitin, Nitin; Kennedy, Ian M

    2014-04-09

    A highly ordered array of T7 bacteriophages was created by the electrophoretic capture of phages onto a nanostructured array with wells that accommodated the phages. Electrophoresis of bacteriophages was achieved by applying a positive potential on an indium tin oxide electrode at the bottom of the nanowells. Nanoscale arrays of phages with different surface densities were obtained by changing the electric field applied to the bottom of the nanowells. The applied voltage was shown to be the critical factor in generating a well-ordered phage array. The number of wells occupied by a phage, and hence the concentration of phages in a sample solution, could be quantified by using a DNA intercalating dye that rapidly stains the T7 phage. The fluorescence signal was enhanced by the intrinsic photonic effect made available by the geometry of the platform. It was shown that the quantification of phages on the array was 6 orders of magnitude better than could be obtained with a fluorescent plate reader. The device opens up the possibility that phages can be detected directly without enrichment or culturing, and by detecting phages that specifically infect bacteria of interest, rapid pathogen detection becomes possible.
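
One way the occupied-well count can be converted into a loading estimate is via Poisson statistics, under the assumption that phages enter wells independently. This relation is a standard occupancy argument, not a formula stated in the record, and the counts below are hypothetical:

```python
import math

# If phages load the nanowells independently, occupancy is Poisson:
# the observed fraction of occupied (fluorescent) wells f fixes the
# mean loading per well via f = 1 - exp(-lambda).
def mean_loading(occupied_wells, total_wells):
    f = occupied_wells / total_wells
    return -math.log(1.0 - f)

# Hypothetical counts: 1,200 fluorescent wells out of 10,000 imaged.
lam = mean_loading(1200, 10000)   # mean phages per well
```

Multiplying the per-well loading by the ratio of sample volume to well volume then recovers the phage concentration in the sample.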

  12. Capture and Detection of T7 Bacteriophages on a Nanostructured Interface

    PubMed Central

    2015-01-01

    A highly ordered array of T7 bacteriophages was created by the electrophoretic capture of phages onto a nanostructured array with wells that accommodated the phages. Electrophoresis of bacteriophages was achieved by applying a positive potential on an indium tin oxide electrode at the bottom of the nanowells. Nanoscale arrays of phages with different surface densities were obtained by changing the electric field applied to the bottom of the nanowells. The applied voltage was shown to be the critical factor in generating a well-ordered phage array. The number of wells occupied by a phage, and hence the concentration of phages in a sample solution, could be quantified by using a DNA intercalating dye that rapidly stains the T7 phage. The fluorescence signal was enhanced by the intrinsic photonic effect made available by the geometry of the platform. It was shown that the quantification of phages on the array was 6 orders of magnitude better than could be obtained with a fluorescent plate reader. The device opens up the possibility that phages can be detected directly without enrichment or culturing, and by detecting phages that specifically infect bacteria of interest, rapid pathogen detection becomes possible. PMID:24650205

  13. Tsunami simulation method initiated from waveforms observed by ocean bottom pressure sensors for real-time tsunami forecast; Applied for 2011 Tohoku Tsunami

    NASA Astrophysics Data System (ADS)

    Tanioka, Yuichiro

    2017-04-01

    After tsunami disaster due to the 2011 Tohoku-oki great earthquake, improvement of the tsunami forecast has been an urgent issue in Japan. National Institute of Disaster Prevention is installing a cable network system of earthquake and tsunami observation (S-NET) at the ocean bottom along the Japan and Kurile trench. This cable system includes 125 pressure sensors (tsunami meters) which are separated by 30 km. Along the Nankai trough, JAMSTEC already installed and operated the cable network system of seismometers and pressure sensors (DONET and DONET2). Those systems are the most dense observation network systems on top of source areas of great underthrust earthquakes in the world. Real-time tsunami forecast has depended on estimation of earthquake parameters, such as epicenter, depth, and magnitude of earthquakes. Recently, tsunami forecast method has been developed using the estimation of tsunami source from tsunami waveforms observed at the ocean bottom pressure sensors. However, when we have many pressure sensors separated by 30km on top of the source area, we do not need to estimate the tsunami source or earthquake source to compute tsunami. Instead, we can initiate a tsunami simulation from those dense tsunami observed data. Observed tsunami height differences with a time interval at the ocean bottom pressure sensors separated by 30 km were used to estimate tsunami height distribution at a particular time. In our new method, tsunami numerical simulation was initiated from those estimated tsunami height distribution. In this paper, the above method is improved and applied for the tsunami generated by the 2011 Tohoku-oki great earthquake. Tsunami source model of the 2011 Tohoku-oki great earthquake estimated using observed tsunami waveforms, coseimic deformation observed by GPS and ocean bottom sensors by Gusman et al. (2012) is used in this study. 
The ocean surface deformation is computed from the source model and used as the initial condition of a tsunami simulation. By assuming that this computed tsunami is the real tsunami observed at the ocean-bottom sensors, a new tsunami simulation is carried out using the above method. Stations in the assumed distribution (each separated by 15 arc-minutes, about 30 km) record tsunami waveforms that were actually computed from the source model. Tsunami height distributions are estimated with the above method at 40, 80, and 120 seconds after the origin time of the earthquake. The near-field tsunami inundation forecast method (Gusman et al. 2014) was used to estimate the tsunami inundation along the Sanriku coast. The results show that the observed tsunami inundation is well explained by the estimated inundation, and that the inundation can be estimated within about 10 minutes of the earthquake's origin time. The new method developed in this paper is therefore very effective for real-time tsunami forecasting.
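As an illustrative sketch (not the authors' code), the core idea of interpolating sensor-derived sea-surface heights onto a model grid and then marching a linear long-wave model forward from that state can be shown in a 1-D toy; the sensor line, 30 km spacing, uniform depth, and Gaussian initial wave are all assumptions of the example:

```python
import numpy as np

def initial_condition_from_sensors(sensor_x, sensor_eta, grid_x):
    """Interpolate sea-surface heights derived from ocean-bottom
    pressure readings onto the simulation grid."""
    return np.interp(grid_x, sensor_x, sensor_eta)

def step_linear_shallow_water(eta, u, h, dx, dt, g=9.81):
    """One forward-backward step of the 1-D linear long-wave equations
    d(eta)/dt = -d(h*u)/dx and du/dt = -g * d(eta)/dx."""
    u[1:-1] -= g * dt * (eta[2:] - eta[:-2]) / (2 * dx)   # momentum first
    flux = h * u
    eta[1:-1] -= dt * (flux[2:] - flux[:-2]) / (2 * dx)   # then continuity
    return eta, u

# hypothetical sensor line, one sensor every 30 km
sensor_x = np.arange(0.0, 300e3, 30e3)
sensor_eta = np.exp(-((sensor_x - 150e3) / 40e3) ** 2)   # assumed 1 m peak
grid_x = np.arange(0.0, 300e3, 1e3)
dx = grid_x[1] - grid_x[0]
eta = initial_condition_from_sensors(sensor_x, sensor_eta, grid_x)
u = np.zeros_like(eta)
h, dt = 4000.0, 1.0   # uniform depth (m); CFL = sqrt(g*h)*dt/dx ~ 0.2
for _ in range(60):   # advance one minute from the sensor-derived state
    eta, u = step_linear_shallow_water(eta, u, h, dx, dt)
```

The point of the sketch is only that no earthquake source model appears anywhere: the simulation starts directly from the sensor-derived height field.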

  14. Self-oscillations in field emission nanowire mechanical resonators: a nanometric dc-ac conversion.

    PubMed

    Ayari, Anthony; Vincent, Pascal; Perisanu, Sorin; Choueib, May; Gouttenoire, Vincent; Bechelany, Mikhael; Cornu, David; Purcell, Stephen T

    2007-08-01

    We report the observation of self-oscillations in a bottom-up nanoelectromechanical system (NEMS) during field emission driven by a constant applied voltage. An electromechanical model is explored that explains the phenomenon and that can be directly used to develop integrated devices. In this first study, we have already achieved approximately 50% dc/ac (direct to alternating current) conversion. Electrical self-oscillations in NEMS open up a new path for the development of high-speed, autonomous nanoresonators and signal generators and show that field emission (FE) is a powerful tool for building new nanocomponents.

  15. On the determination of the carbon balance of continents (Vladimir Ivanovich Vernadsky Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Dolman, Albertus J. Han

    2013-04-01

The carbon balance of regions the size of continents can be determined, albeit with significant uncertainty, by combining several bottom-up and top-down methods. The bottom-up methods use eddy covariance techniques, biometric inventory measurements and modeling, while the top-down methods use atmospheric observations and inverse models. There has been considerable progress in the last few years in determining these balances through more or less standard protocols, as highlighted for instance by studies of the REgional Carbon Cycle Assessment and Processes (RECCAP) project of the Global Carbon Project. Important areas where uncertainty creeps in are the scaling of point measurements in the bottom-up methods, the sparseness of the observation network, and the role of model and other errors in the inversion methods. Typically these balances hold for periods of several years; they therefore do not directly resolve the impact of anomalies in weather and climate. The role of management in these balances also differs between continents. In Europe, for instance, management plays a strong role in the carbon balance and may potentially override climatically driven variability, whereas for Russia it is less important. For Russia, in contrast, the role of forest is paramount, but the vulnerability of the Arctic regions and permafrost is a key uncertainty for future behaviour. I hope to show the importance of these different aspects of the terrestrial carbon balance by comparing the two continents, and also to discuss the significant uncertainty we still face in determining the carbon budgets of large areas. I will argue that we need a clearer picture of the role of management in these budgets, and also of their time variability, in order to determine the impact of anomalous weather and their vulnerability in a future climate.
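A minimal sketch of one common way to merge independent bottom-up and top-down flux estimates, inverse-variance weighting, is shown below; the flux values and uncertainties are hypothetical, not values from the lecture:

```python
def combine_estimates(estimates):
    """Inverse-variance weighted mean of independent flux estimates.
    `estimates` is a list of (value, one_sigma) pairs."""
    weights = [1.0 / s ** 2 for _, s in estimates]
    total_w = sum(weights)
    mean = sum(w * v for w, (v, _) in zip(weights, estimates)) / total_w
    sigma = (1.0 / total_w) ** 0.5   # uncertainty of the combined estimate
    return mean, sigma

# hypothetical continental-scale net fluxes in Pg C per year (negative = sink)
bottom_up = (-0.40, 0.25)   # inventories + eddy-covariance upscaling
top_down  = (-0.55, 0.15)   # atmospheric inversion
mean, sigma = combine_estimates([bottom_up, top_down])
```

The combined uncertainty is smaller than either input, which is the formal motivation for using both method families together.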

  16. Optimization of synthesis process of thermally-responsive poly-n-isopropylacrylamide nanoparticles for controlled release of antimicrobial hydrophobic compounds

    NASA Astrophysics Data System (ADS)

    Hill, Laura E.; Gomes, Carmen L.

    2014-12-01

The goal of this study was to develop an effective method to synthesize poly(N-isopropylacrylamide) (PNIPAAM) nanoparticles with entrapped cinnamon bark extract (CBE) to improve its delivery to foodborne pathogens and to control its release with a temperature stimulus. CBE was used as a model for hydrophobic natural antimicrobials. A top-down procedure using crosslinked PNIPAAM was compared with a bottom-up procedure using NIPAAM monomer. Both processes relied on self-assembly of the molecules into micelles around the CBE at 40 °C. Processing conditions were compared, including homogenization time of the polymer, hydration time prior to homogenization, lyophilization, and the effect of particle ultrafiltration. The top-down and bottom-up synthesis methods yielded particles with significantly different characteristics, especially their release profiles and antimicrobial activities. The synthesis method affected particle size, with the bottom-up procedure resulting in smaller (P < 0.05) diameters than the top-down procedure. The controlled-release profile of CBE from the nanoparticles depended on the release-medium temperature: a faster, burst release was observed at 40 °C and a slower, more sustained release at lower temperatures. PNIPAAM particles containing CBE were analyzed for their antimicrobial activity against Salmonella enterica serovar Typhimurium LT2 and Listeria monocytogenes Scott A. The PNIPAAM particles synthesized via the top-down procedure had a much faster release, which led to greater (P < 0.05) antimicrobial activity. Both top-down formulations performed similarly; the nanoparticles prepared with the 7 min homogenization time would therefore be best for this application, as the process time is shorter and little improvement was seen with a slightly longer homogenization.

  17. Structurally Defined 3D Nanographene Assemblies via Bottom-Up Chemical Synthesis for Highly Efficient Lithium Storage

    DOE PAGES

    Yen, Hung-Ju; Tsai, Hsinhan; Zhou, Ming; ...

    2016-10-10

In this paper, functionalized 3D nanographenes with controlled electronic properties have been synthesized through a multistep organic synthesis method and are further used as promising anode materials for lithium-ion batteries, exhibiting a much increased capacity (up to 950 mAh g⁻¹), three times higher than that of the graphite anode (372 mAh g⁻¹).

  18. Force-controlled inorganic crystallization lithography.

    PubMed

    Cheng, Chao-Min; LeDuc, Philip R

    2006-09-20

    Lithography plays a key role in integrated circuits, optics, information technology, biomedical applications, catalysis, and separation technologies. However, inorganic lithography techniques remain of limited utility for applications outside of the typical foci of integrated circuit manufacturing. In this communication, we have developed a novel stamping method that applies pressure on the upper surface of the stamp to regulate the dewetting process of the inorganic buffer and the evaporation rate of the solvent in this buffer between the substrate and the surface of the stamp. We focused on generating inorganic microstructures with specific locations and also on enabling the ability to pattern gradients during the crystallization of the inorganic salts. This approach utilized a combination of lithography with bottom-up growth and assembly of inorganic crystals. This work has potential applications in a variety of fields, including studying inorganic material patterning and small-scale fabrication technology.

  19. 11th Annual CMMI Technology Conference and User Group

    DTIC Science & Technology

    2011-11-17

Examples of triggers may include: – Cost performance – Schedule performance – Results of management reviews – Occurrence of the risk • as a...Analysis (PHA) – Method 3 – Through bottom-up analysis of design data (e.g., flow diagrams, Failure Mode Effects and Criticality Analysis (FMECA...of formal reviews and the setting up of delta or follow-up reviews can be used to give the organization more places to look at the products as they

  20. A New Method for Coronal Magnetic Field Reconstruction

    NASA Astrophysics Data System (ADS)

    Yi, Sibaek; Choe, Gwang-Son; Cho, Kyung-Suk; Kim, Kap-Sung

    2017-08-01

A precise method of coronal magnetic field reconstruction (extrapolation) is an indispensable tool for understanding various solar activities. A variety of reconstruction codes have been developed and are now available to researchers, but each has its own shortcomings. In this paper, a new, efficient method for coronal magnetic field reconstruction is presented. The method imposes only the normal components of the magnetic field and current density at the bottom boundary, to avoid overspecification of the reconstruction problem, and employs vector potentials to guarantee divergence-freeness. In our method, the normal component of current density is imposed not by adjusting the tangential components of A, but by adjusting its normal component. This allows us to avoid a possible numerical instability that intermittently arises in codes using A. In real reconstruction problems, information about the lateral and top boundaries is absent. The arbitrariness of the boundary conditions imposed there, as well as various preprocessing steps, gives rise to a diversity of solutions. We impose a source-surface condition at the top boundary to accommodate the flux imbalance that always shows up in magnetograms. To enhance the convergence rate, we equip our code with a gradient-method-type accelerator. Our code is tested on two analytical force-free solutions. When the solution is given only at the bottom boundary, our result surpasses competitors in most of the figures of merit devised by Schrijver et al. (2006). We have also applied our code to the real active region NOAA 11974, in which two M-class flares and a halo CME took place. The EUV observations show the sudden appearance of an erupting loop before the first flare. Our numerical solutions show that two entwining flux tubes exist before the flare and that their entanglement is released after the CME, with one of them opened up. We suggest that the erupting loop is created by magnetic reconnection between the two entwining flux tubes and later appears in the coronagraph as the major constituent of the observed CME.
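The divergence-free guarantee from vector potentials can be illustrated numerically: with B = curl A computed by central differences, the discrete divergence of B cancels to machine precision, because difference operators along different axes commute. This sketch (Python with NumPy, not the authors' code) uses an arbitrary smooth A chosen for the example:

```python
import numpy as np

def curl(Ax, Ay, Az, d):
    """B = curl A via central differences (np.gradient) on a uniform grid."""
    dAz_dy = np.gradient(Az, d, axis=1)
    dAy_dz = np.gradient(Ay, d, axis=2)
    dAx_dz = np.gradient(Ax, d, axis=2)
    dAz_dx = np.gradient(Az, d, axis=0)
    dAy_dx = np.gradient(Ay, d, axis=0)
    dAx_dy = np.gradient(Ax, d, axis=1)
    return dAz_dy - dAy_dz, dAx_dz - dAz_dx, dAy_dx - dAx_dy

def divergence(Bx, By, Bz, d):
    """div B with the same difference operators used in curl()."""
    return (np.gradient(Bx, d, axis=0) + np.gradient(By, d, axis=1)
            + np.gradient(Bz, d, axis=2))

n, d = 24, 0.1
x, y, z = np.meshgrid(*(np.arange(n) * d,) * 3, indexing="ij")
# an arbitrary smooth vector potential (purely illustrative)
Ax, Ay, Az = np.sin(y) * z, np.cos(z) * x, np.sin(x) * y
Bx, By, Bz = curl(Ax, Ay, Az, d)
divB = divergence(Bx, By, Bz, d)
max_div = float(np.max(np.abs(divB[2:-2, 2:-2, 2:-2])))  # interior points
```

Because the x-, y-, and z-difference operators commute term by term, div(curl A) vanishes identically up to floating-point rounding, regardless of the A chosen.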

  1. Micropatterning stretched and aligned DNA using microfluidics and surface patterning for applications in hybridization-mediated templated assembly of nanostructures

    NASA Astrophysics Data System (ADS)

    Carbeck, Jeffrey; Petit, Cecilia

    2004-03-01

    Current efforts in nanotechnology use one of two basic approaches: top-down fabrication and bottom-up assembly. Top-down strategies use lithography and contact printing to create patterned surfaces and microfluidic channels that, in turn, can corral and organize nanoscale structures. Bottom-up approaches use templates to direct the assembly of atoms, molecules, and nanoparticles through molecular recognition. The goal of this work is to integrate these strategies by first patterning and orienting DNA molecules through top-down tools so that single DNA chains can then serve as templates for the bottom-up construction of hetero-structures composed of proteins and nanoparticles, both metallic and semi-conducting. The first part of this talk focuses on the top-down strategies used to create microscopic patterns of stretched and aligned molecules of DNA. Specifically, it presents a new method in which molecular combing -- a process by which molecules are deposited and stretched onto a surface by the passage of an air-water interface -- is performed in microchannels. This approach demonstrates that the shape and motion of this interface serve as an effective local field directing the chains dynamically as they are stretched onto the surface. The geometry of the microchannel directs the placement of the DNA molecules, while the geometry of the air-water interface directs the local orientation and curvature of the molecules. This ability to control both the placement and orientation of chains has implication for the use of this technique in genetic analysis and in the bottom up approach to nanofabrication.The second half of this talk presents our bottom-up strategy, which allows placement of nanoparticles along individual DNA chains with a theoretical resolution of less than 1 nm. Specifically, we demonstrate the sequence-specific patterning of nanoparticles via the hybridization of functionalized complementary probes to surface-bound chains of double-stranded DNA. 
Using this technique, we demonstrate the ability to assemble metals, semiconductors, and a composite of both on a single molecule.

  2. SO2 Emissions and Lifetimes: Estimates from Inverse Modeling Using In Situ and Global, Space-Based (SCIAMACHY and OMI) Observations

    NASA Technical Reports Server (NTRS)

Lee, Chulkyu; Martin, Randall V.; van Donkelaar, Aaron; Lee, Hanlim; Dickerson, Russell R.; Hains, Jennifer C.; Krotkov, Nickolay; Richter, Andreas; Vinnikov, Konstantine; Schwab, James J.

    2011-01-01

Top-down constraints on global sulfur dioxide (SO2) emissions are inferred through inverse modeling using SO2 column observations from two satellite instruments (SCIAMACHY and OMI). We first evaluated the SO2 column observations against surface SO2 measurements by applying local scaling factors from a global chemical transport model (GEOS-Chem) to the SO2 columns retrieved from the satellite instruments. The resulting annual mean surface SO2 mixing ratios for 2006 exhibit a significant spatial correlation (r = 0.86, slope = 0.91 for SCIAMACHY and r = 0.80, slope = 0.79 for OMI) with coincident in situ measurements from monitoring networks throughout the United States and Canada. We compare the GEOS-Chem simulation of the SO2 lifetime with that inferred from in situ measurements to verify the applicability of GEOS-Chem for inversion of SO2 columns to emissions. The seasonal mean SO2 lifetime calculated with the GEOS-Chem model over the eastern United States is 13 h in summer and 48 h in winter, compared to lifetimes inferred from in situ measurements of 19 +/- 7 h in summer and 58 +/- 20 h in winter. We apply SO2 columns from SCIAMACHY and OMI to derive a top-down anthropogenic SO2 emission inventory over land by using the local GEOS-Chem relationship between SO2 columns and emissions. There is little seasonal variation in the top-down emissions (<15%) over most major industrial regions, providing some confidence in the method. Our global estimate of annual land-surface anthropogenic SO2 emissions (52.4 Tg S/yr from SCIAMACHY and 49.9 Tg S/yr from OMI) closely agrees with the bottom-up emissions (54.6 Tg S/yr) in the GEOS-Chem model and exhibits consistency in global distributions with the bottom-up emissions (r = 0.78 for SCIAMACHY and r = 0.77 for OMI). However, there are significant regional differences.
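A minimal sketch of the local-scaling step described above, in which a priori emissions in each grid cell are rescaled by the ratio of observed to modeled columns, is given below with entirely hypothetical numbers:

```python
import numpy as np

def scale_emissions(e_prior, omega_obs, omega_model):
    """Local scaling inversion: adjust a priori emissions by the ratio of
    observed to modeled SO2 columns, grid cell by grid cell.  Cells with
    no modeled column are left unchanged."""
    ratio = np.where(omega_model > 0, omega_obs / omega_model, 1.0)
    return e_prior * ratio

e_prior = np.array([2.0, 5.0, 0.5])        # hypothetical a priori, Gg S/yr
omega_model = np.array([1.0, 2.5, 0.25])   # modeled columns (DU)
omega_obs = np.array([1.2, 2.0, 0.25])     # retrieved columns (DU)
e_topdown = scale_emissions(e_prior, omega_obs, omega_model)
# cell by cell: 2.0*1.2 = 2.4, 5.0*0.8 = 4.0, 0.5*1.0 = 0.5
```

This captures the idea of a local column-to-emission relationship; the full method uses the GEOS-Chem-derived relationship rather than a raw ratio.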

  3. 46 CFR 171.106 - Wells in double bottoms.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 7 2014-10-01 2014-10-01 false Wells in double bottoms. 171.106 Section 171.106... PERTAINING TO VESSELS CARRYING PASSENGERS Additional Subdivision Requirements § 171.106 Wells in double bottoms. (a) This section applies to each vessel that has a well installed in a double bottom required by...

  4. 46 CFR 171.106 - Wells in double bottoms.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 7 2012-10-01 2012-10-01 false Wells in double bottoms. 171.106 Section 171.106... PERTAINING TO VESSELS CARRYING PASSENGERS Additional Subdivision Requirements § 171.106 Wells in double bottoms. (a) This section applies to each vessel that has a well installed in a double bottom required by...

  5. 46 CFR 171.106 - Wells in double bottoms.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 7 2013-10-01 2013-10-01 false Wells in double bottoms. 171.106 Section 171.106... PERTAINING TO VESSELS CARRYING PASSENGERS Additional Subdivision Requirements § 171.106 Wells in double bottoms. (a) This section applies to each vessel that has a well installed in a double bottom required by...

  6. 46 CFR 171.106 - Wells in double bottoms.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Wells in double bottoms. 171.106 Section 171.106... PERTAINING TO VESSELS CARRYING PASSENGERS Additional Subdivision Requirements § 171.106 Wells in double bottoms. (a) This section applies to each vessel that has a well installed in a double bottom required by...

  7. Sarcoptic mange breaks up bottom-up regulation of body condition in a large herbivore population.

    PubMed

    Carvalho, João; Granados, José E; López-Olvera, Jorge R; Cano-Manuel, Francisco Javier; Pérez, Jesús M; Fandos, Paulino; Soriguer, Ramón C; Velarde, Roser; Fonseca, Carlos; Ráez, Arian; Espinosa, José; Pettorelli, Nathalie; Serrano, Emmanuel

    2015-11-06

Both parasitic load and resource availability can impact individual fitness, yet little is known about the interplay between these parameters in shaping body condition, a key determinant of fitness in wild mammals inhabiting seasonal environments. Using partial least square regressions (PLSR), we explored how temporal variation in climatic conditions, vegetation dynamics and sarcoptic mange (Sarcoptes scabiei) severity impacted body condition of 473 Iberian ibexes (Capra pyrenaica) harvested between 1995 and 2008 in the highly seasonal Alpine ecosystem of Sierra Nevada Natural Space (SNNS), southern Spain. Bottom-up regulation was found to occur only in healthy ibexes; the condition of infected ibexes was independent of primary productivity and snow cover. No link between ibex abundance and ibex body condition could be established when considering only infected individuals. The pernicious effects of mange on Iberian ibexes outweigh the benefits of favorable environmental conditions. Even though an increase in primary production exerts a positive effect on the body condition of healthy ibexes, scabietic individuals do not derive any advantage from increased resource availability. Further applied research, coupled with continuous sanitary surveillance, is needed to address remaining knowledge gaps associated with the transmission dynamics and management of sarcoptic mange in free-living populations.
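For illustration only, a single-component PLS regression (the NIPALS construction, a stripped-down stand-in for the multi-component PLSR used in the study) can be sketched as follows; the predictors and response are synthetic, not the ibex data:

```python
import numpy as np

def pls1_one_component(X, y):
    """Single-component PLS1 regression via the NIPALS construction."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    w = Xc.T @ yc
    w /= np.linalg.norm(w)          # weight vector (first latent direction)
    t = Xc @ w                      # scores
    b = (t @ yc) / (t @ t)          # inner regression coefficient
    coef = w * b                    # coefficients on the original predictors
    intercept = y.mean() - X.mean(axis=0) @ coef
    return coef, intercept

rng = np.random.default_rng(0)
# hypothetical predictors: e.g. NDVI, snow-cover days, mange severity score
X = rng.normal(size=(60, 3))
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + rng.normal(scale=0.1, size=60)
coef, intercept = pls1_one_component(X, y)
pred = X @ coef + intercept
```

PLSR is favored in such studies because the latent-direction construction tolerates correlated environmental predictors better than ordinary least squares.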

  8. Biomimetic proteolipid vesicles for targeting inflamed tissues

    NASA Astrophysics Data System (ADS)

    Molinaro, R.; Corbo, C.; Martinez, J. O.; Taraballi, F.; Evangelopoulos, M.; Minardi, S.; Yazdi, I. K.; Zhao, P.; De Rosa, E.; Sherman, M. B.; de Vita, A.; Toledano Furman, N. E.; Wang, X.; Parodi, A.; Tasciotti, E.

    2016-09-01

A multitude of micro- and nanoparticles have been developed to improve the delivery of systemically administered pharmaceuticals, which are subject to a number of biological barriers that limit their optimal biodistribution. Bioinspired drug-delivery carriers formulated by bottom-up or top-down strategies have emerged as an alternative approach to evade the mononuclear phagocytic system and facilitate transport across the endothelial vessel wall. Here, we describe a method that leverages the advantages of bottom-up and top-down strategies to incorporate proteins derived from the leukocyte plasma membrane into lipid nanoparticles. The resulting proteolipid vesicles (which we refer to as leukosomes) retained the versatility and physicochemical properties typical of liposomal formulations, preferentially targeted inflamed vasculature, enabled the selective and effective delivery of dexamethasone to inflamed tissues, and reduced phlogosis in a localized model of inflammation.

  9. Pressurized Pepsin Digestion in Proteomics: An Automatable Alternative to Trypsin for Integrated Top-down Bottom-up Proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lopez-Ferrer, Daniel; Petritis, Konstantinos; Robinson, Errol W.

    2011-02-01

Integrated top-down bottom-up proteomics combined with online digestion has great potential to improve the characterization of protein isoforms in biological systems and is amenable to high-throughput proteomics experiments. Bottom-up proteomics ultimately provides the peptide sequences derived from tandem MS analyses of peptides after the proteome has been digested. Top-down proteomics, conversely, entails MS analyses of intact proteins for more effective characterization of genetic variations and/or post-translational modifications (PTMs). Herein, we describe recent efforts towards efficient integration of bottom-up and top-down LC-MS based proteomic strategies. Since most proteomic platforms (i.e., LC systems) operate in acidic environments, we exploited the compatibility of pepsin (i.e., the enzyme's natural acidic activity) for the integration of bottom-up and top-down proteomics. Pressure-enhanced pepsin digestions were successfully performed and characterized with several standard proteins in either an offline mode using a Barocycler or an online mode using a modified high-pressure LC system referred to as a fast online digestion system (FOLDS). FOLDS was tested using pepsin and a whole microbial proteome, and the results were compared against traditional trypsin digestions on the same platform. Additionally, FOLDS was integrated with a RePlay configuration to demonstrate an ultra-rapid integrated bottom-up top-down proteomic strategy employing a standard mixture of proteins and a monkeypox virus proteome.

  10. Combined contributions of feedforward and feedback inputs to bottom-up attention

    PubMed Central

    Khorsand, Peyman; Moore, Tirin; Soltani, Alireza

    2015-01-01

In order to deal with the large amount of information carried by visual inputs entering the brain at any given point in time, the brain swiftly uses those same inputs to enhance processing in one part of the visual field at the expense of the others. These processes, collectively called bottom-up attentional selection, are assumed to rely solely on feedforward processing of the external inputs, as implied by the nomenclature. Nevertheless, evidence from recent experimental and modeling studies points to the role of feedback in bottom-up attention. Here, we review behavioral and neural evidence that feedback inputs are important for the formation of signals that could guide attentional selection based on exogenous inputs. Moreover, we review results from a modeling study elucidating mechanisms underlying the emergence of these signals in successive layers of neural populations and how they depend on feedback from higher visual areas. We use these results to interpret and discuss more recent findings that can further unravel the feedforward and feedback neural mechanisms underlying bottom-up attention. We argue that while it is descriptively useful to separate feedforward and feedback processes underlying bottom-up attention, these processes cannot be mechanistically separated into two successive stages, as they occur at almost the same time and affect neural activity within the same brain areas using similar neural mechanisms. Therefore, understanding the interaction and integration of feedforward and feedback inputs is crucial for a better understanding of bottom-up attention. PMID:25784883

  11. Network Learning for Educational Change. Professional Learning

    ERIC Educational Resources Information Center

    Veugelers, Wiel, Ed.; O'Hair, Mary John, Ed.

    2005-01-01

    School-university networks are becoming an important method to enhance educational renewal and student achievement. Networks go beyond tensions of top-down versus bottom-up, school development and professional development of individuals, theory and practice, and formal and informal organizational structures. The theoretical base of networking…

  12. Publisher Correction: Bottom-up linking of carbon markets under far-sighted cap coordination and reversibility

    NASA Astrophysics Data System (ADS)

    Heitzig, Jobst; Kornek, Ulrike

    2018-06-01

    In the PDF version of this Article originally published, in equation (6) gi' was incorrectly formatted as gi', and at the end of the Methods section wi was incorrectly formatted as wi. These have now been corrected.

  13. Patterning and manipulating microparticles into a three-dimensional matrix using standing surface acoustic waves

    NASA Astrophysics Data System (ADS)

    Nguyen, T. D.; Tran, V. T.; Fu, Y. Q.; Du, H.

    2018-05-01

    A method based on standing surface acoustic waves (SSAWs) is proposed to pattern and manipulate microparticles into a three-dimensional (3D) matrix inside a microchamber. An optical prism is used to observe the 3D alignment and patterning of the microparticles in the vertical and horizontal planes simultaneously. The acoustic radiation force effectively patterns the microparticles into lines of 3D space or crystal-lattice-like matrix patterns. A microparticle can be positioned precisely at a specified vertical location by balancing the forces of acoustic radiation, drag, buoyancy, and gravity acting on the microparticle. Experiments and finite-element numerical simulations both show that the acoustic radiation force increases gradually from the bottom of the chamber to the top, and microparticles can be moved up or down simply by adjusting the applied SSAW power. Our method has great potential for acoustofluidic applications, building the large-scale structures associated with biological objects and artificial neuron networks.
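The force balance described above can be caricatured with a deliberately simplified model: assume the upward acoustic radiation force scales linearly with applied SSAW power and with height, and solve for the height at which it balances the particle's net weight (gravity minus buoyancy; drag vanishes at rest). Everything here, including the force constant k and the particle properties, is an assumption of the sketch, not a value from the paper; the point is only that the trapping height shifts when the power is adjusted, as in the experiments:

```python
import math

def equilibrium_height(power, radius=5e-6, rho_p=1050.0, rho_f=1000.0,
                       g=9.81, k=5.0e-9):
    """Height z* where an assumed acoustic force F_ac = k * power * z
    balances the particle's net weight (gravity minus buoyancy)."""
    volume = 4.0 / 3.0 * math.pi * radius ** 3
    net_weight = (rho_p - rho_f) * volume * g   # downward net force
    return net_weight / (k * power)

z_low_power = equilibrium_height(power=0.5)   # weaker SSAW power
z_high_power = equilibrium_height(power=1.0)  # doubled power
# in this toy force law, doubling the power halves the trapping height
```

A realistic treatment would use the standing-wave radiation-force expression, which varies sinusoidally with position, rather than this linear stand-in.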

  14. Dried Blood Spot Proteomics: Surface Extraction of Endogenous Proteins Coupled with Automated Sample Preparation and Mass Spectrometry Analysis

    NASA Astrophysics Data System (ADS)

    Martin, Nicholas J.; Bunch, Josephine; Cooper, Helen J.

    2013-08-01

Dried blood spots offer many advantages as a sample format, including ease and safety of transport and handling. To date, the majority of mass spectrometry analyses of dried blood spots have focused on small molecules or hemoglobin. However, dried blood spots are a potentially rich source of protein biomarkers, an area that has been overlooked. To address this issue, we have applied an untargeted bottom-up proteomics approach to the analysis of dried blood spots. We present an automated and integrated method for extraction of endogenous proteins from the surface of dried blood spots and sample preparation via trypsin digestion using the Advion Biosciences Triversa Nanomate robotic platform. Liquid chromatography tandem mass spectrometry of the resulting digests enabled identification of 120 proteins from a single dried blood spot. The proteins identified span a concentration range of four orders of magnitude. The method is evaluated and the results discussed in terms of the proteins identified and their potential use as biomarkers in screening programs.
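The trypsin-digestion step of such a bottom-up workflow is easy to sketch in silico using the standard specificity rule (cleave C-terminal to K or R, but not when the next residue is P); the sequence below is an arbitrary example:

```python
def trypsin_digest(sequence, missed_cleavages=0):
    """In-silico tryptic digestion: cleave after K or R, except before P.
    Optionally also emit peptides spanning up to `missed_cleavages` sites."""
    sites = [0] + [i + 1 for i in range(len(sequence) - 1)
                   if sequence[i] in "KR" and sequence[i + 1] != "P"]
    sites.append(len(sequence))
    peptides = []
    for j in range(len(sites) - 1):
        for m in range(missed_cleavages + 1):
            if j + 1 + m < len(sites):
                peptides.append(sequence[sites[j]:sites[j + 1 + m]])
    return peptides

peps = trypsin_digest("MKWVTFISLLFLFSSAYSRGVFRRDTHK")
# → ['MK', 'WVTFISLLFLFSSAYSR', 'GVFR', 'R', 'DTHK']
```

Real search engines additionally filter peptides by length and mass, but the cleavage rule itself is exactly this.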

  15. Assessing the Hydraulic Criticality of Deep Ocean Overflows

    NASA Astrophysics Data System (ADS)

    Pratt, L. J.; Helfrich, K. R.

    2004-12-01

Two methods for assessing the hydraulic criticality of a modelled or observed deep overflow are discussed. The methods should be of use in determining the position of the control section, which is needed to establish the transport relation helpful for long-term monitoring from upstream. Both approaches are based on a multiple-streamtube idealization in which the observed flow at a particular section is divided into subsections (streamtubes). There are no restrictions on the bottom topography or the potential vorticity distribution. The first criterion involves evaluation of a generalized Jacobian condition based on the conservation laws for each streamtube; the second involves direct calculation of the long-wave phase speeds. We also comment on the significance of the local Froude number F of the flow and argue that F must pass through unity across a section of hydraulic control. These criteria are applied to some numerically modelled flows and are used in the companion presentation (Girton et al.) to evaluate the hydraulic criticality of the Faroe Bank Channel.
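For a single reduced-gravity streamtube, the Froude-number check reduces to computing F = u / sqrt(g' h) and asking where it passes through unity; a control section must lie between a subcritical section (F < 1) and a supercritical one (F > 1). A toy calculation with assumed overflow values:

```python
def froude_number(u, h, g_reduced=9.81e-3):
    """Froude number F = u / sqrt(g' h) for a reduced-gravity layer of
    speed u (m/s) and thickness h (m); F = 1 marks hydraulic control."""
    return u / (g_reduced * h) ** 0.5

# hypothetical streamtube values upstream of and at the sill
F_upstream = froude_number(u=0.2, h=200.0)   # expected subcritical
F_sill = froude_number(u=1.8, h=120.0)       # expected supercritical
```

The two criteria in the abstract generalize this single-tube check to many streamtubes over arbitrary topography, where a simple layer Froude number is no longer sufficient.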

  16. Bottom-up and top-down emotion generation: implications for emotion regulation

    PubMed Central

    Misra, Supriya; Prasad, Aditya K.; Pereira, Sean C.; Gross, James J.

    2012-01-01

Emotion regulation plays a crucial role in adaptive functioning and mounting evidence suggests that some emotion regulation strategies are often more effective than others. However, little attention has been paid to the different ways emotions can be generated: from the ‘bottom-up’ (in response to inherently emotional perceptual properties of the stimulus) or ‘top-down’ (in response to cognitive evaluations). Based on a process priming principle, we hypothesized that mode of emotion generation would interact with subsequent emotion regulation. Specifically, we predicted that top-down emotions would be more successfully regulated by a top-down regulation strategy than bottom-up emotions. To test this hypothesis, we induced bottom-up and top-down emotions, and asked participants to decrease the negative impact of these emotions using cognitive reappraisal. We observed the predicted interaction between generation and regulation in two measures of emotional responding. As measured by self-reported affect, cognitive reappraisal was more successful on top-down generated emotions than bottom-up generated emotions. Neurally, reappraisal of bottom-up generated emotions resulted in a paradoxical increase of amygdala activity. This interaction between mode of emotion generation and subsequent regulation should be taken into account when comparing the efficacy of different types of emotion regulation, as well as when reappraisal is used to treat different types of clinical disorders. PMID:21296865

  17. Solid-State Nanopore.

    PubMed

    Yuan, Zhishan; Wang, Chengyong; Yi, Xin; Ni, Zhonghua; Chen, Yunfei; Li, Tie

    2018-02-20

Solid-state nanopores have captured the attention of many researchers due to their nanoscale characteristics. Various fabrication methods have now been reported, which can be summarized into two broad categories: "top-down" etching technologies and "bottom-up" shrinkage technologies. The ion-track etching method, mask etching method, chemical solution etching method, and high-energy-particle etching and shrinkage method are described in this report. We also discuss applications of solid-state nanopore fabrication technology in DNA sequencing, protein detection, and energy conversion.

  18. Solid-State Nanopore

    NASA Astrophysics Data System (ADS)

    Yuan, Zhishan; Wang, Chengyong; Yi, Xin; Ni, Zhonghua; Chen, Yunfei; Li, Tie

    2018-02-01

Solid-state nanopores have captured the attention of many researchers due to their nanoscale characteristics. Various fabrication methods have now been reported, which can be summarized into two broad categories: "top-down" etching technologies and "bottom-up" shrinkage technologies. The ion-track etching method, mask etching method, chemical solution etching method, and high-energy-particle etching and shrinkage method are described in this report. We also discuss applications of solid-state nanopore fabrication technology in DNA sequencing, protein detection, and energy conversion.

  19. De novo protein sequencing by combining top-down and bottom-up tandem mass spectra.

    PubMed

    Liu, Xiaowen; Dekker, Lennard J M; Wu, Si; Vanduijn, Martijn M; Luider, Theo M; Tolić, Nikola; Kou, Qiang; Dvorkin, Mikhail; Alexandrova, Sonya; Vyatkina, Kira; Paša-Tolić, Ljiljana; Pevzner, Pavel A

    2014-07-03

    There are two approaches for de novo protein sequencing: Edman degradation and mass spectrometry (MS). Existing MS-based methods characterize a novel protein by assembling tandem mass spectra of overlapping peptides generated from multiple proteolytic digestions of the protein. Because each tandem mass spectrum covers only a short peptide of the target protein, the key to high-coverage protein sequencing is to find spectral pairs from overlapping peptides in order to assemble tandem mass spectra into longer ones. However, the overlapping regions of peptides may be too short to be confidently identified. High-resolution mass spectrometers have become accessible to many laboratories. These mass spectrometers are capable of analyzing molecules of large mass values, boosting the development of top-down MS. Top-down tandem mass spectra cover whole proteins. However, top-down tandem mass spectra, even combined, rarely provide full ion fragmentation coverage of a protein. We propose an algorithm, TBNovo, for de novo protein sequencing by combining top-down and bottom-up MS. In TBNovo, a top-down tandem mass spectrum is utilized as a scaffold, and bottom-up tandem mass spectra are aligned to the scaffold to increase sequence coverage. Experiments on data sets of two proteins showed that TBNovo achieved high sequence coverage and high sequence accuracy.
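
    The scaffold-coverage idea can be illustrated with a toy sketch (hypothetical sequences, exact string matching only; this is not the TBNovo algorithm itself, which aligns spectra by mass): bottom-up peptide reads are matched against a top-down scaffold sequence and the covered fraction is reported.

```python
def coverage_from_peptides(scaffold: str, peptides: list[str]) -> float:
    """Fraction of scaffold residues covered by at least one peptide,
    using exact substring matching (real tools align with mass tolerances)."""
    covered = [False] * len(scaffold)
    for pep in peptides:
        start = scaffold.find(pep)
        while start != -1:
            for i in range(start, start + len(pep)):
                covered[i] = True
            start = scaffold.find(pep, start + 1)
    return sum(covered) / len(scaffold)

# Hypothetical top-down scaffold and bottom-up peptide reads
scaffold = "MKTAYIAKQRQISFVK"
peptides = ["MKTAYIAK", "YIAKQRQ", "QRQISFVK"]
print(coverage_from_peptides(scaffold, peptides))  # 1.0
```

    Overlapping reads ("YIAKQRQ" bridges the other two) are what allow short peptides to be stitched into full coverage of the scaffold.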

  20. The bottom-up approach to integrative validity: a new perspective for program evaluation.

    PubMed

    Chen, Huey T

    2010-08-01

    The Campbellian validity model and the traditional top-down approach to validity have had a profound influence on research and evaluation. That model includes the concepts of internal and external validity and, within that model, the preeminence of internal validity as demonstrated in the top-down approach. Evaluators and researchers have, however, increasingly recognized that over-emphasis on internal validity reduces an evaluation's usefulness and contributes to the gulf between academic and practical communities regarding interventions. This article examines the limitations of the Campbellian validity model and the top-down approach and provides a comprehensive alternative, known as the integrative validity model for program evaluation. The integrative validity model includes the concept of viable validity, which is predicated on a bottom-up approach to validity. This approach better reflects stakeholders' evaluation views and concerns, makes external validity workable, and is therefore a preferable alternative for the evaluation of health promotion/social betterment programs. The integrative validity model and the bottom-up approach enable evaluators to meet scientific and practical requirements, facilitate the advancement of external validity, and gain a new perspective on methods. The new perspective also furnishes a balanced view of credible evidence and offers an alternative perspective for funding. Copyright (c) 2009 Elsevier Ltd. All rights reserved.

  1. The effects of urbanization on trophic interactions in a desert landscape

    USDA-ARS?s Scientific Manuscript database

    Background/Question/Methods: Trophic systems can be affected through top-down (predators) and bottom-up (resources) impacts. Human activity can alter trophic systems by causing predators to avoid areas (top-down) or by providing increased resources through irrigation and decorative plants that attra...

  2. Bottom-Up Syntheses and Characterization of One Dimensional Nanomaterials

    NASA Astrophysics Data System (ADS)

    Yeh, Yao-Wen

    Nanomaterials, materials having at least one dimension below 100 nm, have been creating exciting opportunities for fundamental quantum confinement studies and applications in electronic devices and energy technologies. One obvious and important aspect of nanomaterials is their production. Although nanostructures can be obtained by top-down reductive e-beam lithography and focused ion beam processes, further development of these processes is needed before these techniques can become practical routes to large scale production. On the other hand, bottom-up syntheses, with advantages in material diversity, throughput, and the potential for large volume production, may provide an alternative strategy for creating nanostructures. In this work, we explore syntheses of one dimensional nanostructures based on hydrothermal and arc discharge methods. The first project presented in this thesis involves syntheses of technologically important nanomaterials and their potential application in energy harvesting. In particular, it was demonstrated that single crystal ferroelectric lead magnesium niobate lead titanate (PMN-PT) nanowires can be synthesized by a hydrothermal route. The chemical composition of the synthesized nanowires is near the rhombohedral-monoclinic boundary of PMN-PT, which leads to a high piezoelectric coefficient of 381 pm/V. Finally, the potential use of PMN-PT nanowires in energy harvesting applications was also demonstrated. The second part of this thesis involves the synthesis of carbon and boron nitride nanotubes by dc arc discharges. In particular, we investigated how local plasma related properties affected the synthesis of carbon nanostructures. Finally, we investigated the anodic nature of the arc and how a dc arc discharge can be applied to synthesize boron nitride nanotubes.

  3. Bioinspired Surface Treatments for Improved Decontamination: Fluoro-Plasma Treatment

    DTIC Science & Technology

    2017-07-21

    methyl salicylate (right) immediately following liquid application (top) and 5 min after liquid application (bottom): painted coupon (A), C2F6, 50 W...applied at 0° after which the supporting platform angle was gradually increased up to 60°. Sliding angles for each of the liquids were identified as the...angle for which movement of the droplet was identified. Shedding angles for each liquid were determined using 12 µL droplets initiated 2.5 cm above

  4. Ecological hierarchies and self-organisation - Pattern analysis, modelling and process integration across scales

    USGS Publications Warehouse

    Reuter, H.; Jopp, F.; Blanco-Moreno, J. M.; Damgaard, C.; Matsinos, Y.; DeAngelis, D.L.

    2010-01-01

    A continuing discussion in applied and theoretical ecology focuses on the relationship of different organisational levels and on how ecological systems interact across scales. We address principal approaches to cope with complex across-level issues in ecology by applying elements of hierarchy theory and the theory of complex adaptive systems. A top-down approach, often characterised by the use of statistical techniques, can be applied to analyse large-scale dynamics and identify constraints exerted on lower levels. Current developments are illustrated with examples from the analysis of within-community spatial patterns and large-scale vegetation patterns. A bottom-up approach allows one to elucidate how interactions of individuals shape dynamics at higher levels in a self-organisation process; e.g., population development and community composition. This may be facilitated by various modelling tools, which provide the distinction between focal levels and resulting properties. For instance, resilience in grassland communities has been analysed with a cellular automaton approach, and the driving forces in rodent population oscillations have been identified with an agent-based model. Both modelling tools illustrate the principles of analysing higher level processes by representing the interactions of basic components. The focus of most ecological investigations on either top-down or bottom-up approaches may not be appropriate, if strong cross-scale relationships predominate. Here, we propose an 'across-scale-approach', closely interweaving the inherent potentials of both approaches. This combination of analytical and synthesising approaches will enable ecologists to establish a more coherent access to cross-level interactions in ecological systems. © 2010 Gesellschaft für Ökologie.
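
    The bottom-up modelling principle described above, in which local interactions generate higher-level dynamics, can be sketched with a minimal cellular automaton (an illustrative toy, not the grassland model cited in the abstract): cells on a grid are colonised from occupied neighbours and die at random, and population-level abundance emerges from these local rules.

```python
import random

rng = random.Random(42)

def step(grid, p_col=0.5, p_die=0.1):
    """One synchronous update of a toroidal presence/absence grid: an empty
    cell is colonised with probability 1 - (1 - p_col)**k given k occupied
    von Neumann neighbours; an occupied cell dies with probability p_die."""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            if grid[i][j]:
                if rng.random() < p_die:
                    new[i][j] = 0
            else:
                k = sum(grid[(i + di) % n][(j + dj) % n]
                        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
                if rng.random() < 1 - (1 - p_col) ** k:
                    new[i][j] = 1
    return new

grid = [[0] * 9 for _ in range(9)]
grid[4][4] = 1                          # a single colonist in the centre
for _ in range(20):
    grid = step(grid)
print(sum(map(sum, grid)))              # regional abundance after 20 steps
```

    The focal level here is the individual cell; abundance and spatial pattern are the resulting higher-level properties, exactly the focal-level/resulting-property distinction the abstract describes.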

  5. Results of Radiocarbon Dating of Holocene Deposits from the Sea of Azov

    NASA Astrophysics Data System (ADS)

    Matishov, G. G.; Kovaleva, G. V.; Arslanov, Kh. A.; Dyuzhova, K. V.; Polshin, V. V.; Zolotareva, A. E.

    2018-04-01

    New data on the absolute age of Quaternary bottom deposits from the Sea of Azov, based on the results of radiocarbon (14C) analysis, are presented. Overall, 67 radiocarbon dates of bottom deposits of the New and Ancient Azov Ages were obtained. The thickness of sediments of the New Azov Age and their distribution over different areas of the Sea of Azov were determined during the study; the results obtained were compared with the available reference data. An integrated approach to the study of the deposits, based on the combination of biostratigraphy methods and the results of absolute age dating, was applied.

  6. Top-down Estimates of Greenhouse Gas Intensities and Emissions for Individual Oil Sands Facilities in Alberta Canada

    NASA Astrophysics Data System (ADS)

    Liggio, J.; Li, S. M.; Staebler, R. M.; Hayden, K. L.; Mittermeier, R. L.; McLaren, R.; Baray, S.; Darlington, A.; Worthy, D.; O'Brien, J.

    2017-12-01

    The oil sands (OS) region of Alberta contributes approximately 10% to Canada's overall anthropogenic greenhouse gas (GHG) emissions. Such emissions have traditionally been estimated through "bottom-up" methods, which seek to account for all individual sources of GHGs within a given facility. However, it is recognized that bottom-up approaches for complex industrial facilities can be subject to uncertainties associated with incomplete or inaccurate emission factor and/or activity data. In order to quantify air pollutant emissions from oil sands activities, an aircraft-based measurement campaign was performed in the summer of 2013. The aircraft measurements could also be used to quantify GHG emissions for comparison to the bottom-up emissions estimates. Utilizing specific flight patterns, together with an emissions estimation algorithm and measurements of CO2 and methane, a "top-down" estimate of GHG intensities for several large surface mining operations was obtained. The results demonstrate that there is a wide variation in emission intensities (≈80 - 220 kg CO2/barrel oil) across OS facilities, which in some cases agree with calculated intensities, and in other cases are larger than those estimated using industry-reported GHG emission and oil production data. When translated to annual GHG emissions, the "top-down" approach results in a CO2 emission of approximately 41 megatonnes (Mt) CO2/year for the 4 OS facilities investigated, in contrast to the ≈26 Mt CO2/year reported by industry. The results presented here highlight the importance of using "top-down" approaches as a complementary method in evaluating GHG emissions from large industrial sources.
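
    The translation from a measured emission intensity to annual emissions is straightforward arithmetic; a minimal sketch with round, purely illustrative production figures (not the facility data from the study):

```python
def annual_emissions_mt(intensity_kg_per_bbl: float, barrels_per_day: float) -> float:
    """Annual CO2 emissions in megatonnes, from an emission intensity
    (kg CO2 per barrel) and an oil production rate (barrels per day)."""
    kg_per_year = intensity_kg_per_bbl * barrels_per_day * 365
    return kg_per_year / 1e9   # kg -> megatonnes (1 Mt = 1e9 kg)

# Illustrative only: a facility producing 300,000 bbl/day evaluated at the
# low and high ends of the measured intensity range (~80-220 kg CO2/barrel)
print(annual_emissions_mt(80, 300_000))   # 8.76 Mt/yr
print(annual_emissions_mt(220, 300_000))  # 24.09 Mt/yr
```

    The spread shows why facility-level intensity differences matter: the same production rate maps to nearly a threefold range in annual emissions.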

  7. Substructure hybrid testing of reinforced concrete shear wall structure using a domain overlapping technique

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; Pan, Peng; Gong, Runhua; Wang, Tao; Xue, Weichen

    2017-10-01

    An online hybrid test was carried out on a 40-story 120-m high concrete shear wall structure. The structure was divided into two substructures whereby a physical model of the bottom three stories was tested in the laboratory and the upper 37 stories were simulated numerically using ABAQUS. An overlapping domain method was employed for the bottom three stories to ensure the validity of the boundary conditions of the superstructure. Mixed control was adopted in the test. Displacement control was used to apply the horizontal displacement, while two controlled force actuators were applied to simulate the overturning moment, which is very large and cannot be ignored in the substructure hybrid test of high-rise buildings. A series of tests with earthquake sources of sequentially increasing intensities were carried out. The test results indicate that the proposed hybrid test method is a solution to reproduce the seismic response of high-rise concrete shear wall buildings. The seismic performance of the tested precast high-rise building satisfies the requirements of the Chinese seismic design code.

  8. Variation and Defect Tolerance for Nano Crossbars

    NASA Astrophysics Data System (ADS)

    Tunc, Cihan

    With the extreme shrinking of CMOS technology, quantum effects and manufacturing issues are becoming more critical. Hence, further shrinking of the CMOS feature size is becoming more challenging, difficult, and costly. On the other hand, emerging nanotechnology has attracted many researchers, since additional scaling down has been demonstrated by manufacturing nanowires, carbon nanotubes, and molecular switches using bottom-up manufacturing techniques. In addition to the progress in manufacturing, developments in architecture show that emerging nanoelectronic devices will be promising for future system designs. Using nano crossbars, which are composed of two sets of perpendicular nanowires with programmable intersections, it is possible to implement logic functions. In addition, nano crossbars offer some important features: regularity, reprogrammability, and interchangeability. Combining these features, researchers have presented different effective architectures. Although bottom-up nanofabrication can greatly reduce manufacturing costs, its low controllability raises some critical issues. The bottom-up nanofabrication process results in high variation compared to the conventional top-down lithography used in CMOS technology. In addition, an increased failure rate is expected. Variation and defect tolerance methods used for conventional CMOS technology seem inadequate for emerging nanotechnology, because the variation and defect rates are much higher than in current CMOS technology. Therefore, variation and defect tolerance methods for emerging nanotechnology are necessary for a successful transition. In this work, in order to tolerate variations in crossbars, we introduce a framework based on the reprogrammability and interchangeability features of nano crossbars. This framework is shown to be applicable to both FET-based and diode-based nano crossbars. We present a characterization testing method which requires a minimal number of test vectors. We formulate the variation optimization problem using simulated annealing with different optimization goals. Furthermore, we extend the framework for defect tolerance. Experimental results and comparison of the proposed framework with exhaustive methods confirm its effectiveness for both variation and defect tolerance.
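
    The variation-optimization step can be sketched as generic simulated annealing over row permutations, exploiting the interchangeability of crossbar rows; the cost model below (sum of per-junction variation over used junctions) is a hypothetical stand-in, not the framework from the thesis.

```python
import math
import random

def anneal(cost, perm, steps=5000, t0=1.0, seed=1):
    """Generic simulated annealing over permutations: propose a swap of two
    rows, accept worse moves with probability exp(-delta/T) as T cools."""
    rng = random.Random(seed)
    best = cur = list(perm)
    best_c = cur_c = cost(cur)
    for k in range(steps):
        t = t0 * (1 - k / steps) + 1e-9
        i, j = rng.sample(range(len(cur)), 2)
        cand = list(cur)
        cand[i], cand[j] = cand[j], cand[i]
        c = cost(cand)
        if c < cur_c or rng.random() < math.exp((cur_c - c) / t):
            cur, cur_c = cand, c
            if c < best_c:
                best, best_c = cand, c
    return best, best_c

# Toy instance: variation[r][c] is the measured variation of each physical
# junction; the logic function uses junction (row r, column c) only where
# use[r][c] == 1.  A permutation maps logical rows onto physical rows.
rng = random.Random(0)
n = 6
variation = [[rng.random() for _ in range(n)] for _ in range(n)]
use = [[rng.randint(0, 1) for _ in range(n)] for _ in range(n)]

def cost(p):
    return sum(variation[p[r]][c] for r in range(n) for c in range(n) if use[r][c])

best, best_c = anneal(cost, list(range(n)))
print(best_c <= cost(list(range(n))))  # True: never worse than the identity mapping
```

    Because the search starts from the identity mapping and only records improvements as `best`, the annealed assignment can only match or reduce the initial cost.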

  9. Heated fiber optic distributed temperature sensing: a tool for measuring soil water content

    NASA Astrophysics Data System (ADS)

    Rodriguez-Sinobas, Leonor; Zubelzu, Sergio; Sánchez-Calvo, Raúl; Horcajo, Daniel

    2016-04-01

    The use of the Distributed Fiber Optic Temperature (DFOT) measurement method for estimating temperature variation along a fiber optic cable has been assessed in multiple environmental applications. Recently, the application of DFOT combined with an active heating pulse technique has been reported as a sensor to estimate soil moisture. This method applies a known amount of heat to the soil and monitors the temperature evolution, which depends mainly on the soil moisture content. This study presents the application of the actively heated DFOT method to determine the soil water retention curve under experimental conditions. The experiment was conducted in a rectangular methacrylate box of 2.5 m x 0.25 m x 0.25 m, which was placed inside a larger box (2.8 m x 0.3 m x 0.3 m) of the same material. The inner box was filled with a sandy loam soil collected from the nearest garden and dried at ambient temperature for 30 days. Care was taken to fill the box while maintaining the soil bulk density determined "in-situ". The cable was deployed along the box at 10 cm depth. At the beginning of the experiment, the box was saturated bottom-up, by filling the outer box with water, and then left to dry for two months. Circulation of heated air at the bottom of the box accelerated the drying process. In addition, fast-growing turf was sown to speed the drying. The DTS unit was a SILIXA ULTIMA SR (Silixa Ltd, UK) with spatial and temporal resolutions of 0.29 m and 5 s, respectively. In this study, heat pulses of 7 W/m for 2.5 min were applied uniformly along the fiber optic cable, and the thermal response of an adjacent cable was monitored at different soil water statuses. Then, the heating and drying phase integral (called Tcum) was determined following the approach of Sayde et al. (2010). For each water status, the soil water content was measured by the gravimetric method in several soil samples collected at three box locations, at the same depth as the fiber optic cable, after each heat pulse. Finally, the soil water retention curve was estimated by fitting pairs of Tcum-soil water content values. The results showed the feasibility of heated fiber optics with distributed temperature sensing to estimate soil water content, and suggest its potential for application under field conditions.
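
    The Sayde et al. (2010) approach characterizes the thermal response by the cumulative temperature increase during the heat pulse, Tcum = ∫(T - T0) dt; a minimal sketch with made-up DTS readings, assuming trapezoidal integration:

```python
def tcum(times_s, temps_c, baseline_c):
    """Cumulative temperature increase Tcum = integral of (T - T0) dt,
    approximated with the trapezoidal rule (units: degC * s)."""
    dT = [t - baseline_c for t in temps_c]
    return sum((dT[i] + dT[i + 1]) / 2 * (times_s[i + 1] - times_s[i])
               for i in range(len(times_s) - 1))

# Hypothetical 5 s samples at the start of a heat pulse on a wet vs a dry
# soil: the drier soil conducts heat away more slowly, so the cable heats
# more and Tcum is larger, which is what makes Tcum a moisture proxy.
times = [0, 5, 10, 15, 20]
wet   = [20.0, 20.8, 21.2, 21.4, 21.5]
dry   = [20.0, 22.0, 23.1, 23.7, 24.0]
print(tcum(times, wet, 20.0) < tcum(times, dry, 20.0))  # True
```

    Fitting Tcum against gravimetrically measured water content at each water status then yields the calibration used to build the retention curve.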

  10. Development of a Contactless Technique for Electrodeposition and Porous Silicon Formation

    NASA Astrophysics Data System (ADS)

    Zhao, Mingrui

    One of the key active manufacturing technologies for 3D integration is through-silicon vias (TSVs), which involves etching deep vias in a silicon substrate that are filled with an electrodeposited metal, with subsequent removal of excess metal by chemical mechanical planarization (CMP). Electrodeposition often results in undesired voids in the TSV metal fill as well as a thick overburden layer. These via plating defects can severely degrade interconnect properties and lead to variation in via resistance, electrically open vias, and trapped plating chemicals that present a reliability hazard. Thick overburden layers result in lengthy and expensive CMP processing. We propose a technique that pursues a viable method of depositing a high-quality metal inside vias with true bottom-up filling, using an additive-free deposition solution. The mechanism is based on a novel concept of electrochemical oxidation of backside silicon that releases electrons, and subsequent chemical etching of silicon dioxide to regenerate the surface. Electrons are transported through the bulk silicon to the interface of the via bottom and the deposition solution, where metal ions accept these electrons and electrodeposit, resulting in the bottom-up filling of large-aspect-ratio vias. With the regions outside the vias covered by dielectric, no metal electrodeposition should occur in these regions. Our new bottom-up technique was initially examined and successfully demonstrated on blanket silicon wafers, and shown to supply electrons that provide the bottom-up filling advantage of through-hole plating and the depth tailorability of blind vias. We have also conducted a fundamental study that investigated the effect of various process parameters on the characteristics of deposited Cu and Ni, and established correlations between metal filling properties and various electrochemical and solution variables.
    A copper sulfate solution at a temperature of about 65°C was shown to be suitable for achieving stable and high values of current density, which translated to copper deposition rates of 2.4 μm/min with good deposition uniformity. The importance of backside silicon oxidation and subsequent oxide etching on the kinetics of metal deposition on front-side silicon has also been highlighted. Further, a process model was developed to simulate the through-silicon via copper filling process using conventional and contactless electrodeposition methods with no additives in the electrolyte solution. A series of electrochemical measurements were employed and integrated in the development of the comprehensive process simulator. The experimental data not only provided the necessary parameters for the model but also validated the simulation accuracy. From the simulation results, the "pinch-off" effect was observed for the additive-free conventional deposition process, which causes partial filling and void formation. By contrast, void-free filling with higher deposition rates was achieved by the use of the contactless technique. Moreover, experimental results of contactless electrodeposition on patterned wafers showed fast-rate bottom-up filling (~3.3 μm/min) in vias of 4 μm diameter and 50 μm depth (aspect ratio = 12.5) without void formation and no copper overburden in the regions outside the vias. Efforts were also made to extend the use of the contactless technique to other applications such as the synthesis of porous silicon. We were able to fabricate porous silicon with a morphological gradient using a novel design of the experimental cell. The resulting porous silicon layers show a large distribution in porosity, pore size, and depth along the radius of the samples. The symmetrical arrangements were attributed to current density decreasing radially inward on the silicon surface exposed to a surfactant-containing HF-based etchant solution.
    The formation mechanism, as well as the morphological properties and their dependence on different process parameters, have been investigated in detail. In the presence of surfactants, an increase in the distribution range of porosity, pore diameter, and depth was observed with increasing HF concentration or lower pH of the etchant solution, as the formation of pores was considered to be limited by the etch rates of silicon dioxide. Gradient porous silicon was also found to be successfully formed both at high and low current densities. Interestingly, the morphological gradient did not develop when dimethyl sulfoxide (instead of surfactants) was used in the etchant solution, potentially due to limitations in the availability of oxidizing species at the silicon-etchant solution interface. In the last part of the dissertation, we discuss the gradient bottom-up filling of Cu in porous silicon substrates using the contactless electrochemical method. A radially symmetric current that gradually varied across the radius of the sample area was achieved by utilizing the modified cell design, which resulted in gradient filling in the vias. The effects of different deposition parameters, such as applied current density, copper sulfate concentration, and etching-to-deposition area ratio, have been examined and discussed. (Abstract shortened by ProQuest.)

  11. Mental models for cognitive control

    NASA Astrophysics Data System (ADS)

    Schilling, Malte; Cruse, Holk; Schmitz, Josef

    2007-05-01

    Even so-called "simple" organisms such as insects are able to adapt quickly to changing conditions in their environment. Their behaviour is affected by many external influences, and only its variability and adaptivity permits their survival. An intensively studied example concerns hexapod walking.1,2 Complex walking behaviours in stick insects have been analysed, and the results were used to construct a reactive model that controls walking in a robot. This model is now extended by higher levels of control: as a bottom-up approach, the low-level reactive behaviours are modulated and activated through a medium level. In addition, the system is extended with an upper level for cognitive control of the robot. Cognition - the ability to plan ahead - and cognitive skills involve internal representations of the subject itself and its environment. These representations are used for mental simulations: in difficult situations, for which neither motor primitives nor whole sequences of these exist, available behaviours are varied and applied in the internal model while the body itself is decoupled from the controlling modules. The result of the internal simulation is evaluated. Successful actions are learned and applied to the robot. This constitutes a level for planning. Its elements (movements, behaviours) are embodied in the lower levels, whereby their meaning arises directly from these levels. The motor primitives are situation models represented as neural networks. The focus of this work concerns the general architecture of the framework, the reactive basic layer of the bottom-up architecture, its connection to higher-level functions, and its application to an internal model.

  12. Dissociable Effects of Aging and Mild Cognitive Impairment on Bottom-Up Audiovisual Integration.

    PubMed

    Festa, Elena K; Katz, Andrew P; Ott, Brian R; Tremont, Geoffrey; Heindel, William C

    2017-01-01

    Effective audiovisual sensory integration involves dynamic changes in functional connectivity between superior temporal sulcus and primary sensory areas. This study examined whether disrupted connectivity in early Alzheimer's disease (AD) produces impaired audiovisual integration under conditions requiring greater corticocortical interactions. Audiovisual speech integration was examined in healthy young adult controls (YC), healthy elderly controls (EC), and patients with amnestic mild cognitive impairment (MCI) using McGurk-type stimuli (providing either congruent or incongruent audiovisual speech information) under conditions differing in the strength of bottom-up support and the degree of top-down lexical asymmetry. All groups accurately identified auditory speech under congruent audiovisual conditions, and displayed high levels of visual bias under strong bottom-up incongruent conditions. Under weak bottom-up incongruent conditions, however, EC and amnestic MCI groups displayed opposite patterns of performance, with enhanced visual bias in the EC group and reduced visual bias in the MCI group relative to the YC group. Moreover, there was no overlap between the EC and MCI groups in individual visual bias scores reflecting the change in audiovisual integration from the strong to the weak stimulus conditions. Top-down lexicality influences on visual biasing were observed only in the MCI patients under weaker bottom-up conditions. Results support a deficit in bottom-up audiovisual integration in early AD attributable to disruptions in corticocortical connectivity. Given that this deficit is not simply an exacerbation of changes associated with healthy aging, tests of audiovisual speech integration may serve as sensitive and specific markers of the earliest cognitive change associated with AD.

  13. Comparison between bottom-up and top-down approaches in the estimation of measurement uncertainty.

    PubMed

    Lee, Jun Hyung; Choi, Jee-Hye; Youn, Jae Saeng; Cha, Young Joo; Song, Woonheung; Park, Ae Ja

    2015-06-01

    Measurement uncertainty is a metrological concept to quantify the variability of measurement results. There are two approaches to estimate measurement uncertainty. In this study, we sought to provide practical and detailed examples of the two approaches and compare the bottom-up and top-down approaches to estimating measurement uncertainty. We estimated measurement uncertainty of the concentration of glucose according to CLSI EP29-A guideline. Two different approaches were used. First, we performed a bottom-up approach. We identified the sources of uncertainty and made an uncertainty budget and assessed the measurement functions. We determined the uncertainties of each element and combined them. Second, we performed a top-down approach using internal quality control (IQC) data for 6 months. Then, we estimated and corrected systematic bias using certified reference material of glucose (NIST SRM 965b). The expanded uncertainties at the low glucose concentration (5.57 mmol/L) by the bottom-up approach and top-down approaches were ±0.18 mmol/L and ±0.17 mmol/L, respectively (all k=2). Those at the high glucose concentration (12.77 mmol/L) by the bottom-up and top-down approaches were ±0.34 mmol/L and ±0.36 mmol/L, respectively (all k=2). We presented practical and detailed examples for estimating measurement uncertainty by the two approaches. The uncertainties by the bottom-up approach were quite similar to those by the top-down approach. Thus, we demonstrated that the two approaches were approximately equivalent and interchangeable and concluded that clinical laboratories could determine measurement uncertainty by the simpler top-down approach.
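
    The top-down calculation combines long-term IQC imprecision with the uncertainty of the bias estimated against the certified reference material, u_c = sqrt(u_imp^2 + u_bias^2), expanded as U = k * u_c with k = 2; a sketch with illustrative numbers (not the paper's data):

```python
import math

def expanded_uncertainty(u_imprecision, u_bias, k=2):
    """Combine imprecision and bias uncertainty in quadrature and
    expand with coverage factor k (k=2 gives ~95% coverage)."""
    return k * math.sqrt(u_imprecision**2 + u_bias**2)

# Illustrative top-down inputs for a glucose level near 5.6 mmol/L:
u_iqc  = 0.080   # SD of 6 months of IQC results, mmol/L
u_bias = 0.035   # uncertainty of the bias estimated against a CRM, mmol/L
U = expanded_uncertainty(u_iqc, u_bias)
print(round(U, 3))   # 0.175 mmol/L expanded uncertainty
```

    A result would then be reported as, e.g., 5.57 ± 0.17 mmol/L (k = 2), the form used in the abstract.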

  14. Reconciling Top-Down and Bottom-Up Estimates of Oil and Gas Methane Emissions in the Barnett Shale

    NASA Astrophysics Data System (ADS)

    Hamburg, S.

    2015-12-01

    Top-down approaches that use aircraft, tower, or satellite-based measurements of well-mixed air to quantify regional methane emissions have typically estimated higher emissions from the natural gas supply chain when compared to bottom-up inventories. A coordinated research campaign in October 2013 used simultaneous top-down and bottom-up approaches to quantify total and fossil methane emissions in the Barnett Shale region of Texas. Research teams have published individual results including aircraft mass-balance estimates of regional emissions and a bottom-up, 25-county region spatially-resolved inventory. This work synthesizes data from the campaign to directly compare top-down and bottom-up estimates. A new analytical approach uses statistical estimators to integrate facility emission rate distributions from unbiased and targeted high emission site datasets, which more rigorously incorporates the fat-tail of skewed distributions to estimate regional emissions of well pads, compressor stations, and processing plants. The updated spatially-resolved inventory was used to estimate total and fossil methane emissions from spatial domains that match seven individual aircraft mass balance flights. Source apportionment of top-down emissions between fossil and biogenic methane was corroborated with two independent analyses of methane and ethane ratios. Reconciling top-down and bottom-up estimates of fossil methane emissions leads to more accurate assessment of natural gas supply chain emission rates and the relative contribution of high emission sites. These results increase our confidence in our understanding of the climate impacts of natural gas relative to more carbon-intensive fossil fuels and the potential effectiveness of mitigation strategies.
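
    One way to integrate an unbiased random sample with a targeted high-emission-site survey is a simple stratified estimator; the sketch below uses hypothetical numbers and a deliberately crude cutoff rule, not the statistical estimators developed for the campaign synthesis.

```python
def regional_total(n_sites, random_sample, high_emitters, cutoff):
    """Toy stratified estimator: high emitters (> cutoff, kg/h) are summed
    directly from a targeted survey; the remaining sites are assigned the
    mean of the below-cutoff portion of an unbiased random sample."""
    below = [x for x in random_sample if x <= cutoff]
    mean_below = sum(below) / len(below)
    return sum(high_emitters) + (n_sites - len(high_emitters)) * mean_below

# Hypothetical region: 1000 well pads, a 10-pad random sample (kg/h), and a
# targeted survey that found 3 pads emitting far above a 50 kg/h cutoff
sample = [1.2, 0.8, 2.5, 1.9, 0.5, 3.0, 1.1, 2.2, 0.9, 1.4]
high = [120.0, 95.0, 210.0]
print(regional_total(1000, sample, high, 50.0))  # 1970.35 kg/h
```

    Even in this toy, the three fat-tail sites contribute roughly a fifth of the regional total, which is why skewed distributions must be handled explicitly rather than averaged away.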

  15. Towards a tolerance toolkit: Gene expression signatures enabling the emergence of resistant bacterial strains

    NASA Astrophysics Data System (ADS)

    Erickson, Keesha; Chatterjee, Anushree

    2014-03-01

    Microbial pathogens are able to rapidly acquire tolerance to chemical toxins. Developing next-generation antibiotics that impede the emergence of resistance will help avoid a world-wide health crisis. Conversely, the ability to induce rapid tolerance gains could lead to high-yielding strains for sustainable production of biofuels and commodity chemicals. Achieving these goals requires an understanding of the general mechanisms allowing microbes to become resistant to diverse toxins. We apply top-down and bottom-up methodologies to identify biological network changes leading to adaptation and tolerance. Using a top-down approach, we perform evolution experiments to isolate resistant strains, collect samples for transcriptomic and proteomic analysis, and use the omics data to inform mathematical gene regulatory models. Using a bottom-up approach, we build and test synthetic genetic devices that enable increased or decreased expression of selected genes. Unique patterns in gene expression are identified in cultures actively gaining resistance, especially in pathways known to be involved with stress response, efflux, and mutagenesis. Genes correlated with tolerance could potentially allow the design of resistance-free antibiotics or robust chemical production strains.

  16. Bottom-up construction of artificial molecules for superconducting quantum processors

    NASA Astrophysics Data System (ADS)

    Poletto, Stefano; Rigetti, Chad; Gambetta, Jay M.; Merkel, Seth; Chow, Jerry M.; Corcoles, Antonio D.; Smolin, John A.; Rozen, Jim R.; Keefe, George A.; Rothwell, Mary B.; Ketchen, Mark B.; Steffen, Matthias

    2012-02-01

    Recent experiments on transmon qubits capacitively coupled to superconducting three-dimensional cavities have shown coherence times much longer than those of transmons coupled to more traditional planar resonators. For the implementation of a quantum processor this approach has clear advantages over traditional techniques, but it poses the challenge of scalability. We are currently implementing multi-qubit experiments based on a bottom-up scaling approach. First, transmon qubits are fabricated on individual chips and are independently characterized. Second, an artificial molecule is assembled by selecting a particular set of previously characterized single-transmon chips. We present recent data on a two-qubit artificial molecule constructed in this way. The two qubits are chosen to generate a strong Z-Z interaction by matching the 0-1 transition energy of one qubit with the 1-2 transition of the other. Single-qubit manipulations and state tomography cannot be done with ``traditional'' single-tone microwave pulses; instead, specifically shaped pulses have to be applied simultaneously to both qubits. Coherence times, coupling strength, and optimal pulses for decoupling the two qubits and performing state tomography are presented.
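    The transition-matching condition described in this abstract can be written down with a simple Duffing-oscillator model of the transmon, in which level j sits at j*f01 + j(j-1)/2 * alpha. The frequencies and anharmonicity below are illustrative placeholders, not the device values:

```python
def transition(f01, alpha, j):
    """Frequency (GHz) of the (j-1) -> j transition of a Duffing oscillator
    with fundamental frequency f01 and anharmonicity alpha (alpha < 0)."""
    return f01 + (j - 1) * alpha

# Hypothetical qubit B; qubit A is then chosen on B's 1 -> 2 transition so that
# f01_A == f12_B, the matching condition for a strong Z-Z interaction.
f01_B, alpha_B = 5.20, -0.30          # GHz, invented values
f12_B = transition(f01_B, alpha_B, 2)
f01_A = f12_B

resonant = abs(transition(f01_A, alpha_B, 1) - f12_B) < 1e-12
```

    With alpha_B = -0.30 GHz, qubit A must sit 300 MHz below qubit B's fundamental for the two transitions to line up.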

  17. Optical imaging of the rat brain suggests a previously missing link between top-down and bottom-up nervous system function.

    PubMed

    Greenfield, Susan A; Badin, Antoine-Scott; Ferrati, Giovanni; Devonshire, Ian M

    2017-07-01

    Optical imaging with voltage-sensitive dyes enables the visualization of extensive yet highly transient coalitions of neurons (assemblies) operating throughout the brain on a subsecond time scale. We suggest that operating at the mesoscale level of brain organization, neuronal assemblies may provide a functional link between "bottom-up" cellular mechanisms and "top-down" cognitive ones within anatomically defined regions. We demonstrate in ex vivo rat brain slices how varying spatiotemporal dynamics of assemblies reveal differences not previously appreciated between: different stages of development in cortical versus subcortical brain areas, different sensory modalities (hearing versus vision), different classes of psychoactive drugs (anesthetics versus analgesics), different effects of anesthesia linked to hyperbaric conditions and, in vivo, depths of anesthesia. The strategy of voltage-sensitive dye imaging is therefore as powerful as it is versatile and as such can now be applied to the evaluation of neurochemical signaling systems and the screening of related new drugs, as well as to mathematical modeling and, eventually, even theories of consciousness.

  18. The changing contribution of top-down and bottom-up limitation of mesopredators during 220 years of land use and climate change.

    PubMed

    Pasanen-Mortensen, Marianne; Elmhagen, Bodil; Lindén, Harto; Bergström, Roger; Wallgren, Märtha; van der Velde, Ype; Cousins, Sara A O

    2017-05-01

    Apex predators may buffer bottom-up driven ecosystem change, as top-down suppression may dampen herbivore and mesopredator responses to increased resource availability. However, theory suggests that for this buffering capacity to be realized, the equilibrium abundance of apex predators must increase. This raises the question: will apex predators maintain herbivore/mesopredator limitation, if bottom-up change relaxes resource constraints? Here, we explore changes in mesopredator (red fox Vulpes vulpes) abundance over 220 years in response to eradication and recovery of an apex predator (Eurasian lynx Lynx lynx), and changes in land use and climate which are linked to resource availability. A three-step approach was used. First, recent data from Finland and Sweden were modelled to estimate linear effects of lynx density, land use and winter temperature on fox density. Second, lynx density, land use and winter temperature were estimated in a 22,650 km² focal area in boreal and boreo-nemoral Sweden in the years 1830, 1920, 2010 and 2050. Third, the models and estimates were used to project historic and future fox densities in the focal area. Projected fox density was lowest in 1830 when lynx density was high, winters cold and the proportion of cropland low. Fox density peaked in 1920 due to lynx eradication, a mesopredator release boosted by favourable bottom-up changes: milder winters and cropland expansion. By 2010, lynx recolonization had reduced fox density, but it remained higher than in 1830, partly due to the bottom-up changes. Comparing 1830 to 2010, the contribution of top-down limitation decreased, while environmental enrichment relaxed bottom-up limitation. Future scenarios indicated that by 2050, lynx density would have to increase by 79% to compensate for a projected climate-driven increase in fox density. We highlight that although top-down limitation can in theory buffer bottom-up change, this requires compensatory changes in apex predator abundance. 
Hence apex predator recolonization/recovery to historical levels would not be sufficient to compensate for widespread changes in climate and land use, which have relaxed the resource constraints for many herbivores and mesopredators. Variation in bottom-up conditions may also contribute to context dependence in apex predator effects. © 2017 The Authors. Journal of Animal Ecology © 2017 British Ecological Society.
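    The three-step approach (fit a linear fox-density model, estimate covariates for each time slice, project densities) can be sketched as follows. The training data, coefficients, and scenario values are synthetic stand-ins, not the Finnish/Swedish survey data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the survey data: fox density as a linear function of
# lynx density, cropland share and winter temperature (coefficients invented).
n = 200
lynx = rng.uniform(0.0, 3.0, n)         # lynx per 100 km^2
cropland = rng.uniform(0.0, 0.4, n)     # proportion of cropland
winter_t = rng.uniform(-12.0, 0.0, n)   # mean winter temperature, deg C
fox = 3.0 - 0.6 * lynx + 3.0 * cropland + 0.1 * winter_t + rng.normal(0.0, 0.05, n)

# Step 1: fit the linear model by least squares.
X = np.column_stack([np.ones(n), lynx, cropland, winter_t])
beta, *_ = np.linalg.lstsq(X, fox, rcond=None)

# Steps 2-3: plug in covariate estimates for two time slices and project.
scenario_1830 = np.array([1.0, 2.5, 0.05, -10.0])  # high lynx, cold, little cropland
scenario_1920 = np.array([1.0, 0.0, 0.30, -6.0])   # lynx eradicated, milder, more cropland
fox_1830 = scenario_1830 @ beta
fox_1920 = scenario_1920 @ beta
```

    Under these invented coefficients the 1920 projection exceeds the 1830 one, mirroring the mesopredator release plus bottom-up boost described in the abstract.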

  19. Differentiating Co-Occurring Behavior Problems in Children with ADHD: Patterns of Emotional Reactivity and Executive Functioning

    ERIC Educational Resources Information Center

    Graziano, Paulo A.; McNamara, Joseph P.; Geffken, Gary R.; Reid, Adam M.

    2013-01-01

    Objective: This study examined whether "top-down" and "bottom-up" control processes can differentiate children with ADHD who exhibit co-occurring aggression and/or internalizing symptoms. Method: Participants included 74 children ("M" age = 10.7 years) with a "Diagnostic and Statistical…

  20. Reversible storage of lithium in a rambutan-like tin-carbon electrode.

    PubMed

    Deng, Da; Lee, Jim Yang

    2009-01-01

    Fruity electrodes: A simple bottom-up self-assembly method was used to fabricate rambutan-like tin-carbon (Sn@C) nanoarchitecture (see scheme, green Sn) to improve the reversible storage of lithium in tin. The mechanism of the growth of the pear-like hairs is explored.

  1. Removal of boron from ceramic industry wastewater by adsorption-flocculation mechanism using palm oil mill boiler (POMB) bottom ash and polymer.

    PubMed

    Chong, Mei Fong; Lee, Kah Peng; Chieng, Hui Jiun; Syazwani Binti Ramli, Ili Izyan

    2009-07-01

    Boron is extensively used in the ceramic industry for enhancing the mechanical strength of tiles. The discharge of boron-containing wastewater to the environment causes severe pollution problems. Boron is also dangerous for human consumption and causes reproductive impediments in organisms if the safe intake level is exceeded. Current methods to remove boron include ion exchange, membrane filtration, precipitation-coagulation, and biological and chemical treatment. These methods are costly for removing boron from wastewater and hence infeasible for industrial wastewater treatment. In the present research, an adsorption-flocculation mechanism is proposed for boron removal from ceramic wastewater by using Palm Oil Mill Boiler (POMB) bottom ash and a long-chain polymer, or flocculant. Ceramic wastewater is turbid and milky in color and contains 15 mg/L of boron and 2000 mg/L of suspended solids. The optimum operating conditions for boron adsorption on POMB bottom ash and flocculation using polymer were investigated in the present research. The adsorption isotherm of boron on bottom ash was also investigated to evaluate the adsorption capacity. Adsorption isotherm modeling was conducted based on the Langmuir and Freundlich isotherms. The results show that coarse POMB bottom ash with particle size larger than 2 mm is a suitable adsorbent, with boron removed up to 80% under the optimum conditions (pH=8.0, dosage=40 g bottom ash/300 ml wastewater, residence time=1 h). The results also show that KP 1200 B cationic polymer is effective in flocculating the suspended solids while AP 120 C anionic polymer is effective in flocculating the bottom ash. The combined cationic and anionic polymers are able to clarify the ceramic wastewater under the optimum conditions (dosage of KP 1200 B cationic polymer=100 mg/L, dosage of AP 120 C anionic polymer=50 mg/L, mixing speed=200 rpm). 
Under the optimum operating conditions, the boron and suspended solids concentration of the treated wastewater were reduced to 3 mg/L and 5 mg/L respectively, satisfying the discharge requirement by Malaysia Department of Environment (DOE). The modeling study shows that the adsorption isotherm of boron onto POMB bottom ash conformed to the Freundlich Isotherm. The proposed method is suitable for boron removal in ceramic wastewater especially in regions where POMB bottom ash is abundant.
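    Freundlich isotherm modeling of the kind mentioned above is commonly done via the log-linearization ln qe = ln K + (1/n) ln Ce. A minimal sketch with synthetic equilibrium data (not the measured POMB values) is:

```python
import math

# Synthetic equilibrium data generated from a Freundlich isotherm
# qe = K * Ce**(1/n) so the fit can be verified (K=0.8, n=2.5 assumed);
# in practice Ce (mg/L boron) and qe (mg/g adsorbed) would be measured.
K_TRUE, N_TRUE = 0.8, 2.5
Ce = [0.5, 1.0, 2.0, 4.0, 8.0, 12.0]
qe = [K_TRUE * c ** (1.0 / N_TRUE) for c in Ce]

# Linearization: ln qe = ln K + (1/n) ln Ce, fitted by ordinary least squares.
x = [math.log(c) for c in Ce]
y = [math.log(q) for q in qe]
mx = sum(x) / len(x)
my = sum(y) / len(y)
slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
intercept = my - slope * mx

K_fit = math.exp(intercept)    # Freundlich capacity constant
n_fit = 1.0 / slope            # Freundlich intensity constant
```

    With noiseless synthetic data the regression recovers K and n exactly; with real measurements the goodness of this log-log fit is what decides between Freundlich and Langmuir, as in the study above.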

  2. Emissions of nitrogen oxides from US urban areas: estimation from Ozone Monitoring Instrument retrievals for 2005-2014

    DOE PAGES

    Lu, Z.; Streets, D. G.; de Foy, B.; ...

    2015-05-28

    Satellite remote sensing of tropospheric nitrogen dioxide (NO2) can provide valuable information for estimating surface nitrogen oxides (NOx) emissions. Using an exponentially-modified Gaussian (EMG) method and taking into account the effect of wind on observed NO2 distributions, we estimate three-year moving-average emissions of summertime NOx from 35 US urban areas directly from NO2 retrievals of the Ozone Monitoring Instrument (OMI) during 2005–2014. Following the conclusions of previous studies that the EMG method provides robust and accurate emission estimates under strong-wind conditions, we derive top-down NOx emissions from each urban area by applying the EMG method to OMI data with wind speeds greater than 3–5 m s⁻¹. Meanwhile, we find that OMI NO2 observations under weak-wind conditions (i.e., < 3 m s⁻¹) are qualitatively better correlated with the surface NOx source strength in comparison to all-wind OMI maps; we therefore use them to calculate the satellite-observed NO2 burdens of urban areas and compare with NOx emission estimates. The EMG results show that OMI-derived NOx emissions are highly correlated (R > 0.93) with weak-wind OMI NO2 burdens as well as bottom-up NOx emission estimates over the 35 urban areas, implying a linear response of the OMI observations to surface emissions under weak-wind conditions. The simultaneous, EMG-obtained, effective NO2 lifetimes (~3.5 ± 1.3 h), however, are biased low in comparison to the summertime NO2 chemical lifetimes. In general, isolated urban areas with NOx emission intensities greater than ~2 Mg h⁻¹ produce statistically significant weak-wind signals in three-year average OMI data. 
    From 2005 to 2014, we estimate that total OMI-derived NOx emissions over all selected US urban areas decreased by 49%, consistent with reductions of 43, 47, 49, and 44% in the total bottom-up NOx emissions, the sum of weak-wind OMI NO2 columns, the total weak-wind OMI NO2 burdens, and the averaged NO2 concentrations, respectively, reflecting the success of NOx control programs for both mobile sources and power plants. The decrease rates of these NOx-related quantities are found to be faster (-6.8 to -9.3% yr⁻¹) before 2010 and slower (-3.4 to -4.9% yr⁻¹) after 2010. For individual urban areas, we calculate the R values of pair-wise trends among the OMI-derived and bottom-up NOx emissions, the weak-wind OMI NO2 burdens, and ground-based NO2 measurements; high correlations are found for all urban areas (median R = 0.8), particularly large ones (R up to 0.97). The results of the current work indicate that, using the EMG method and considering the wind effect, the OMI data allow for the estimation of NOx emissions from urban areas and the direct constraint of emission trends with reasonable accuracy.
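    A minimal sketch of the EMG fit underlying such estimates, in the Beirle-type parameterization where the fitted e-folding distance x0 and wind speed u give the effective lifetime tau = x0/u and hence the emission rate E = W/tau. The line density here is synthetic and noiseless, and all parameter values are illustrative:

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.special import erfc

def emg(x, W, x0, mu, sigma):
    """Exponentially modified Gaussian line density (Beirle-type form).

    W     - total plume mass (arbitrary units)
    x0    - e-folding distance downwind (km)
    mu    - source position (km); sigma - Gaussian smoothing width (km)
    """
    arg = (sigma / x0 - (x - mu) / sigma) / np.sqrt(2.0)
    return (W / (2.0 * x0)) * np.exp(sigma**2 / (2.0 * x0**2) - (x - mu) / x0) * erfc(arg)

# Synthetic "observed" line density with known parameters (no noise).
x = np.linspace(-50.0, 150.0, 400)      # km along the mean wind direction
y = emg(x, 120.0, 40.0, 0.0, 12.0)

popt, _ = curve_fit(emg, x, y, p0=[100.0, 30.0, 5.0, 10.0])
W_fit, x0_fit = popt[0], popt[1]

u = 5.0                                 # m/s, assumed wind speed
tau_h = (x0_fit * 1e3 / u) / 3600.0     # effective lifetime, hours
emission_rate = W_fit / tau_h           # mass per hour
```

    In real applications the line density comes from rotating and integrating the NO2 columns across the wind direction before fitting, and the short fitted lifetimes noted in the abstract propagate directly into the inferred emissions.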

  4. Two-dimensional hexagonally oriented CdCl2.H2O nanorod assembly: formation and replication.

    PubMed

    Deng, Zhaoxiang; Mao, Chengde

    2004-09-14

    This paper reports a simple bottom-up method that can controllably fabricate 2D hexagonally oriented and randomly distributed CdCl2·H2O nanorods on mica surfaces. The as-formed nanorod assemblies have been successfully replicated into various matrixes, including gold, poly(dimethylsiloxane), and polyurethane. This method is thus compatible with soft lithography for further applications.

  5. Bottom-up meets top-down: tailored raspberry-like Fe3O4–Pt nanocrystal superlattices

    DOE PAGES

    Qiu, Fen; Vervuurt, René H. J.; Verheijen, Marcel A.; ...

    2018-01-01

    Bottom-up colloidal synthesis is combined with top-down atomic layer deposition to achieve raspberry-like Pt-decorated Fe3O4 nanoparticle superlattices with good metal–oxide–metal contact for photoelectrocatalysis.

  7. Heat-resistant DNA tile arrays constructed by template-directed photoligation through 5-carboxyvinyl-2′-deoxyuridine

    PubMed Central

    Tagawa, Miho; Shohda, Koh-ichiroh; Fujimoto, Kenzo; Sugawara, Tadashi; Suyama, Akira

    2007-01-01

    Template-directed DNA photoligation has been applied to construct heat-resistant two-dimensional (2D) DNA arrays that can serve as scaffolds in the bottom-up assembly of functional biomolecules and nano-electronic components. DNA double-crossover AB-staggered (DXAB) tiles were covalently connected by enzyme-free template-directed photoligation, which enables a specific ligation reaction in an extremely tight space and under buffer conditions where no enzymes work efficiently. DNA nanostructures created by self-assembly of the DXAB tiles before and after photoligation have been visualized by high-resolution, tapping-mode atomic force microscopy in buffer. The improvement in the heat tolerance of the 2D DNA arrays was confirmed by heating and visualizing the DNA nanostructures. The heat-resistant DNA arrays may expand the potential of DNA as a functional material in biotechnology and nanotechnology. PMID:17982178

  8. Nanoscale device architectures derived from biological assemblies: The case of tobacco mosaic virus and (apo)ferritin

    NASA Astrophysics Data System (ADS)

    Calò, Annalisa; Eiben, Sabine; Okuda, Mitsuhiro; Bittner, Alexander M.

    2016-03-01

    Virus particles and proteins are excellent examples of naturally occurring structures with well-defined nanoscale architectures, for example, cages and tubes. These structures can be employed in a bottom-up assembly strategy to fabricate repetitive patterns of hybrid organic-inorganic materials. In this paper, we review methods of assembly that make use of protein and virus scaffolds to fabricate patterned nanostructures with very high spatial control. We chose (apo)ferritin and tobacco mosaic virus (TMV) as model examples that have already been applied successfully in nanobiotechnology. Their interior space and their exterior surfaces can be mineralized with inorganic layers or nanoparticles. Furthermore, their native assembly abilities can be exploited to generate periodic architectures for integration in electrical and magnetic devices. We introduce the state of the art and describe recent advances in biomineralization techniques, patterning and device production with (apo)ferritin and TMV.

  9. State of the art of nanocrystals technology for delivery of poorly soluble drugs

    NASA Astrophysics Data System (ADS)

    Zhou, Yuqi; Du, Juan; Wang, Lulu; Wang, Yancai

    2016-09-01

    Formulation as nanocrystals is a distinctive approach that can effectively improve the delivery of poorly water-soluble drugs, which has spurred the development of nanocrystals technology. The characteristics of nanocrystals, including enhanced saturation solubility, dissolution velocity, adhesiveness, and affinity, result in exceptional drug delivery performance. Nanocrystals are treated as versatile pharmaceuticals that can be delivered through almost all routes of administration. In the current review, oral, pulmonary, and intravenous routes of administration are presented. The targeting of drug nanocrystals, as well as issues of efficacy and safety, is also discussed. Several methods are applied for nanocrystal production, including top-down production strategies (media milling, high-pressure homogenization), bottom-up production strategies (antisolvent precipitation, supercritical fluid process, and precipitation by removal of solvent), and combination approaches. Moreover, this review also describes the evaluation and characterization of drug nanocrystals and summarizes the current commercial pharmaceutical products utilizing nanocrystals technology.

  10. Controlling Charged Particles with Inhomogeneous Electrostatic Fields

    NASA Technical Reports Server (NTRS)

    Herrero, Federico A. (Inventor)

    2016-01-01

    An energy analyzer for a charged-particle spectrometer may include a top deflection plate and a bottom deflection plate. The top and bottom deflection plates may be non-symmetric and configured to generate an inhomogeneous electrostatic field when a voltage is applied to one of the top or bottom deflection plates. In some instances, the top and bottom deflection plates may be L-shaped deflection plates.

  11. Risk assessment strategies as a tool in the application of the Appropriate Level of Protection (ALOP) and Food Safety Objective (FSO) by risk managers.

    PubMed

    Gkogka, E; Reij, M W; Gorris, L G M; Zwietering, M H

    2013-10-01

    In the course of the last decade, the Appropriate Level of Protection (ALOP), the Food Safety Objective (FSO) and their associated metrics have been proposed by the World Trade Organization and Codex Alimentarius as a means for competent authorities to ultimately translate governmental public health policy regarding food safety into risk-based targets for the food industry. The industry needs to meet these targets through the effective choice of control measures that are part of its operational food safety management system. The aim of this study was to put the practical application of ALOP and FSO to the test in the case of Salmonella in chicken meat in the Netherlands. Two different risk assessment approaches were applied to derive potential ALOP and FSO values, a 'top-down' approach based on epidemiological data and a 'bottom-up' approach based on food supply chain data. To this end, two stochastic models specific to the Dutch situation were built. Comparisons between 23 countries in Europe were also made using the top-down model. The mean estimated current Level of Protection (LOP) values were similar for the two approaches applied, with the bottom-up model yielding 87 cases per 100,000 inhabitants per year (95% CI: 0.03, 904) and the top-down model 71 (95% CI: 9.9, 155). The estimated FSO values, on the other hand, were considerably different, with the mean 'top-down' FSO being -4.6 log CFU/g (95% CI: -5.4, -4.1) and the mean 'bottom-up' FSO -6.0 log CFU/g (95% CI: -8.1, -2.9), reflecting major differences in the output distributions of this parameter obtained with the two approaches. Significant differences were observed between current LOP values for different EU countries, although it was not clear whether this was due to actual differences in the factors influencing the risk of salmonellosis or due to the quality of the available data. Copyright © 2013 Elsevier B.V. All rights reserved.
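    The two estimation routes can be caricatured with a small Monte Carlo sketch: a top-down estimate scales surveillance counts by an uncertain under-reporting factor, while a bottom-up estimate multiplies exposure by contamination prevalence and a dose-response probability. Every input below (population, reported cases, and all distributions) is a placeholder assumption, not a value from the Dutch models:

```python
import math
import random

random.seed(1)

POPULATION = 17_000_000      # hypothetical country population
DRAWS = 10_000

# "Top-down": scale reported salmonellosis cases by an uncertain
# under-reporting factor (true:reported ratio, assumed lognormal).
reported_cases = 1_200
top_down = [
    reported_cases * random.lognormvariate(math.log(10), 0.3) / POPULATION * 100_000
    for _ in range(DRAWS)
]

# "Bottom-up": servings per year x contamination prevalence x dose-response.
SERVINGS = 40 * POPULATION   # assumed chicken servings consumed per year
bottom_up = [
    SERVINGS * random.betavariate(2, 198) * random.betavariate(1, 999)
    / POPULATION * 100_000
    for _ in range(DRAWS)
]

# Mean current level of protection, cases per 100,000 inhabitants per year.
lop_top_down = sum(top_down) / DRAWS
lop_bottom_up = sum(bottom_up) / DRAWS
```

    Comparing the two output distributions, rather than just the means, is what exposes the kind of disagreement the study reports for the FSO.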

  12. What are the fluxes of greenhouse gases from the greater Los Angeles area as inferred from top-down remote sensing studies?

    NASA Astrophysics Data System (ADS)

    Hedelius, J.; Wennberg, P. O.; Wunch, D.; Roehl, C. M.; Podolske, J. R.; Hillyard, P.; Iraci, L. T.

    2017-12-01

    Greenhouse gas (GHG) emissions from California's South Coast Air Basin (SoCAB) have been studied extensively using a variety of tower, aircraft, remote sensing, emission inventory, and modeling studies. It is impractical to survey GHG fluxes from all urban areas and hot-spots to the extent the SoCAB has been studied, but it can serve as a test location for scaling methods globally. We use a combination of remote sensing measurements from ground-based (Total Carbon Column Observing Network, TCCON) and space-based (Orbiting Carbon Observatory-2, OCO-2) sensors in an inversion to obtain the carbon dioxide flux from the SoCAB. We also perform a variety of sensitivity tests to see how the inversion performs under different model parameterizations. Fluxes do not depend significantly on the mixed layer depth, but are sensitive to the model surface layers (<5 m). Carbon dioxide fluxes are larger than those from bottom-up inventories by about 20% and, along with CO, show a significant weekend:weekday effect. Methane fluxes show little weekend variation. Results also include flux estimates from sub-regions of the SoCAB. Larger top-down than bottom-up fluxes highlight the need for additional work on both approaches. Higher top-down fluxes could arise from sampling bias or model bias, or may show that bottom-up values underestimate sources. Lessons learned here may help in scaling up inversions to hundreds of urban systems using space-based observations.

  13. Three-dimensional profile extraction from CD-SEM image and top/bottom CD measurement by line-edge roughness analysis

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Atsuko; Ohashi, Takeyoshi; Kawasaki, Takahiro; Inoue, Osamu; Kawada, Hiroki

    2013-04-01

    A new method for calculating critical dimensions (CDs) at the top and bottom of three-dimensional (3D) pattern profiles from a critical-dimension scanning electron microscope (CD-SEM) image, called the "T-sigma method", is proposed and evaluated. Without preparing a database library in advance, T-sigma can estimate the features of a pattern sidewall. Furthermore, it supplies the optimum edge definition (i.e., the threshold level for determining the edge position from a CD-SEM signal) for detecting the top and bottom of the pattern. The method consists of three steps. First, two components of line-edge roughness (LER), the noise-induced bias (LER bias) and the unbiased component (bias-free LER), are calculated at a set threshold level. Second, these components are calculated for various threshold values, and their threshold dependence, the "T-sigma graph", is obtained. Finally, the optimum threshold values for top and bottom edge detection are given by analysis of the T-sigma graph. T-sigma was applied to CD-SEM images of three kinds of resist-pattern samples. In addition, reference metrology was performed with an atomic force microscope (AFM) and a scanning transmission electron microscope (STEM). The sensitivity of the CD measured by T-sigma to the reference CD was higher than or equal to that measured with the conventional edge definition. Regarding absolute measurement accuracy, T-sigma showed better results than the conventional definition. Furthermore, T-sigma graphs were calculated from CD-SEM images of two kinds of resist samples and compared with the corresponding STEM observations. Both the bias-free LER and the LER bias increased as the detected edge point moved from the bottom to the top of the pattern when the pattern had a straight sidewall and a round top; in contrast, they were almost constant when the pattern had a re-entrant profile. T-sigma can thus reveal a re-entrant feature. 
    From these results, it is found that the T-sigma method can provide rough cross-sectional pattern features and achieve quick, easy and accurate measurements of the top and bottom CDs.
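    The decomposition at the heart of such LER analysis, measured variance = bias-free variance + noise variance, can be sketched on synthetic edge data; the noise-versus-threshold values below are invented for illustration, not derived from a real SEM image:

```python
import random

random.seed(7)

N_POINTS = 5_000
SIGMA_TRUE = 1.5    # bias-free LER (nm), the quantity the analysis tries to recover

def measured_sigma(noise_sigma):
    """Std. dev. of detected edge positions: true roughness plus detection noise."""
    edges = [random.gauss(0.0, SIGMA_TRUE) + random.gauss(0.0, noise_sigma)
             for _ in range(N_POINTS)]
    mean = sum(edges) / N_POINTS
    return (sum((e - mean) ** 2 for e in edges) / N_POINTS) ** 0.5

# Assumed noise level (nm) of the edge detection at each threshold level;
# in the real method this noise-induced bias is estimated from the image itself.
noise_by_threshold = {0.2: 1.2, 0.4: 0.6, 0.6: 0.4, 0.8: 1.0}

# Independent variances add, so the bias-free LER follows by subtraction.
bias_free = {}
for threshold, noise in noise_by_threshold.items():
    sigma_meas = measured_sigma(noise)
    bias_free[threshold] = max(sigma_meas**2 - noise**2, 0.0) ** 0.5
# The recovered bias-free LER stays ~constant (~SIGMA_TRUE) across thresholds,
# while the raw measured sigma is smallest near the optimum threshold.
```

    Sweeping the threshold and plotting both components against it is, in essence, the "T-sigma graph" the abstract describes.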

  14. Modelization of highly nonlinear waves in coastal regions

    NASA Astrophysics Data System (ADS)

    Gouin, Maïté; Ducrozet, Guillaume; Ferrant, Pierre

    2015-04-01

    The proposed work deals with the development of a highly non-linear model for water wave propagation in coastal regions. The accurate modelling of surface gravity waves is of major interest in ocean engineering, especially in the field of marine renewable energy. These marine structures are intended to be installed in coastal regions where the effect of variable bathymetry may be significant on local wave conditions. This study presents a numerical model for wave propagation over complex bathymetry. It is based on the High-Order Spectral (HOS) method, initially limited to the propagation of non-linear wave fields over a flat bottom. Such a model has been developed and validated at the LHEEA Lab. (Ecole Centrale Nantes) over the past few years, and the current developments will enlarge its application range. This new numerical model will keep the attractive numerical properties of the original pseudo-spectral approach (convergence, efficiency with the use of FFTs, …) and enable the propagation of highly non-linear wave fields over long times and large distances. Different validations will be provided in addition to the presentation of the method. First, Bragg reflection will be studied with the proposed approach. If the Bragg condition is satisfied, the reflected wave generated by a sinusoidal bottom patch is amplified as a result of resonant quadratic interactions between the incident wave and the bottom. Comparisons will be provided with experiments and reference solutions. Then, the method will be used to consider the transformation of a non-linear monochromatic wave as it propagates up and over a submerged bar. As the wave travels up the front slope of the bar, it steepens and high harmonics are generated through non-linear interactions. Comparisons with experimental data will be provided. The different test cases will assess the accuracy and efficiency of the proposed method.
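    The Bragg condition mentioned above pairs the surface wavenumber k, obtained from the linear dispersion relation ω² = gk tanh(kh), with a bottom undulation of wavenumber 2k. A small sketch with an assumed wave period and water depth (illustrative values only):

```python
import math

G = 9.81    # gravitational acceleration, m/s^2

def wavenumber(omega, depth):
    """Solve the linear dispersion relation omega^2 = G*k*tanh(k*depth) by bisection."""
    lo, hi = 1e-9, 100.0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if G * mid * math.tanh(mid * depth) < omega ** 2:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Hypothetical incident wave over a sinusoidal bottom patch.
period = 2.5                            # s
depth = 3.0                             # m
k = wavenumber(2.0 * math.pi / period, depth)
wavelength = 2.0 * math.pi / k          # surface wavelength, m

# Bragg resonance: bottom undulations of wavenumber 2k (i.e. wavelength
# half the surface wavelength) resonantly amplify the reflected wave.
bottom_wavelength = wavelength / 2.0
```

    Satisfying (or deliberately detuning) this half-wavelength relation is what a Bragg-reflection validation case checks against experiments and reference solutions.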

  15. Residual interference assessment in adaptive wall wind tunnels

    NASA Technical Reports Server (NTRS)

    Murthy, A. V.

    1989-01-01

    A two-variable method is presented which is suitable for on-line calculation of residual interference in airfoil testing in the Langley 0.3-Meter Transonic Cryogenic Tunnel (0.3-M TCT). The method applies Cauchy's integral formula to the closed contour formed by the contoured top and bottom walls and the upstream and downstream ends. The measured top and bottom wall pressures and positions are used to calculate the correction to the test Mach number and the airfoil angle of attack. Application to specific data obtained in the 0.3-M TCT adaptive wall test section demonstrates the need to assess residual interference to ensure that the desired level of wall streamlining is achieved. A FORTRAN computer program was developed for on-line calculation of the residual corrections during airfoil tests in the 0.3-M TCT.
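    The mathematical core of such a method, Cauchy's integral formula evaluated on a closed contour, can be demonstrated numerically. This toy uses a circular contour and an entire function rather than the tunnel-wall contour and measured pressures, so it only illustrates the principle that boundary data determine interior values:

```python
import cmath
import math

def cauchy_eval(f, z0, radius=1.0, n=256):
    """Approximate f(z0) = (1/(2*pi*i)) * contour integral of f(z)/(z - z0) dz
    on a circle of given radius around z0, using the trapezoidal rule (which is
    spectrally accurate for smooth periodic integrands)."""
    total = 0j
    for j in range(n):
        theta = 2.0 * math.pi * j / n
        z = z0 + radius * cmath.exp(1j * theta)
        dz = 1j * radius * cmath.exp(1j * theta) * (2.0 * math.pi / n)
        total += f(z) / (z - z0) * dz
    return total / (2j * math.pi)

# Sanity check: boundary values of an analytic function recover its interior value.
z0 = 0.3 + 0.2j
approx = cauchy_eval(cmath.exp, z0)
exact = cmath.exp(z0)
```

    In the wall-interference application, the analogous boundary data are the measured wall pressures and positions around the test-section contour.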

  16. Method to implement the CCD timing generator based on FPGA

    NASA Astrophysics Data System (ADS)

    Li, Binhua; Song, Qian; He, Chun; Jin, Jianhui; He, Lin

    2010-07-01

    With the advance of FPGA technology, the design methodology of digital systems is changing. In recent years we have developed a method to implement the CCD timing generator based on FPGA and VHDL. This paper presents the principles and implementation skills of the method. Taking a developed camera as an example, we introduce the structure and the input and output clocks/signals of a timing generator implemented in the camera. The generator is composed of a top module and a bottom module. The bottom one is made up of 4 sub-modules which correspond to 4 different operation modes. The modules are implemented by 5 VHDL programs. Frame charts of the architecture of these programs are shown in the paper. We also describe the implementation steps of the timing generator in Quartus II, and the interconnections between the generator and a Nios soft-core processor which is the controller of this generator. Some test results are presented in the end.
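    The top-module/bottom-module organization can be modeled in software as a table-driven pattern generator, with one sub-module per operation mode. The signal names and clock patterns below are illustrative only, not those of the actual camera or its VHDL sources:

```python
# One "bottom sub-module" per operation mode: each entry is a repeating
# pattern of (phi1, phi2, reset) clock levels, one tuple per clock tick.
MODES = {
    "readout": [(1, 0, 0), (1, 1, 0), (0, 1, 0), (0, 0, 1)],
    "flush":   [(1, 1, 1), (0, 0, 1)],
}

def timing_generator(mode, ticks):
    """Top module: select a mode's pattern and play it back for `ticks` cycles."""
    pattern = MODES[mode]
    for t in range(ticks):
        yield pattern[t % len(pattern)]

waveform = list(timing_generator("readout", 8))
```

    In hardware the same idea becomes a counter addressing a pattern table, with the soft-core processor writing the mode-select register.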

  17. Relative importance of top-down and bottom-up forces in food webs of Sarracenia pitcher communities at a northern and a southern site.

    PubMed

    Hoekman, David

    2011-04-01

    The relative importance of resources (bottom-up forces) and natural enemies (top-down forces) for regulating food web dynamics has been debated, and both forces have been found to be critical for determining food web structure. How the relative importance of top-down and bottom-up forces varies between sites with different abiotic conditions is not well understood. Using the pitcher plant inquiline community as a model system, I examine how the relative importance of top-down and bottom-up effects differs between two disparate sites. Resources (ant carcasses) and top predators (mosquito larvae) were manipulated in two identical 4 × 4 factorial press experiments, conducted at two geographically distant sites (Michigan and Florida) within the range of the purple pitcher plant, Sarracenia purpurea, and the aquatic community that resides in its leaves. Overall, top predators reduced the density of prey populations while additional resources bolstered them, and the relative importance of top-down and bottom-up forces varied between sites and for different trophic levels. Specifically, top-down effects on protozoa were stronger in Florida than in Michigan, while the opposite pattern was found for rotifers. These findings experimentally demonstrate that the strength of predator-prey interactions, even those involving the same species, varies across space. While only two sites are compared in this study, I hypothesize that site differences in temperature, which influences metabolic rate, may be responsible for variation in consumer-resource interactions. These findings warrant further investigation into the specific factors that modify the relative importance of top-down and bottom-up effects.

  18. The fabrication of a double-layer atom chip with through silicon vias for an ultra-high-vacuum cell

    NASA Astrophysics Data System (ADS)

    Chuang, Ho-Chiao; Lin, Yun-Siang; Lin, Yu-Hsin; Huang, Chi-Sheng

    2014-04-01

    This study presents a double-layer atom chip that provides users with increased diversity in the design of the wire patterns and flexibility in the design of the magnetic field, making it more convenient for use in atomic physics experiments. A negative photoresist, SU-8, was used as the insulating layer between the upper and bottom copper wires. Electrical measurements show that upper and bottom wires with a width of 100 µm can sustain a 6 A current without burnout. Another focus of this study is the integration of the double-layer atom chips with the through silicon via (TSV) technique and their anodic bonding to a Pyrex glass cell, which makes a suitable vacuum chamber for atomic physics experiments. The bonded glass cell not only significantly reduces the overall size of the ultra-high-vacuum (UHV) chamber but also conducts the high current from the backside to the front side of the atom chip via the TSVs under UHV (9.5 × 10⁻¹⁰ Torr). The TSVs, with a diameter of 70 µm, were etched through by inductively coupled plasma etching and filled by a bottom-up copper electroplating method. During the anodic bonding process, the electroplated copper wires and TSVs on the atom chips also had to withstand the required bonding temperature of 250 °C under an applied voltage of 1000 V. Finally, the UHV test of the double-layer atom chips with TSVs at room temperature reached 9.5 × 10⁻¹⁰ Torr, satisfying the requirements of atomic physics experiments in a UHV environment.

  19. A three-dimensional numerical simulation of cell behavior in a flow chamber based on fluid-solid interaction.

    PubMed

    Bai, Long; Cui, Yuhong; Zhang, Yixia; Zhao, Na

    2014-01-01

    The mechanical behavior of blood cells in the vessels has a close relationship with the physical characteristics of the blood and the cells. In this paper, a numerical simulation method was proposed to understand the behavior of a single blood cell in the vessels based on a fluid-solid interaction method, conducted under an adaptive time step and a fixed time step, respectively. The main program was written in C++, which called the FLUENT and ANSYS software; a UDF and APDL scripts acted as messengers to exchange data between FLUENT and ANSYS. The computed results show that: (1) the blood cell moved toward the bottom of the flow chamber at the beginning under the influence of gravity, then began to jump up on reaching a certain height rather than touching the bottom; after jumping up it could move downward again, and the blood cell could continue moving in this dancing-like manner in the vessel; (2) the blood cell was rolling and deforming all the time; its rotation oscillated and its deformation became conspicuous while the cell was dancing. This new simulation method and its results can be widely applied in research on cytology, blood, cells, etc.
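    The described coupling loop — two solvers exchanging loads and displacements each step, with an adaptive fall-back on the time step — can be sketched as a partitioned driver. The toy single-degree-of-freedom "fluid" and "solid" below are illustrative stand-ins for FLUENT and ANSYS, not the paper's models.

```python
def fluid_force(x, v):
    # Toy stand-in for the flow solver: drag-like load toward x = 1.
    return 2.0 * (1.0 - x) - 1.0 * v

def solid_update(x, v, f, dt, m=1.0, k=1.0):
    # Toy stand-in for the structural solver (semi-implicit Euler).
    v_new = v + dt * (f - k * x) / m
    return x + dt * v_new, v_new

def fsi_step(x, v, dt, relax=0.5, tol=1e-10, max_it=50):
    """One coupled time step: fixed-point iteration on the interface
    displacement until fluid and solid agree."""
    x_new = x
    for _ in range(max_it):
        f = fluid_force(x_new, v)
        x_trial, v_trial = solid_update(x, v, f, dt)
        if abs(x_trial - x_new) < tol:
            return x_trial, v_trial, True
        x_new = relax * x_trial + (1.0 - relax) * x_new
    return x_new, v_trial, False

def advance(x, v, t_end, dt):
    """Adaptive stepping: halve dt whenever the subiterations fail."""
    t = 0.0
    while t < t_end:
        x_n, v_n, ok = fsi_step(x, v, dt)
        if ok:
            x, v, t = x_n, v_n, t + dt
        else:
            dt *= 0.5
    return x, v
```

    In the real setup, `fluid_force` and `solid_update` would be file-based calls into FLUENT and ANSYS mediated by the UDF/APDL layer.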

  20. Educational Outcomes and Socioeconomic Status: A Decomposition Analysis for Middle-Income Countries

    ERIC Educational Resources Information Center

    Nieto, Sandra; Ramos, Raúl

    2015-01-01

    This article analyzes the factors that explain the gap in educational outcomes between the top and bottom quartile of students in different countries, according to their socioeconomic status. To do so, it uses PISA microdata for 10 middle-income and 2 high-income countries, and applies the Oaxaca-Blinder decomposition method. Its results show that…
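    A minimal two-fold Oaxaca-Blinder decomposition can be sketched as follows. This is a generic illustration with synthetic data, not the PISA variables or weighting used in the article.

```python
import numpy as np

def oaxaca_blinder(X_a, y_a, X_b, y_b):
    """Decompose the mean outcome gap between groups A and B into an
    'explained' part (characteristics) and an 'unexplained' part (returns)."""
    Za = np.column_stack([np.ones(len(X_a)), X_a])  # add intercept
    Zb = np.column_stack([np.ones(len(X_b)), X_b])
    beta_a = np.linalg.lstsq(Za, y_a, rcond=None)[0]
    beta_b = np.linalg.lstsq(Zb, y_b, rcond=None)[0]
    ma, mb = Za.mean(axis=0), Zb.mean(axis=0)
    explained = (ma - mb) @ beta_b        # differences in characteristics
    unexplained = ma @ (beta_a - beta_b)  # differences in returns
    return explained, unexplained
```

    By construction the two components sum to the raw gap in group means, which is the identity the decomposition rests on.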

  1. Engineering neural systems for high-level problem solving.

    PubMed

    Sylvester, Jared; Reggia, James

    2016-07-01

    There is a long-standing, sometimes contentious debate in AI concerning the relative merits of a symbolic, top-down approach vs. a neural, bottom-up approach to engineering intelligent machine behaviors. While neurocomputational methods excel at lower-level cognitive tasks (incremental learning for pattern classification, low-level sensorimotor control, fault tolerance and processing of noisy data, etc.), they are largely non-competitive with top-down symbolic methods for tasks involving high-level cognitive problem solving (goal-directed reasoning, metacognition, planning, etc.). Here we take a step towards addressing this limitation by developing a purely neural framework named galis. Our goal in this work is to integrate top-down (non-symbolic) control of a neural network system with more traditional bottom-up neural computations. galis is based on attractor networks that can be "programmed" with temporal sequences of hand-crafted instructions that control problem solving by gating the activity retention of, communication between, and learning done by other neural networks. We demonstrate the effectiveness of this approach by showing that it can be applied successfully to solve sequential card matching problems, using both human performance and a top-down symbolic algorithm as experimental controls. Solving this kind of problem makes use of top-down attention control and the binding together of visual features in ways that are easy for symbolic AI systems but not for neural networks to achieve. Our model can not only be instructed on how to solve card matching problems successfully, but its performance also qualitatively (and sometimes quantitatively) matches the performance of both human subjects that we had perform the same task and the top-down symbolic algorithm that we used as an experimental control. 
We conclude that the core principles underlying the galis framework provide a promising approach to engineering purely neurocomputational systems for problem-solving tasks that in people require higher-level cognitive functions. Copyright © 2016 Elsevier Ltd. All rights reserved.
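    The gating idea — instruction signals that switch the updating and learning of an attractor network on or off — can be illustrated with a toy Hopfield-style module. The gate names and the network itself are illustrative, not the galis implementation.

```python
import numpy as np

def gated_step(state, weights, gates):
    """'gates' plays the role of a hand-crafted instruction: it decides
    whether the module relaxes toward an attractor and/or learns."""
    new = state.copy()
    if gates.get("update", 0):
        new = np.sign(weights @ state)                     # relaxation step
    if gates.get("learn", 0):
        weights = weights + np.outer(new, new) / len(new)  # Hebbian update
    return new, weights
```

    With the update gate closed the module simply retains its activity; with it open, a noisy cue is pulled toward the stored pattern.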

  2. Migrating a lecture in nursing informatics to a blended learning format--A bottom-up approach to implement an open-source web-based learning management system.

    PubMed

    Schrader, Ulrich

    2006-01-01

    At a university of applied sciences in Germany, a learning management system has been implemented. The migration of classic courses to a web-enhanced curriculum can be categorized into three phases, independent of the technology used. The first two phases, "dedicated website" and "database-supported content management system," are mainly concerned with bringing the learning material and current information online and making them available to the students; the goal here is to make maintenance of the learning material easier. The third phase, characterized by the use of a learning management system, supports more modern didactic principles such as social constructionism and problem-oriented learning. In this paper, the phases as they occurred during the migration of a nursing informatics course are described and experiences are discussed. The absence of institutional goals associated with the use of a learning management system led to a bottom-up approach triggered by faculty activities, which is better described by a promoter model than by a process-management model. The use of an open-source learning management system made this process easier to realize, since no up-front financial commitment was required.

  3. Bottom-up GGM algorithm for constructing multiple layered hierarchical gene regulatory networks

    USDA-ARS?s Scientific Manuscript database

    Multilayered hierarchical gene regulatory networks (ML-hGRNs) are very important for understanding genetics regulation of biological pathways. However, there are currently no computational algorithms available for directly building ML-hGRNs that regulate biological pathways. A bottom-up graphic Gaus...

  4. Voluntary task switching under load: contribution of top-down and bottom-up factors in goal-directed behavior.

    PubMed

    Demanet, Jelle; Verbruggen, Frederick; Liefooghe, Baptist; Vandierendonck, André

    2010-06-01

    The present study investigated the relative contribution of bottom-up and top-down control to task selection in the voluntary task-switching (VTS) procedure. In order to manipulate the efficiency of top-down control, a concurrent working memory load was imposed during VTS. In three experiments, bottom-up factors, such as stimulus repetitions, repetition of irrelevant information, and stimulus-task associations, were introduced in order to investigate their influence on task selection. We observed that the tendency to repeat tasks was stronger under load, suggesting that top-down control counteracts the automatic tendency to repeat tasks. The results also indicated that task selection can be guided by several elements in the environment, but that only the influence of stimulus repetitions depends on the efficiency of top-down control. The theoretical implications of these findings are discussed within the interplay between top-down and bottom-up control that underlies the voluntary selection of tasks.

  5. When Art Moves the Eyes: A Behavioral and Eye-Tracking Study

    PubMed Central

    Massaro, Davide; Savazzi, Federica; Di Dio, Cinzia; Freedberg, David; Gallese, Vittorio; Gilli, Gabriella; Marchetti, Antonella

    2012-01-01

    The aim of this study was to investigate, using eye-tracking technique, the influence of bottom-up and top-down processes on visual behavior while subjects, naïve to art criticism, were presented with representational paintings. Forty-two subjects viewed color and black and white paintings (Color) categorized as dynamic or static (Dynamism) (bottom-up processes). Half of the images represented natural environments and half human subjects (Content); all stimuli were displayed under aesthetic and movement judgment conditions (Task) (top-down processes). Results on gazing behavior showed that content-related top-down processes prevailed over low-level visually-driven bottom-up processes when a human subject is represented in the painting. On the contrary, bottom-up processes, mediated by low-level visual features, particularly affected gazing behavior when looking at nature-content images. We discuss our results proposing a reconsideration of the definition of content-related top-down processes in accordance with the concept of embodied simulation in art perception. PMID:22624007

  6. Self-assembled nanostructured resistive switching memory devices fabricated by templated bottom-up growth

    PubMed Central

    Song, Ji-Min; Lee, Jang-Sik

    2016-01-01

    Metal-oxide-based resistive switching memory devices have been studied intensively due to their potential to satisfy the requirements of next-generation memory devices. Active research has been done on the materials and device structures of resistive switching memory devices that meet the requirements of high density, fast switching speed, and reliable data storage. In this study, resistive switching memory devices were fabricated with nano-template-assisted bottom-up growth. Electrochemical deposition was adopted to achieve the bottom-up growth of nickel nanodot electrodes. A nickel oxide layer was formed by oxygen plasma treatment of the nickel nanodots at low temperature. The structures of the fabricated nanoscale memory devices were analyzed with a scanning electron microscope and an atomic force microscope (AFM). The electrical characteristics of the devices were directly measured using conductive AFM. This work demonstrates the fabrication of resistive switching memory devices using self-assembled nanoscale masks and nanomaterial growth by bottom-up electrochemical deposition. PMID:26739122

  7. When art moves the eyes: a behavioral and eye-tracking study.

    PubMed

    Massaro, Davide; Savazzi, Federica; Di Dio, Cinzia; Freedberg, David; Gallese, Vittorio; Gilli, Gabriella; Marchetti, Antonella

    2012-01-01

    The aim of this study was to investigate, using eye-tracking technique, the influence of bottom-up and top-down processes on visual behavior while subjects, naïve to art criticism, were presented with representational paintings. Forty-two subjects viewed color and black and white paintings (Color) categorized as dynamic or static (Dynamism) (bottom-up processes). Half of the images represented natural environments and half human subjects (Content); all stimuli were displayed under aesthetic and movement judgment conditions (Task) (top-down processes). Results on gazing behavior showed that content-related top-down processes prevailed over low-level visually-driven bottom-up processes when a human subject is represented in the painting. On the contrary, bottom-up processes, mediated by low-level visual features, particularly affected gazing behavior when looking at nature-content images. We discuss our results proposing a reconsideration of the definition of content-related top-down processes in accordance with the concept of embodied simulation in art perception.

  8. Bottom-Up Tri-gate Transistors and Submicrosecond Photodetectors from Guided CdS Nanowalls.

    PubMed

    Xu, Jinyou; Oksenberg, Eitan; Popovitz-Biro, Ronit; Rechav, Katya; Joselevich, Ernesto

    2017-11-08

    Tri-gate transistors offer better performance than planar transistors by exerting additional gate control over a channel from two lateral sides of semiconductor nanowalls (or "fins"). Here we report the bottom-up assembly of aligned CdS nanowalls by a simultaneous combination of horizontal catalytic vapor-liquid-solid growth and vertical facet-selective noncatalytic vapor-solid growth and their parallel integration into tri-gate transistors and photodetectors at wafer scale (cm²) without postgrowth transfer or alignment steps. These tri-gate transistors act as enhancement-mode transistors with an on/off current ratio on the order of 10⁸, 4 orders of magnitude higher than the best results ever reported for planar enhancement-mode CdS transistors. The response time of the photodetector is reduced to the submicrosecond level, 1 order of magnitude shorter than the best results ever reported for photodetectors made of bottom-up semiconductor nanostructures. Guided semiconductor nanowalls open new opportunities for high-performance 3D nanodevices assembled from the bottom up.

  9. Comparing Individual Tree Segmentation Based on High Resolution Multispectral Image and Lidar Data

    NASA Astrophysics Data System (ADS)

    Xiao, P.; Kelly, M.; Guo, Q.

    2014-12-01

    This study compares the use of high-resolution multispectral WorldView images and high-density Lidar data for individual tree segmentation. The application focuses on coniferous and deciduous forests in the Sierra Nevada Mountains. The tree objects are obtained in two ways: a hybrid region-merging segmentation method applied to the multispectral images, and top-down and bottom-up region-growing methods applied to the Lidar data. The hybrid region-merging method is used to segment individual trees from the multispectral images. It integrates the advantages of global-oriented and local-oriented region-merging strategies into a unified framework. The globally most-similar pair of regions is used to determine the starting point of a growing region. The merging iterations are constrained to the local vicinity, so the segmentation is accelerated and can reflect the local context. The top-down region-growing method is adopted in coniferous forest to delineate individual trees from the Lidar data. It exploits the spacing between the tops of trees to identify and group points into a single tree based on simple rules of proximity and likely tree shape. The bottom-up region-growing method, based on the intensity and 3D structure of the Lidar data, is applied in deciduous forest. It segments tree trunks based on the intensity and topological relationships of the points, and then allocates the remaining points to tree crowns according to distance. The accuracies of each method are evaluated with field survey data in several test sites, covering dense and sparse canopy. Three types of segmentation results are produced: a true positive represents a correctly segmented individual tree, a false negative represents a tree that is not detected and whose points are assigned to a nearby tree, and a false positive represents a point or pixel cluster segmented as a tree that does not in fact exist. These respectively represent correct, under-, and over-segmentation.
    Three indices are compared for segmenting individual trees from the multispectral image and Lidar data: recall, precision, and F-score. This work explores the tradeoff between the expensive Lidar data and the inexpensive multispectral imagery. The conclusions will guide optimal data selection for individual tree segmentation in canopy areas of different density, and contribute to the field of forest remote sensing.
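    The three counts map onto the reported indices in the standard way; a straightforward sketch:

```python
def segmentation_scores(tp, fn, fp):
    """tp: correctly segmented trees; fn: missed trees (merged into a
    neighbour); fp: spurious clusters segmented as trees."""
    recall = tp / (tp + fn)        # fraction of reference trees detected
    precision = tp / (tp + fp)     # fraction of detected trees that are real
    f_score = 2 * precision * recall / (precision + recall)
    return recall, precision, f_score
```

    For example, 80 correct trees with 10 misses and 20 spurious detections yield a recall of about 0.89 and a precision of 0.80.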

  10. The study of the stress-strain state of the tank with bottom water drainage during operation

    NASA Astrophysics Data System (ADS)

    Shchipkova, Yu V.; Tokarev, V. V.

    2018-04-01

    Bottom water drainage is a current problem in modern tank operation. This article proposes a bottom drainage system for a tank whose bottom is shaped as a cone sloping toward its centre. Changing the bottom design alters the stress-strain state, which was analyzed in ANSYS. The analysis concluded that the proposed drainage system should be applied.

  11. Top-Down and Bottom-Up Identification of Proteins by Liquid Extraction Surface Analysis Mass Spectrometry of Healthy and Diseased Human Liver Tissue

    NASA Astrophysics Data System (ADS)

    Sarsby, Joscelyn; Martin, Nicholas J.; Lalor, Patricia F.; Bunch, Josephine; Cooper, Helen J.

    2014-09-01

    Liquid extraction surface analysis mass spectrometry (LESA MS) has the potential to become a useful tool in the spatially-resolved profiling of proteins in substrates. Here, the approach has been applied to the analysis of thin tissue sections from human liver. The aim was to determine whether LESA MS was a suitable approach for the detection of protein biomarkers of nonalcoholic liver disease (nonalcoholic steatohepatitis, NASH), with a view to the eventual development of LESA MS for imaging NASH pathology. Two approaches were considered. In the first, endogenous proteins were extracted from liver tissue sections by LESA, subjected to automated trypsin digestion, and the resulting peptide mixture was analyzed by liquid chromatography tandem mass spectrometry (LC-MS/MS) (bottom-up approach). In the second (top-down approach), endogenous proteins were extracted by LESA, and analyzed intact. Selected protein ions were subjected to collision-induced dissociation (CID) and/or electron transfer dissociation (ETD) mass spectrometry. The bottom-up approach resulted in the identification of over 500 proteins; however, identification of key protein biomarkers, liver fatty acid binding protein (FABP1), and its variant (Thr→Ala, position 94), was unreliable and irreproducible. Top-down LESA MS analysis of healthy and diseased liver tissue revealed peaks corresponding to multiple (~15-25) proteins. MS/MS of four of these proteins identified them as FABP1, its variant, α-hemoglobin, and 10 kDa heat shock protein. The reliable identification of FABP1 and its variant by top-down LESA MS suggests that the approach may be suitable for imaging NASH pathology in sections from liver biopsies.

  12. Biased Competition in Visual Processing Hierarchies: A Learning Approach Using Multiple Cues.

    PubMed

    Gepperth, Alexander R T; Rebhan, Sven; Hasler, Stephan; Fritsch, Jannik

    2011-03-01

    In this contribution, we present a large-scale hierarchical system for object detection fusing bottom-up (signal-driven) processing results with top-down (model or task-driven) attentional modulation. Specifically, we focus on the question of how the autonomous learning of invariant models can be embedded into a performing system and how such models can be used to define object-specific attentional modulation signals. Our system implements bi-directional data flow in a processing hierarchy. The bottom-up data flow proceeds from a preprocessing level to the hypothesis level where object hypotheses created by exhaustive object detection algorithms are represented in a roughly retinotopic way. A competitive selection mechanism is used to determine the most confident hypotheses, which are used on the system level to train multimodal models that link object identity to invariant hypothesis properties. The top-down data flow originates at the system level, where the trained multimodal models are used to obtain space- and feature-based attentional modulation signals, providing biases for the competitive selection process at the hypothesis level. This results in object-specific hypothesis facilitation/suppression in certain image regions which we show to be applicable to different object detection mechanisms. In order to demonstrate the benefits of this approach, we apply the system to the detection of cars in a variety of challenging traffic videos. Evaluating our approach on a publicly available dataset containing approximately 3,500 annotated video images from more than 1 h of driving, we can show strong increases in performance and generalization when compared to object detection in isolation. Furthermore, we compare our results to a late hypothesis rejection approach, showing that early coupling of top-down and bottom-up information is a favorable approach especially when processing resources are constrained.
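    The fusion step — bottom-up hypothesis confidences modulated by top-down attentional biases, followed by competitive selection — can be sketched in a few lines. The multiplicative gain form and the array values below are illustrative assumptions, not the paper's trained models.

```python
import numpy as np

def select_hypotheses(bottom_up, top_down, gain=1.0, k=1):
    """Apply multiplicative attentional modulation to signal-driven
    confidences, then competitively select the k strongest hypotheses."""
    modulated = bottom_up * (1.0 + gain * top_down)
    order = np.argsort(modulated)[::-1]   # strongest first
    return order[:k], modulated
```

    With this form, a task-relevant hypothesis with moderate bottom-up support can win over a stronger but unbiased one, which is the behavior the biased-competition account predicts.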

  13. An object-based visual attention model for robotic applications.

    PubMed

    Yu, Yuanlong; Mann, George K I; Gosine, Raymond G

    2010-10-01

    By extending the integrated competition hypothesis, this paper presents an object-based visual attention model that selects one object of interest using low-dimensional features, so that visual perception starts with a fast attentional selection procedure. The proposed attention model involves seven modules: learning of object representations stored in a long-term memory (LTM), preattentive processing, top-down biasing, bottom-up competition, mediation between the top-down and bottom-up pathways, generation of saliency maps, and perceptual completion processing. It works in two phases: a learning phase and an attending phase. In the learning phase, the corresponding object representation is trained statistically when one object is attended. A dual-coding object representation consisting of local and global codings is proposed. Intensity, color, and orientation features are used to build the local coding, and a contour feature is employed to constitute the global coding. In the attending phase, the model first preattentively segments the visual field into discrete proto-objects using Gestalt rules. If a task-specific object is given, the model recalls the corresponding representation from LTM and deduces the task-relevant feature(s) to evaluate top-down biases. Mediation between automatic bottom-up competition and conscious top-down biasing is then performed to yield a location-based saliency map. By combining location-based saliency within each proto-object, proto-object-based saliency is evaluated. The most salient proto-object is selected for attention and is finally passed to the perceptual completion processing module to yield a complete object region. This model has been applied to distinct robotic tasks: detection of task-specific stationary and moving objects. Experimental results under different conditions are shown to validate the model.
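    The pooling step — combining location-based saliency within each proto-object and attending the winner — can be sketched with a labeled region map. Mean pooling and the label convention below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def proto_object_saliency(saliency_map, labels):
    """Pool location-based saliency within each proto-object (label 0 =
    background), then pick the most salient proto-object for attention."""
    ids = [i for i in np.unique(labels) if i != 0]
    scores = {int(i): float(saliency_map[labels == i].mean()) for i in ids}
    winner = max(scores, key=scores.get)
    return winner, scores
```

    The winning label would then be handed to a completion stage that recovers the full object region.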

  14. Top-down and bottom-up identification of proteins by liquid extraction surface analysis mass spectrometry of healthy and diseased human liver tissue.

    PubMed

    Sarsby, Joscelyn; Martin, Nicholas J; Lalor, Patricia F; Bunch, Josephine; Cooper, Helen J

    2014-11-01

    Liquid extraction surface analysis mass spectrometry (LESA MS) has the potential to become a useful tool in the spatially-resolved profiling of proteins in substrates. Here, the approach has been applied to the analysis of thin tissue sections from human liver. The aim was to determine whether LESA MS was a suitable approach for the detection of protein biomarkers of nonalcoholic liver disease (nonalcoholic steatohepatitis, NASH), with a view to the eventual development of LESA MS for imaging NASH pathology. Two approaches were considered. In the first, endogenous proteins were extracted from liver tissue sections by LESA, subjected to automated trypsin digestion, and the resulting peptide mixture was analyzed by liquid chromatography tandem mass spectrometry (LC-MS/MS) (bottom-up approach). In the second (top-down approach), endogenous proteins were extracted by LESA, and analyzed intact. Selected protein ions were subjected to collision-induced dissociation (CID) and/or electron transfer dissociation (ETD) mass spectrometry. The bottom-up approach resulted in the identification of over 500 proteins; however, identification of key protein biomarkers, liver fatty acid binding protein (FABP1), and its variant (Thr→Ala, position 94), was unreliable and irreproducible. Top-down LESA MS analysis of healthy and diseased liver tissue revealed peaks corresponding to multiple (~15-25) proteins. MS/MS of four of these proteins identified them as FABP1, its variant, α-hemoglobin, and 10 kDa heat shock protein. The reliable identification of FABP1 and its variant by top-down LESA MS suggests that the approach may be suitable for imaging NASH pathology in sections from liver biopsies.

  15. Drug Delivery for Peripheral Nerve Regeneration

    DTIC Science & Technology

    2015-11-01

    components are shown in Figure 2. The solvent casting method was used for manufacturing devices, with the 75/25 PLGA pellets dissolved in acetone at a...bottom-up process that minimizes wrinkling of the sheet as it expands and contracts. Following 4 hours on a hot plate, the sheet was submerged into...glass transition temperature (40-60°C), eliminating many traditional sterilization methods like autoclaving. We evaluated deformation of the device

  16. Proteomics goes forensic: Detection and mapping of blood signatures in fingermarks.

    PubMed

    Deininger, Lisa; Patel, Ekta; Clench, Malcolm R; Sears, Vaughn; Sammon, Chris; Francese, Simona

    2016-06-01

    A bottom-up in situ proteomic method has been developed enabling the mapping of multiple blood signatures on the intact ridges of blood fingermarks by Matrix Assisted Laser Desorption Mass Spectrometry Imaging (MALDI-MSI). This method, at a proof-of-concept stage, builds upon recently published work demonstrating the opportunity to profile and identify multiple blood signatures in bloodstains via a bottom-up proteomic approach. The present protocol addresses the limitation of the previously developed profiling method with respect to destructivity; destructivity should be avoided for evidence such as blood fingermarks, where the ridge detail must be preserved in order to provide the associative link between the biometric information and the events of bloodshed. Using a blood mark reference model, trypsin concentration and spraying conditions have been optimised within the technical constraints of the depositor eventually employed; the application of MALDI-MSI and Ion Mobility MS has enabled the detection, confirmation and visualisation of blood signatures directly onto the ridge pattern. These results are to be considered a first insight into a method eventually informing investigations (and judicial debates) of violent crimes in which the reliable and non-destructive detection and mapping of blood in fingermarks is paramount to reconstruct the events of bloodshed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Method for photolithographic definition of recessed features on a semiconductor wafer utilizing auto-focusing alignment

    DOEpatents

    Farino, A.J.; Montague, S.; Sniegowski, J.J.; Smith, J.H.; McWhorter, P.J.

    1998-07-21

    A method is disclosed for photolithographically defining device features up to the resolution limit of an auto-focusing projection stepper when the device features are to be formed in a wafer cavity at a depth exceeding the depth of focus of the stepper. The method uses a focusing cavity located in a die field at the position of a focusing light beam from the auto-focusing projection stepper, with the focusing cavity being of the same depth as one or more adjacent cavities wherein a semiconductor device is to be formed. The focusing cavity provides a bottom surface for referencing the focusing light beam and focusing the stepper at a predetermined depth below the surface of the wafer, whereat the device features are to be defined. As material layers are deposited in each device cavity to build up a semiconductor structure such as a microelectromechanical system (MEMS) device, the same material layers are deposited in the focusing cavity, raising the bottom surface and re-focusing the stepper for accurately defining additional device features in each succeeding material layer. The method is especially applicable for forming MEMS devices within a cavity or trench and integrating the MEMS devices with electronic circuitry fabricated on the wafer surface. 15 figs.

  18. Method for photolithographic definition of recessed features on a semiconductor wafer utilizing auto-focusing alignment

    DOEpatents

    Farino, Anthony J.; Montague, Stephen; Sniegowski, Jeffry J.; Smith, James H.; McWhorter, Paul J.

    1998-01-01

    A method is disclosed for photolithographically defining device features up to the resolution limit of an auto-focusing projection stepper when the device features are to be formed in a wafer cavity at a depth exceeding the depth of focus of the stepper. The method uses a focusing cavity located in a die field at the position of a focusing light beam from the auto-focusing projection stepper, with the focusing cavity being of the same depth as one or more adjacent cavities wherein a semiconductor device is to be formed. The focusing cavity provides a bottom surface for referencing the focusing light beam and focusing the stepper at a predetermined depth below the surface of the wafer, whereat the device features are to be defined. As material layers are deposited in each device cavity to build up a semiconductor structure such as a microelectromechanical system (MEMS) device, the same material layers are deposited in the focusing cavity, raising the bottom surface and re-focusing the stepper for accurately defining additional device features in each succeeding material layer. The method is especially applicable for forming MEMS devices within a cavity or trench and integrating the MEMS devices with electronic circuitry fabricated on the wafer surface.

  19. Access-enabling architectures : new hybrid multi-modal spatial prototypes towards resource and social sustainability : USDOT Region V Regional University Transportation Center final report.

    DOT National Transportation Integrated Search

    2016-12-19

The efforts of this project aim to capture and engage these potentials through a design-research method that incorporates a top-down, data-driven approach with bottom-up stakeholder perspectives to develop prototypical scenario-based design solutions...

  20. Optimization of mass spectrometric parameters improve the identification performance of capillary zone electrophoresis for single-shot bottom-up proteomics analysis.

    PubMed

    Zhang, Zhenbin; Dovichi, Norman J

    2018-02-25

    The effects of MS1 injection time, MS2 injection time, dynamic exclusion time, intensity threshold, and isolation width were investigated on the numbers of peptide and protein identifications for single-shot bottom-up proteomics analysis using CZE-MS/MS analysis of a Xenopus laevis tryptic digest. An electrokinetically pumped nanospray interface was used to couple a linear-polyacrylamide coated capillary to a Q Exactive HF mass spectrometer. A sensitive method that used a 1.4 Th isolation width, 60,000 MS2 resolution, 110 ms MS2 injection time, and a top 7 fragmentation produced the largest number of identifications when the CZE loading amount was less than 100 ng. A programmable autogain control method (pAGC) that used a 1.4 Th isolation width, 15,000 MS2 resolution, 110 ms MS2 injection time, and top 10 fragmentation produced the largest number of identifications for CZE loading amounts greater than 100 ng; 7218 unique peptides and 1653 protein groups were identified from 200 ng by using the pAGC method. The effect of mass spectrometer conditions on the performance of UPLC-MS/MS was also investigated. A fast method that used a 1.4 Th isolation width, 30,000 MS2 resolution, 45 ms MS2 injection time, and top 12 fragmentation produced the largest number of identifications for 200 ng UPLC loading amount (6025 unique peptides and 1501 protein groups). This is the first report where the identification number for CZE surpasses that of the UPLC at the 200 ng loading level. However, more peptides (11476) and protein groups (2378) were identified by using UPLC-MS/MS when the sample loading amount was increased to 2 μg with the fast method. To exploit the fast scan speed of the Q-Exactive HF mass spectrometer, higher sample loading amounts are required for single-shot bottom-up proteomics analysis using CZE-MS/MS. Copyright © 2017 Elsevier B.V. All rights reserved.
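As a rough illustration of the trade-offs reported above, the three acquisition methods can be written down as parameter sets, with a rule-of-thumb selector keyed to separation mode and loading amount. This is a hedged sketch only: the dictionary layout and function names are our own, not the authors'.

```python
# Illustrative sketch (not the authors' code): the three Q Exactive HF
# acquisition methods described in the abstract, encoded as parameter sets,
# plus a simple selector based on the reported identification results.

METHODS = {
    # "sensitive": most identifications for CZE loads < 100 ng
    "sensitive": {"isolation_width_th": 1.4, "ms2_resolution": 60000,
                  "ms2_injection_ms": 110, "top_n": 7},
    # "pAGC": most identifications for CZE loads > 100 ng
    "pAGC": {"isolation_width_th": 1.4, "ms2_resolution": 15000,
             "ms2_injection_ms": 110, "top_n": 10},
    # "fast": most identifications for UPLC at 200 ng and above
    "fast": {"isolation_width_th": 1.4, "ms2_resolution": 30000,
             "ms2_injection_ms": 45, "top_n": 12},
}

def pick_method(separation: str, loading_ng: float) -> str:
    """Heuristic drawn from the reported results, not a general rule."""
    if separation == "CZE":
        return "sensitive" if loading_ng < 100 else "pAGC"
    return "fast"  # UPLC benefited from the fast, high-top-N method

print(pick_method("CZE", 200))  # pAGC
```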

  1. Application of cause-and-effect analysis to potentiometric titration.

    PubMed

    Kufelnicki, A; Lis, S; Meinrath, G

    2005-08-01

    A first attempt has been made to interpret physicochemical data from potentiometric titration analysis in accordance with the complete measurement-uncertainty budget approach (bottom-up) of ISO and Eurachem. A cause-and-effect diagram is established and discussed. Titration data for arsenazo III are used as a basis for this discussion. The commercial software Superquad is used and applied within a computer-intensive resampling framework. The cause-and-effect diagram is applied to evaluation of seven protonation constants of arsenazo III in the pH range 2-10.7. The data interpretation is based on empirical probability distributions and their analysis by second-order correct confidence estimates. The evaluated data are applied in the calculation of a speciation diagram including uncertainty estimates using the probabilistic speciation software Ljungskile.
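The computer-intensive resampling framework mentioned above can be sketched in miniature. The following is a hedged toy example, not the Superquad/Ljungskile workflow: it fits a single protonation constant to synthetic titration data by brute-force least squares and derives a percentile confidence interval by residual-resampling bootstrap.

```python
# Toy sketch of a resampling-based uncertainty estimate for one protonation
# constant (assumed pKa = 4.5, synthetic data; not the paper's system).
import numpy as np

rng = np.random.default_rng(0)

def frac_protonated(pH, pKa):
    """Fraction protonated for a monoprotic acid."""
    return 1.0 / (1.0 + 10.0 ** (pH - pKa))

pH = np.linspace(2.0, 8.0, 41)
y = frac_protonated(pH, 4.5) + rng.normal(0.0, 0.01, pH.size)  # noisy data

def fit_pKa(pH, y):
    """Brute-force least-squares fit on a 0.01-unit grid."""
    grid = np.linspace(2.0, 8.0, 601)
    sse = [np.sum((y - frac_protonated(pH, k)) ** 2) for k in grid]
    return grid[int(np.argmin(sse))]

best = fit_pKa(pH, y)
resid = y - frac_protonated(pH, best)

# Bootstrap: resample residuals, refit, take percentile confidence bounds
boot = [fit_pKa(pH, frac_protonated(pH, best) + rng.choice(resid, resid.size))
        for _ in range(200)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"pKa = {best:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```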

  2. Multi-angle backscatter classification and sub-bottom profiling for improved seafloor characterization

    NASA Astrophysics Data System (ADS)

    Alevizos, Evangelos; Snellen, Mirjam; Simons, Dick; Siemes, Kerstin; Greinert, Jens

    2018-06-01

This study applies three classification methods exploiting the angular dependence of acoustic seafloor backscatter, along with high-resolution sub-bottom profiling, for seafloor sediment characterization in the Eckernförde Bay, Baltic Sea, Germany. This area is well suited for acoustic backscatter studies due to its shallowness, its smooth bathymetry, and the presence of a wide range of sediment types. Backscatter data were acquired using a Seabeam1180 (180 kHz) multibeam echosounder, and sub-bottom profiler data were recorded using a SES-2000 parametric sonar transmitting at 6 and 12 kHz. The high density of seafloor soundings allowed backscatter layers to be extracted for five beam angles over a large part of the surveyed area. A Bayesian probability method was employed for sediment classification based on the backscatter variability at a single incidence angle, whereas Maximum Likelihood Classification (MLC) and Principal Components Analysis (PCA) were applied to the multi-angle layers. The Bayesian approach was used to identify the optimum number of acoustic classes because cluster validation is carried out prior to class assignment and the class outputs are ordinal categorical values. The method is based on the principle that backscatter values from a single incidence angle follow a normal distribution for a particular sediment type. The resulting Bayesian classes were well correlated with median grain sizes and the percentage of coarse material. The MLC method uses angular response information from five layers of training areas extracted from the Bayesian classification map. The subsequent PCA analysis is based on the transformation of these five layers into two principal components that capture most of the data variability. These principal components were clustered into five classes after an external cluster validation test. In general, both MLC and PCA separated the various sediment types effectively, showing good agreement (kappa > 0.7) with the Bayesian approach, which in turn correlates well with ground-truth data (r2 > 0.7). In addition, sub-bottom data were used in conjunction with the Bayesian classification results to characterize the acoustic classes with respect to their geological and stratigraphic interpretation. The joint interpretation of seafloor and sub-seafloor data sets proved to be an efficient approach for better understanding seafloor backscatter patchiness and for discriminating acoustically similar classes in different geological/bathymetric settings.
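The PCA-plus-clustering step can be sketched as follows, on synthetic stand-in data (the real study used a Seabeam1180 survey and external cluster validation): project five co-located angle layers onto two principal components and cluster the scores into five classes. All array shapes and parameters here are assumptions for illustration.

```python
# Hedged sketch of multi-angle backscatter classification: PCA via SVD on
# five angle layers, then a minimal k-means (k = 5) on the two PC scores.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in: n pixels x 5 beam angles, drawn from 5 "sediment types"
centers = rng.normal(0, 5, size=(5, 5))            # 5 types x 5 angles
labels_true = rng.integers(0, 5, size=1000)
X = centers[labels_true] + rng.normal(0, 0.5, size=(1000, 5))

# PCA via SVD: project the five angle layers onto two principal components
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pcs = Xc @ Vt[:2].T                                # n x 2 PC scores
explained = (S[:2] ** 2).sum() / (S ** 2).sum()    # variance captured

def kmeans(X, k, iters=50, seed=0):
    """Minimal k-means; empty clusters keep their previous centroid."""
    r = np.random.default_rng(seed)
    cent = X[r.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - cent[None, :, :], axis=2)
        lab = d.argmin(axis=1)
        cent = np.array([X[lab == j].mean(axis=0) if np.any(lab == j)
                         else cent[j] for j in range(k)])
    return lab

classes = kmeans(pcs, 5)
print(f"explained variance (2 PCs): {explained:.2f}")
```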

  3. Achieving Campus Sustainability: Top-Down, Bottom-Up, or Neither?

    ERIC Educational Resources Information Center

    Brinkhurst, Marena; Rose, Peter; Maurice, Gillian; Ackerman, Josef Daniel

    2011-01-01

    Purpose: The dynamics of organizational change related to environmental sustainability on university campuses are examined in this article. Whereas case studies of campus sustainability efforts tend to classify leadership as either "top-down" or "bottom-up", this classification neglects consideration of the leadership roles of…

  4. Bottom-Up Engineering of Well-Defined 3D Microtissues Using Microplatforms and Biomedical Applications.

    PubMed

    Lee, Geon Hui; Lee, Jae Seo; Wang, Xiaohong; Lee, Sang Hoon

    2016-01-07

During the last decades, the engineering of well-defined 3D tissues has attracted great attention because it provides an in vivo-mimicking environment and can serve as a building block for the engineering of bioartificial organs. In this Review, diverse methods for engineering 3D tissues using microscale devices are introduced. Recent progress in microtechnologies has enabled the development of microplatforms for the bottom-up assembly of diversely shaped 3D tissues consisting of various cells. Micro hanging-drop plates, microfluidic chips, and arrayed microwells are typical examples. The encapsulation of cells in hydrogel microspheres and microfibers allows the engineering of 3D microtissues with diverse shapes. Applications of 3D microtissues in biomedical fields are described, and the future direction of microplatform-based engineering of 3D microtissues is discussed. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Sequential bottom-up assembly of mechanically stabilized synthetic cells by microfluidics

    NASA Astrophysics Data System (ADS)

    Weiss, Marian; Frohnmayer, Johannes Patrick; Benk, Lucia Theresa; Haller, Barbara; Janiesch, Jan-Willi; Heitkamp, Thomas; Börsch, Michael; Lira, Rafael B.; Dimova, Rumiana; Lipowsky, Reinhard; Bodenschatz, Eberhard; Baret, Jean-Christophe; Vidakovic-Koch, Tanja; Sundmacher, Kai; Platzman, Ilia; Spatz, Joachim P.

    2018-01-01

Compartments for the spatially and temporally controlled assembly of biological processes are essential for cellular life. Synthetic mimics of cellular compartments based on lipid-based protocells lack the mechanical and chemical stability that would allow their manipulation into a complex and fully functional synthetic cell. Here, we present a high-throughput microfluidic method to generate stable, defined-size liposomes termed 'droplet-stabilized giant unilamellar vesicles' (dsGUVs). The enhanced stability of dsGUVs enables the sequential loading of these compartments with biomolecules, namely purified transmembrane and cytoskeleton proteins, by microfluidic pico-injection technology. This constitutes an experimental demonstration of a successful bottom-up assembly of a compartment with contents that would not self-assemble to full functionality when simply mixed together. Following assembly, the stabilizing oil phase and droplet shells are removed to release functional, self-supporting protocells into an aqueous phase, enabling them to interact with physiologically relevant matrices.

  6. Bottom-up synthesis of multifunctional nanoporous graphene

    NASA Astrophysics Data System (ADS)

    Moreno, César; Vilas-Varela, Manuel; Kretz, Bernhard; Garcia-Lekue, Aran; Costache, Marius V.; Paradinas, Markos; Panighel, Mirko; Ceballos, Gustavo; Valenzuela, Sergio O.; Peña, Diego; Mugarza, Aitor

    2018-04-01

    Nanosize pores can turn semimetallic graphene into a semiconductor and, from being impermeable, into the most efficient molecular-sieve membrane. However, scaling the pores down to the nanometer, while fulfilling the tight structural constraints imposed by applications, represents an enormous challenge for present top-down strategies. Here we report a bottom-up method to synthesize nanoporous graphene comprising an ordered array of pores separated by ribbons, which can be tuned down to the 1-nanometer range. The size, density, morphology, and chemical composition of the pores are defined with atomic precision by the design of the molecular precursors. Our electronic characterization further reveals a highly anisotropic electronic structure, where orthogonal one-dimensional electronic bands with an energy gap of ∼1 electron volt coexist with confined pore states, making the nanoporous graphene a highly versatile semiconductor for simultaneous sieving and electrical sensing of molecular species.

  7. Associated production of a Higgs boson decaying into bottom quarks at the LHC in full NNLO QCD

    NASA Astrophysics Data System (ADS)

    Ferrera, Giancarlo; Somogyi, Gábor; Tramontano, Francesco

    2018-05-01

    We consider the production of a Standard Model Higgs boson decaying to bottom quarks in association with a vector boson W± / Z in hadron collisions. We present a fully exclusive calculation of QCD radiative corrections both for the production cross section and for the Higgs boson decay rate up to next-to-next-to-leading order (NNLO) accuracy. Our calculation also includes the leptonic decay of the vector boson with finite-width effects and spin correlations. We consider typical kinematical cuts applied in the experimental analyses at the Large Hadron Collider (LHC) and we find that the full NNLO QCD corrections significantly decrease the accepted cross section and have a substantial impact on the shape of distributions. We point out that these additional effects are essential to obtain precise theoretical predictions to be compared with the LHC data.

  8. Numerical modeling of marine Gravity data for tsunami hazard zone mapping

    NASA Astrophysics Data System (ADS)

    Porwal, Nipun

    2012-07-01

A tsunami is a series of ocean waves with very long wavelengths, ranging from 10 to 500 km. Tsunamis therefore behave as shallow-water waves and are hard to predict by most methods. Bottom pressure recorders of the Poseidon class are considered a preeminent means of detecting tsunami waves, but the acoustic modems of ocean bottom pressure (OBP) sensors placed in the vicinity of trenches deeper than 6000 m fail to relay OBP data to surface buoys. This paper therefore develops a numerical model of the gravity field coefficients from the Bureau Gravimetric International (BGI), which by themselves do not play a central role in the study of geodesy, satellite orbit computation, and geophysics; by mathematical transformation of these coefficients using the normalized Legendre polynomials, however, high-resolution ocean bottom pressure (OBP) data are generated. Real-time sea-level-monitored OBP data at 0.3° by 1° spatial resolution, produced for the past 10 years with a Kalman filter (kf080) by the Estimating the Circulation and Climate of the Ocean (ECCO) project, were correlated with the OBP data derived from the gravity field coefficients, supporting a feasibility study of a future space-based tsunami detection system and of the identification of the most suitable sites for placing OBP sensors near deep trenches. The Levitus climatological temperature and salinity are assimilated into a version of the MITGCM using the adjoint method to obtain the sea-height segment. TOPEX/Poseidon satellite altimetry, surface momentum, heat, and freshwater fluxes from the NCEP reanalysis product, and the dynamic ocean topography DOT_DNSCMSS08_EGM08 are then used to interpret sea-bottom elevation. All data sets are combined with a high-resolution seafloor topographic map in the ArcGIS 9.3 raster calculator using Boolean intersection algebra and proximity analysis tools. 
The tsunami-prone areas and suitable BPR deployment sites identified in this research are then validated against a passive microwave radiometry system for tsunami hazard zone mapping and a network of seismometers. Such a methodology for early tsunami hazard zone mapping also increases accuracy and reduces the time needed for tsunami predictions. Keywords: tsunami, gravity field coefficients, ocean bottom pressure, ECCO, BGI, sea bottom temperature, sea floor topography.

  9. Pressurized Pepsin Digestion in Proteomics

    PubMed Central

    López-Ferrer, Daniel; Petritis, Konstantinos; Robinson, Errol W.; Hixson, Kim K.; Tian, Zhixin; Lee, Jung Hwa; Lee, Sang-Won; Tolić, Nikola; Weitz, Karl K.; Belov, Mikhail E.; Smith, Richard D.; Paša-Tolić, Ljiljana

    2011-01-01

Integrated top-down bottom-up proteomics combined with on-line digestion has great potential to improve the characterization of protein isoforms in biological systems and is amenable to high-throughput proteomics experiments. Bottom-up proteomics ultimately provides the peptide sequences derived from the tandem MS analyses of peptides after the proteome has been digested. Top-down proteomics conversely entails the MS analyses of intact proteins for more effective characterization of genetic variations and/or post-translational modifications. Herein, we describe recent efforts toward efficient integration of bottom-up and top-down LC-MS-based proteomics strategies. Since most proteomics separations utilize acidic conditions, we exploited the compatibility of pepsin (whose optimal digestion conditions are at low pH) for integration into bottom-up and top-down proteomics workflows. Pressure-enhanced pepsin digestions were successfully performed and characterized with several standard proteins in either an off-line mode using a Barocycler or an on-line mode using a modified high-pressure LC system referred to as a fast on-line digestion system (FOLDS). FOLDS was tested using pepsin and a whole microbial proteome, and the results were compared against traditional trypsin digestions on the same platform. Additionally, FOLDS was integrated with a RePlay configuration to demonstrate an ultrarapid integrated bottom-up top-down proteomics strategy using a standard mixture of proteins and a monkeypox virus proteome. PMID:20627868

  10. Variability at Multiple Scales: Using an Array of Current and Pressure Sensor Equipped Inverted Echo Sounders to Measure the Ocean

    DTIC Science & Technology

    2016-11-29

travel time between the seafloor and the sea surface; bottom pressure and temperature; and near-bottom horizontal currents hourly for up to 5 years...pressure and current sensors (CPIESs). CPIESs (Figure 1) are moored instruments that measure (1) the round-trip acoustic travel time between the...measurements of surface-to-bottom round-trip acoustic travel time (τ), bottom pressure and temperature, and near-bottom horizontal currents

  11. Variability at Multiple Scales: Using an Array of Current- and Pressure-Sensor Equipped Inverted Echo Sounders to Measure the Ocean

    DTIC Science & Technology

    2016-11-29

travel time between the seafloor and the sea surface; bottom pressure and temperature; and near-bottom horizontal currents hourly for up to 5 years...pressure and current sensors (CPIESs). CPIESs (Figure 1) are moored instruments that measure (1) the round-trip acoustic travel time between the...measurements of surface-to-bottom round-trip acoustic travel time (τ), bottom pressure and temperature, and near-bottom horizontal currents

  12. An Analysis Model for Water Cone Subsidence in Bottom Water Drive Reservoirs

    NASA Astrophysics Data System (ADS)

Wang, Jianjun; Xu, Hui; Wu, Shucheng; Yang, Chao; Kong, Lingxiao; Zeng, Baoquan; Xu, Haixia; Qu, Tailai

    2017-12-01

Water coning in bottom water drive reservoirs, which results in earlier water breakthrough, a rapid increase in water cut, and a low recovery level, has drawn tremendous attention in the petroleum engineering field. As a simple and effective way to inhibit bottom water coning, shut-in coning control is usually preferred in the oilfield to control the water cone and thereby improve economic performance. However, most water coning research has investigated the behavior of the cone as it grows; reported studies of water cone subsidence are very scarce. The goal of this work is to present an analytical model for the subsidence of the water cone after the well is shut in. Based on the Dupuit critical oil production rate formula, an analytical model is first developed to estimate the initial water cone shape at the point of critical drawdown. With this initial cone-shape equation, we then propose an analysis model for water cone subsidence in bottom water drive reservoirs. Model analysis and several sensitivity studies are conducted. This work presents an accurate and fast analytical model of water cone subsidence in bottom water drive reservoirs. Given the recent interest in the development of bottom water drive reservoirs, our approach provides a promising technique for better understanding the subsidence of the water cone.

  13. Particle detection for patterned wafers of 100nm design rule by evanescent light illumination: analysis of evanescent light scattering using Finite-Difference Time-Domain (FDTD) method

    NASA Astrophysics Data System (ADS)

    Yoshioka, Toshie; Miyoshi, Takashi; Takaya, Yasuhiro

    2005-12-01

To achieve high productivity and reliability in modern semiconductor manufacturing, patterned-wafer inspection technology is essential for maintaining high yield. As circuit features are scaled below 100 nm, conventional imaging and light-scattering methods become impossible to apply to patterned-wafer inspection because of the diffraction limit and low S/N ratio. We therefore propose a new particle detection method using annular evanescent light illumination. In this method, a converging annular beam is incident on a micro-hemispherical lens. When the convergence angle is larger than the critical angle, annular evanescent light is generated under the bottom surface of the hemispherical lens. The evanescent light is localized near the bottom surface and decays exponentially away from it, so it selectively illuminates particles on the patterned wafer without illuminating the patterned wafer surface itself. The proposed method detects particles on a patterned wafer surface by measuring the distribution of evanescent light scattered from them. To analyze the fundamental characteristics of the proposed method, computer simulations were performed using the FDTD method. The simulation results show that the proposed method is effective for detecting 100 nm particles on a patterned wafer with 100 nm lines and spaces, particularly under p-polarized evanescent illumination incident parallel to the line orientation. Finally, experimental results suggest that 220 nm particles on a patterned wafer with about 200 nm lines and spaces can be detected.
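The physics behind the selective illumination can be made concrete with a short, hedged calculation. The refractive indices and wavelength below are illustrative assumptions, not values from the paper; the penetration depth uses the standard evanescent-field formula d = λ / (4π·sqrt(n1²·sin²θ − n2²)).

```python
# Hedged sketch: critical angle for total internal reflection at the lens
# bottom surface, and the 1/e intensity decay length of the evanescent field.
import math

def critical_angle_deg(n1, n2):
    """Angle beyond which total internal reflection (and an evanescent
    field on the low-index side) occurs."""
    return math.degrees(math.asin(n2 / n1))

def penetration_depth_nm(wavelength_nm, n1, n2, theta_deg):
    """1/e intensity decay length: d = lambda / (4*pi*sqrt(n1^2 sin^2 - n2^2))."""
    s = n1 ** 2 * math.sin(math.radians(theta_deg)) ** 2 - n2 ** 2
    if s <= 0:
        raise ValueError("incidence angle is below the critical angle")
    return wavelength_nm / (4 * math.pi * math.sqrt(s))

# Assumed values: glass lens (n = 1.5) in air, 488 nm illumination,
# incidence 5 degrees beyond the critical angle.
n1, n2, lam = 1.5, 1.0, 488.0
theta = critical_angle_deg(n1, n2) + 5.0
print(f"critical angle: {critical_angle_deg(n1, n2):.1f} deg")
print(f"penetration depth: {penetration_depth_nm(lam, n1, n2, theta):.0f} nm")
```

The sub-wavelength decay length is what confines the illumination to particles protruding above the surface.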

  14. Advanced dynamic statistical parametric mapping with MEG in localizing epileptogenicity of the bottom of sulcus dysplasia.

    PubMed

    Nakajima, Midori; Wong, Simeon; Widjaja, Elysa; Baba, Shiro; Okanishi, Tohru; Takada, Lynne; Sato, Yosuke; Iwata, Hiroki; Sogabe, Maya; Morooka, Hikaru; Whitney, Robyn; Ueda, Yuki; Ito, Tomoshiro; Yagyu, Kazuyori; Ochi, Ayako; Carter Snead, O; Rutka, James T; Drake, James M; Doesburg, Sam; Takeuchi, Fumiya; Shiraishi, Hideaki; Otsubo, Hiroshi

    2018-06-01

To investigate whether advanced dynamic statistical parametric mapping (AdSPM) using magnetoencephalography (MEG) can better localize focal cortical dysplasia at the bottom of a sulcus (FCDB). We analyzed 15 children with a diagnosis of FCDB in the surgical specimen and on 3 T MRI using MEG. Using AdSPM, we analyzed a ±50 ms epoch relative to each single moving dipole (SMD) and applied a summation technique to estimate the source activity. The most active area in AdSPM was defined as the location of the AdSPM spike source. We compared spatial congruence between MRI-visible FCDB and (1) the dipole cluster in the SMD method and (2) the AdSPM spike source. AdSPM localized FCDB in 12 (80%) of 15 children, whereas the dipole cluster localized it in six (40%). The AdSPM spike source was concordant with the seizure onset zone in nine (82%) of 11 children with intracranial video EEG. Eleven children who underwent resective surgery achieved seizure freedom, with a follow-up period of 1.9 ± 1.5 years. Ten (91%) of them had an AdSPM spike source in the resection area. AdSPM can noninvasively and neurophysiologically localize epileptogenic FCDB, whether or not it overlaps with the dipole cluster. This is the first study to localize epileptogenic FCDB using MEG. Copyright © 2018 International Federation of Clinical Neurophysiology. Published by Elsevier B.V. All rights reserved.

  15. Merging Bottom-Up with Top-Down: Continuous Lamellar Networks and Block Copolymer Lithography

    NASA Astrophysics Data System (ADS)

    Campbell, Ian Patrick

    Block copolymer lithography is an emerging nanopatterning technology with capabilities that may complement and eventually replace those provided by existing optical lithography techniques. This bottom-up process relies on the parallel self-assembly of macromolecules composed of covalently linked, chemically distinct blocks to generate periodic nanostructures. Among the myriad potential morphologies, lamellar structures formed by diblock copolymers with symmetric volume fractions have attracted the most interest as a patterning tool. When confined to thin films and directed to assemble with interfaces perpendicular to the substrate, two-dimensional domains are formed between the free surface and the substrate, and selective removal of a single block creates a nanostructured polymeric template. The substrate exposed between the polymeric features can subsequently be modified through standard top-down microfabrication processes to generate novel nanostructured materials. Despite tremendous progress in our understanding of block copolymer self-assembly, continuous two-dimensional materials have not yet been fabricated via this robust technique, which may enable nanostructured material combinations that cannot be fabricated through bottom-up methods. This thesis aims to study the effects of block copolymer composition and processing on the lamellar network morphology of polystyrene-block-poly(methyl methacrylate) (PS-b-PMMA) and utilize this knowledge to fabricate continuous two-dimensional materials through top-down methods. First, block copolymer composition was varied through homopolymer blending to explore the physical phenomena surrounding lamellar network continuity. After establishing a framework for tuning the continuity, the effects of various processing parameters were explored to engineer the network connectivity via defect annihilation processes. 
Precisely controlling the connectivity and continuity of lamellar networks through defect engineering and optimizing the block copolymer lithography process thus enabled the top-down fabrication of continuous two-dimensional gold networks with nanoscale properties. The lamellar structure of these networks was found to confer unique mechanical properties on the nanowire networks and suggests that materials templated via this method may be excellent candidates for integration into stretchable and flexible devices.

  16. Analytics that Inform the University: Using Data You Already Have

    ERIC Educational Resources Information Center

    Dziuban, Charles; Moskal, Patsy; Cavanagh, Thomas; Watts, Andre

    2012-01-01

    The authors describe the University of Central Florida's top-down/bottom-up action analytics approach to using data to inform decision-making at the University of Central Florida. The top-down approach utilizes information about programs, modalities, and college implementation of Web initiatives. The bottom-up approach continuously monitors…

  17. Effects of Bottom-up and Top-down Controls and Climate Change on Estuarine Macrophyte Communities and the Ecosystem Services they Provide

    EPA Science Inventory

    Macrophytes provide important estuarine benthic habitats and support a significant portion of estuarine productivity. The composition and characteristics of these benthic communities are regulated bottom-up by resource availability and from the top-down by herbivory and predation...

  18. A Physical Mechanism for the Asymmetry in Top-Down and Bottom-Up Diffusion.

    NASA Astrophysics Data System (ADS)

    Wyngaard, J. C.

    1987-04-01

Recent large-eddy simulations of the vertical diffusion of a passive, conservative scalar through the convective boundary layer (CBL) show strikingly different eddy diffusivity profiles in the `top-down' and `bottom-up' cases. These results indicate that for a given turbulent velocity field and associated scalar flux, the mean change in scalar mixing ratio across the CBL is several times larger if the flux originates at the top of the boundary layer (i.e., in top-down diffusion) rather than at the bottom. The large-eddy simulation (LES) data show that this asymmetry is due to a breakdown of the eddy-diffusion concept. A simple updraft-downdraft model of the CBL reveals a physical mechanism that could cause this unexpected behavior. The large, positive skewness of the convectively driven vertical velocity gives an appreciably higher probability of downdrafts than updrafts; this excess probability of downdrafts, interacting with the time changes of the mean mixing ratio caused by the nonstationarity of the bottom-up and top-down diffusion processes, decreases the equilibrium value of mean mixing-ratio jump across the mixed layer in the bottom-up case and increases it in the top-down case. The resulting diffusion asymmetry agrees qualitatively with that found through LES.
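The updraft-downdraft mechanism can be illustrated numerically. The following is a hedged two-stream sketch (area fractions and speeds are assumed, not taken from the paper): with zero area-averaged vertical velocity, mass balance forces positively skewed w to have downdrafts occupying more area than updrafts.

```python
# Two-stream updraft/downdraft model of the CBL (illustrative numbers):
# a fraction `a` of the area is in strong updrafts, the rest in weaker
# downdrafts, with zero area-averaged vertical velocity.
import numpy as np

a = 0.4                       # updraft area fraction (assumed)
w_up = 2.0                    # updraft speed, m/s (assumed)
w_dn = -a * w_up / (1 - a)    # downdraft speed from mass balance

# Sample the two-valued w field over many "columns"
w = np.where(np.random.default_rng(0).random(100_000) < a, w_up, w_dn)

mean = w.mean()
skew = ((w - mean) ** 3).mean() / (((w - mean) ** 2).mean()) ** 1.5
p_down = (w < 0).mean()
print(f"mean w ~ {mean:.3f}, skewness = {skew:.2f}, P(downdraft) = {p_down:.2f}")
```

Strong-but-rare updrafts and weak-but-widespread downdrafts give positive skewness together with an excess probability of downdrafts, the ingredient the abstract identifies as driving the top-down/bottom-up asymmetry.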

  19. Masses of constituent quarks confined in open bottom hadrons

    NASA Astrophysics Data System (ADS)

    Borka Jovanović, V.; Borka, D.; Jovanović, P.; Milošević, J.; Ignjatović, S. R.

    2014-12-01

    We apply color-spin and flavor-spin quark-quark interactions to the meson and baryon constituent quarks, and calculate constituent quark masses, as well as the coupling constants of these interactions. The main goal of this paper was to determine constituent quark masses from light and open bottom hadron masses, using the fitting method we have developed and clustering of hadron groups. We use color-spin Fermi-Breit (FB) and flavor-spin Glozman-Riska (GR) hyperfine interaction (HFI) to determine constituent quark masses (especially b quark mass). Another aim was to discern between the FB and GR HFI because our previous findings had indicated that both interactions were satisfactory. Our improved fitting procedure of constituent quark masses showed that on average color-spin (FB) HFI yields better fits. The method also shows the way how the constituent quark masses and the strength of the interaction constants appear in different hadron environments.
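A drastically simplified version of such a fit can be sketched as follows. This hedged toy example approximates each baryon mass as the sum of its constituent quark masses, deliberately ignoring the hyperfine (FB/GR) terms that the paper actually fits; the measured masses are standard PDG values in MeV, and the resulting least-squares masses are only rough constituent-quark estimates.

```python
# Toy least-squares estimate of constituent quark masses (m_q with u ~ d,
# m_s, m_b) from baryon masses, ignoring hyperfine interactions entirely.
import numpy as np

hadrons = {  # quark content counts (n_q, n_s, n_b): measured mass (MeV)
    "p (uud)":        ((3, 0, 0),  938.3),
    "Lambda (uds)":   ((2, 1, 0), 1115.7),
    "Omega- (sss)":   ((0, 3, 0), 1672.5),
    "Lambda_b (udb)": ((2, 0, 1), 5619.6),
}
A = np.array([counts for counts, _ in hadrons.values()], dtype=float)
M = np.array([mass for _, mass in hadrons.values()])

# Solve A @ masses ~ M in the least-squares sense
masses, *_ = np.linalg.lstsq(A, M, rcond=None)
for name, m in zip(("m_q", "m_s", "m_b"), masses):
    print(f"{name} ~ {m:.0f} MeV")
```

The fit lands near the usual constituent-quark ballpark (light quarks around 300 MeV, strange around 500 MeV, bottom around 5 GeV); the hyperfine terms the paper fits are what resolve the residual inconsistencies between hadrons.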

  20. The Formative Method for Adapting Psychotherapy (FMAP): A community-based developmental approach to culturally adapting therapy

    PubMed Central

    Hwang, Wei-Chin

    2010-01-01

How do we culturally adapt psychotherapy for ethnic minorities? Although there has been growing interest in doing so, few therapy adaptation frameworks have been developed. The majority of these frameworks take a top-down theoretical approach to adapting psychotherapy. The purpose of this paper is to introduce a community-based developmental approach to modifying psychotherapy for ethnic minorities. The Formative Method for Adapting Psychotherapy (FMAP) is a bottom-up approach that involves collaborating with consumers to generate and support ideas for therapy adaptation. It involves five phases that target developing, testing, and reformulating therapy modifications. These phases include: (a) generating knowledge and collaborating with stakeholders, (b) integrating generated information with theory and empirical and clinical knowledge, (c) reviewing the initial culturally adapted clinical intervention with stakeholders and revising the culturally adapted intervention, (d) testing the culturally adapted intervention, and (e) finalizing the culturally adapted intervention. Application of the FMAP is illustrated using examples from a study adapting psychotherapy for Chinese Americans, but it can also be readily applied to modify therapy for other ethnic groups. PMID:20625458

  1. Studies On Endoscopic Local Hyperthermia Using Nd-YAG Laser

    NASA Astrophysics Data System (ADS)

    Tsunekawa, H.; Kanemaki, N.; Furusawa, A.; Hotta, M.; Kuroiwa, A.; Nishida, M.; Mori, N.; Watanabe, Y.; Morise, K.; Iizuka, A.

    1987-03-01

Attempting a new method of laser irradiation for depressed gastric carcinoma, using a newly developed interstitial probe and a laser attenuator, we applied local hyperthermia with prolonged low-watt contact irradiation. Experimental studies were performed with this probe using BDF1 mice injected hypodermically with Lewis lung carcinoma. A laser power of 2.0 W at the tip of the fiber produced the most desirable temperature curve, about 43-60°C at the irradiation site. Clinical applications were carried out on 15 patients with early gastric carcinoma (mainly depressed): 10 preoperative pilot cases and 5 inoperable cases. In follow-up operations and biopsies, gastric carcinoma was found to have completely disappeared in 2 of the preoperative and 4 of the inoperable cases. In the remaining 8 preoperative cases, residual traces of carcinoma were found at the margin of the laser ulcer, but not at its bottom. We propose that endoscopic local hyperthermia using an interstitial probe and low-power irradiation (2.0 W) is the safest and most suitable method of dealing with depressed carcinoma.

  2. Evolution of attention mechanisms for early visual processing

    NASA Astrophysics Data System (ADS)

    Müller, Thomas; Knoll, Alois

    2011-03-01

    Early visual processing as a method to speed up computations on visual input data has long been discussed in the computer vision community. The general target of such approaches is to filter nonrelevant information out before it reaches the costly higher-level visual processing algorithms. By inserting this additional filter layer, the overall approach can be sped up without actually changing the visual processing methodology. Inspired by the layered architecture of the human visual processing apparatus, several approaches for early visual processing have recently been proposed. Most promising in this field is the extraction of a saliency map to determine regions of current attention in the visual field. Such saliency can be computed in a bottom-up manner, i.e., the theory claims that static regions of attention emerge from a certain color footprint, and dynamic regions of attention emerge from connected blobs of textures moving in a uniform way in the visual field. Top-down saliency effects are either unconscious, through inherent mechanisms like inhibition-of-return (within a period of time, the attention level paid to a certain region automatically decreases if the properties of that region do not change), or volitional, through cognitive feedback, e.g., if an object moves consistently in the visual field. These bottom-up and top-down saliency effects were implemented and evaluated in a previous computer vision system for the project JAST. In this paper an extension applying evolutionary processes is proposed. The prior vision system utilized multiple threads to analyze the regions of attention delivered by the early processing mechanism. Here, in addition, multiple saliency units, each with a different parameter set, are used to produce these regions of attention. The idea is to let the population of saliency units create regions of attention, then evaluate the results with cognitive feedback, and finally apply the genetic mechanism: mutation and cloning of the best performers and extinction of the worst, judged on their computation of regions of attention. A fitness function can be derived by evaluating whether relevant objects are found in the created regions. Various experiments show that the approach significantly speeds up visual processing, especially for robust realtime object recognition, compared to an approach without saliency-based preprocessing. Furthermore, the evolutionary algorithm improves the overall quality of the preprocessing system, as it automatically and autonomously tunes the saliency parameters. The computational overhead produced by periodic clone/delete/mutate operations can be handled well within the realtime constraints of the experimental computer vision system. Limitations apply, however, whenever the visual field contains no significant saliency information for some time while the population still tries to tune the parameters; overfitting then prevents generalization, and the evolutionary process may need to be reset by manual intervention.
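    The clone/mutate/extinguish loop described in this abstract can be sketched as a simple elitist genetic algorithm. This is an illustrative toy, not the JAST implementation: the fitness function below is a stand-in for "relevant objects found in the created regions", and all parameter values (population size, mutation width, generation count) are invented for the example.

    ```python
    import random

    random.seed(0)  # reproducible toy run

    def evolve(population, fitness, n_keep=4, sigma=0.1):
        """One generation: rank saliency parameter sets by fitness, keep the
        best n_keep unchanged (cloning), add mutated copies of them, and let
        the rest go extinct."""
        survivors = sorted(population, key=fitness, reverse=True)[:n_keep]
        children = [[p + random.gauss(0.0, sigma) for p in parent]
                    for parent in survivors]
        return survivors + children

    # Toy fitness: distance to a hidden optimum stands in for "were relevant
    # objects found inside the regions of attention this unit produced?"
    optimum = [0.7, 0.3]
    fit = lambda params: -sum((p - o) ** 2 for p, o in zip(params, optimum))

    population = [[random.random(), random.random()] for _ in range(8)]
    for _ in range(50):
        population = evolve(population, fit)
    best = max(population, key=fit)
    ```

    Because survivors are carried over unmutated, the best parameter set found so far is never lost, which matches the cloning-plus-mutation scheme the abstract sketches.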

  3. Figure-ground organization and object recognition processes: an interactive account.

    PubMed

    Vecera, S P; O'Reilly, R C

    1998-04-01

    Traditional bottom-up models of visual processing assume that figure-ground organization precedes object recognition. This assumption seems logically necessary: How can object recognition occur before a region is labeled as figure? However, some behavioral studies find that familiar regions are more likely to be labeled figure than less familiar regions, a problematic finding for bottom-up models. An interactive account is proposed in which figure-ground processes receive top-down input from object representations in a hierarchical system. A graded, interactive computational model is presented that accounts for behavioral results in which familiarity effects are found. The interactive model offers an alternative conception of visual processing to bottom-up models.

  4. Addressing the Misuse Potential of Life Science Research-Perspectives From a Bottom-Up Initiative in Switzerland.

    PubMed

    Oeschger, Franziska M; Jenal, Ursula

    2018-01-01

    Codes of conduct have received wide attention as a bottom-up approach to foster responsibility for dual-use aspects of life science research within the scientific community. In Switzerland, a series of discussion sessions led by the Swiss Academy of Sciences with over 40 representatives of most Swiss academic life science research institutions has revealed that while a formal code of conduct was considered too restrictive, a bottom-up approach toward awareness raising, education, and demonstrating scientists' responsibility toward society was highly welcomed. Consequently, an informational brochure on "Misuse potential and biosecurity in life sciences research" was developed to provide material for further discussions and education.

  5. Mapping Curie temperature depth in the western United States with a fractal model for crustal magnetization

    USGS Publications Warehouse

    Bouligand, C.; Glen, J.M.G.; Blakely, R.J.

    2009-01-01

    We have revisited the problem of mapping depth to the Curie temperature isotherm from magnetic anomalies in an attempt to provide a measure of crustal temperatures in the western United States. Such methods are based on the estimation of the depth to the bottom of magnetic sources, which is assumed to correspond to the temperature at which rocks lose their spontaneous magnetization. In this study, we test and apply a method based on the spectral analysis of magnetic anomalies. Early spectral analysis methods assumed that crustal magnetization is a completely uncorrelated function of position. Our method incorporates a more realistic representation where magnetization has a fractal distribution defined by three independent parameters: the depths to the top and bottom of magnetic sources and a fractal parameter related to the geology. The predictions of this model are compatible with radial power spectra obtained from aeromagnetic data in the western United States. Model parameters are mapped by estimating their value within a sliding window swept over the study area. The method works well on synthetic data sets when one of the three parameters is specified in advance. The application of this method to western United States magnetic compilations, assuming a constant fractal parameter, allowed us to detect robust long-wavelength variations in the depth to the bottom of magnetic sources. Depending on the geologic and geophysical context, these features may result from variations in depth to the Curie temperature isotherm, depth to the mantle, depth to the base of volcanic rocks, or geologic settings that affect the value of the fractal parameter. Depth to the bottom of magnetic sources shows several features correlated with prominent heat flow anomalies. It also shows some features absent in the map of heat flow. Independent geophysical and geologic data sets are examined to determine their origin, thereby providing new insights on the thermal and geologic crustal structure of the western United States.
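    The spectral machinery the authors build on starts from a radially averaged power spectrum of the gridded anomaly data; the fractal-model fit for the top/bottom depths is then performed against that curve. A minimal sketch of the radial averaging step on synthetic data (the 64 x 64 random grid and unit spacing are assumptions for the example, not the study's aeromagnetic compilation):

    ```python
    import numpy as np

    def radial_power_spectrum(grid, dx):
        """Radially averaged power spectrum of a square 2-D grid (spacing dx)."""
        n = grid.shape[0]
        power = np.abs(np.fft.fftshift(np.fft.fft2(grid))) ** 2
        k = np.fft.fftshift(np.fft.fftfreq(n, d=dx))
        kx, ky = np.meshgrid(k, k)
        kr = np.hypot(kx, ky).ravel()                 # radial wavenumber per cell
        bins = np.linspace(0.0, kr.max(), n // 2)
        idx = np.digitize(kr, bins)
        counts = np.bincount(idx, minlength=len(bins) + 1)
        sums = np.bincount(idx, weights=power.ravel(), minlength=len(bins) + 1)
        valid = counts[1:len(bins)] > 0               # skip empty annuli
        centers = 0.5 * (bins[:-1] + bins[1:])[valid]
        return centers, sums[1:len(bins)][valid] / counts[1:len(bins)][valid]

    rng = np.random.default_rng(0)
    anomaly = rng.standard_normal((64, 64))  # stand-in for an aeromagnetic grid
    wavenumber, spectrum = radial_power_spectrum(anomaly, dx=1.0)
    ```

    In a sliding-window application, this curve would be recomputed per window and the three model parameters estimated from its shape.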

  6. Costing Hospital Surgery Services: The Method Matters

    PubMed Central

    Mercier, Gregoire; Naro, Gerald

    2014-01-01

    Background Accurate hospital costs are required for policy-makers, hospital managers and clinicians to improve efficiency and transparency. However, different methods are used to allocate direct costs, and their agreement is poorly understood. The aim of this study was to assess the agreement between bottom-up and top-down unit costs of a large sample of surgical operations in a French tertiary centre. Methods Two thousand one hundred and thirty consecutive procedures performed between January and October 2010 were analysed. Top-down costs were based on pre-determined weights, while bottom-up costs were calculated through an activity-based costing (ABC) model. The agreement was assessed using correlation coefficients and the Bland and Altman method. Variables associated with the difference between methods were identified with bivariate and multivariate linear regressions. Results The correlation coefficient amounted to 0.73 (95%CI: 0.72; 0.76). The overall agreement between methods was poor. In a multivariate analysis, the cost difference was independently associated with age (Beta = −2.4; p = 0.02), ASA score (Beta = 76.3; p<0.001), RCI (Beta = 5.5; p<0.001), staffing level (Beta = 437.0; p<0.001) and intervention duration (Beta = −10.5; p<0.001). Conclusions The ability of the current method to provide relevant information to managers, clinicians and payers is questionable. As in other European countries, a shift towards time-driven activity-based costing should be advocated. PMID:24817167
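    The agreement analysis this study applies (correlation plus Bland and Altman limits of agreement) is easy to reproduce in outline. The sketch below runs it on simulated cost pairs; the cost distributions and the simulated gap between methods are invented for illustration, whereas the study's real data are per-procedure top-down and bottom-up unit costs.

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Bias (mean difference) and 95% limits of agreement between two
        measurement methods applied to the same cases."""
        diff = np.asarray(a, float) - np.asarray(b, float)
        bias, sd = diff.mean(), diff.std(ddof=1)
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    rng = np.random.default_rng(1)
    bottom_up = rng.normal(1000.0, 300.0, 200)             # hypothetical unit costs
    top_down = bottom_up + rng.normal(150.0, 250.0, 200)   # systematic + random gap

    r = np.corrcoef(bottom_up, top_down)[0, 1]             # correlation alone
    bias, (low, high) = bland_altman(top_down, bottom_up)  # agreement
    ```

    A high correlation combined with wide limits of agreement is exactly the pattern the authors report: the two costing methods rank procedures similarly but disagree substantially on the unit cost itself.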

  7. Hierarchical time series bottom-up approach for forecast the export value in Central Java

    NASA Astrophysics Data System (ADS)

    Mahkya, D. A.; Ulama, B. S.; Suhartono

    2017-10-01

    The purpose of this study is to obtain the best model for, and prediction of, the export value of Central Java using a hierarchical time series. The export value is an injection variable in a country's economy: if the export value increases, the country's economy grows. Appropriate modeling is therefore needed to predict the export value, especially in Central Java. Export value in Central Java is grouped into 21 commodities, each with a different pattern. One applicable time series approach is a hierarchical one, and here the bottom-up hierarchical approach is used. The individual series at all levels are forecast using Autoregressive Integrated Moving Average (ARIMA), Radial Basis Function Neural Network (RBFNN), and hybrid ARIMA-RBFNN models. The best models are selected using the Symmetric Mean Absolute Percentage Error (sMAPE). The analysis shows that for the export value of Central Java, the bottom-up approach with hybrid ARIMA-RBFNN modeling can be used for long-term predictions, while for short- and medium-term predictions the bottom-up approach with RBFNN modeling can be used. Overall, the bottom-up approach with RBFNN modeling gives the best results.
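    The bottom-up aggregation and sMAPE scoring described above can be sketched as follows. The naive last-value forecaster here stands in for the paper's ARIMA, RBFNN, and hybrid models, and the 21 synthetic commodity series are invented for the example:

    ```python
    import numpy as np

    def smape(actual, forecast):
        """Symmetric Mean Absolute Percentage Error, in percent."""
        a, f = np.asarray(actual, float), np.asarray(forecast, float)
        return 100.0 * np.mean(2.0 * np.abs(f - a) / (np.abs(a) + np.abs(f)))

    def bottom_up_forecast(series_by_commodity, horizon):
        """Forecast each bottom-level series (naive last-value model here),
        then sum the bottom-level forecasts to obtain the total-level one."""
        per_series = {name: np.repeat(s[-1], horizon)
                      for name, s in series_by_commodity.items()}
        total = np.sum(list(per_series.values()), axis=0)
        return per_series, total

    rng = np.random.default_rng(2)
    commodities = {f"commodity_{i}": rng.uniform(10, 100, 24) for i in range(21)}
    per_series, total_forecast = bottom_up_forecast(commodities, horizon=3)
    ```

    In the bottom-up approach the total-level forecast is simply the sum of the bottom-level forecasts, so the hierarchy is guaranteed to add up; competing models are then compared at each level by sMAPE.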

  8. Bottom-up low molecular weight heparin analysis using liquid chromatography-Fourier transform mass spectrometry for extensive characterization.

    PubMed

    Li, Guoyun; Steppich, Julia; Wang, Zhenyu; Sun, Yi; Xue, Changhu; Linhardt, Robert J; Li, Lingyun

    2014-07-01

    Low molecular weight heparins (LMWHs) are heterogeneous, polydisperse, and highly negatively charged mixtures of glycosaminoglycan chains prescribed as anticoagulants. The detailed characterization of LMWH is important for the drug quality assurance and for new drug research and development. In this study, online hydrophilic interaction chromatography (HILIC) Fourier transform mass spectrometry (FTMS) was applied to analyze the oligosaccharide fragments of LMWHs generated by heparin lyase II digestion. More than 40 oligosaccharide fragments of LMWH were quantified and used to compare LMWHs prepared by three different manufacturers. The quantified fragment structures included unsaturated disaccharides/oligosaccharides arising from the prominent repeating units of these LMWHs, 3-O-sulfo containing tetrasaccharides arising from their antithrombin III binding sites, 1,6-anhydro ring-containing oligosaccharides formed during their manufacture, saturated uronic acid oligosaccharides coming from some chain nonreducing ends, and oxidized linkage region oligosaccharides coming from some chain reducing ends. This bottom-up approach provides rich detailed structural analysis and quantitative information with high accuracy and reproducibility. When combined with the top-down approach, HILIC LC-FTMS based analysis should be suitable for the advanced quality control and quality assurance in LMWH production.

  9. Internal geometry effect of structured PLA materials manufactured by droplet-based 3D printer on its mechanical properties

    NASA Astrophysics Data System (ADS)

    Wicaksono, Sigit T.; Ardhyananta, Hosta; Rasyida, Amaliya; Hidayat, Mas Irfan P.

    2018-04-01

    Rapid prototyping (RP) technologies, which manufacture products quickly while achieving high precision and complicated structures, are now in high demand. These technologies can be based on top-down or bottom-up approaches. One bottom-up RP technology is the 3D printing machine. In this research we succeeded in applying a droplet-based 3D printer to make structured PLA (polylactic acid) materials with different internal geometries. The internal geometries used are triangle and honeycomb structures, with symmetry-axis sizes of 4.5 mm and 9 mm and thicknesses of 1 mm and 2 mm. The mechanical properties of these structures, including tensile and bending strength, are evaluated using tensile and flexural tests, respectively. The test results show that the best performance, as measured by tensile and flexural strength, is obtained by the sample with triangle geometry, 9 mm geometrical size, and 2 mm thickness. The tensile and flexural strength values of this specimen are 59.2996 MPa and 123 MPa, respectively.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greiner, Nicolas; Höche, Stefan; Luisoni, Gionata

    The first computation of Higgs production in association with three jets at NLO in QCD has recently been performed using the effective theory, where the top quark is treated as an infinitely heavy particle and integrated out. This approach is restricted to the regions in phase space where the typical scales are not larger than the top quark mass. Here we investigate this statement at a quantitative level by calculating the leading-order contributions to the production of a Standard Model Higgs boson in association with up to three jets taking full top-quark and bottom-quark mass dependence into account. We find that the transverse momentum of the hardest particle or jet plays a key role in the breakdown of the effective theory predictions, and that discrepancies can easily reach an order of magnitude for transverse momenta of about 1 TeV. The impact of bottom-quark loops is found to be visible in the small transverse momentum region, leading to corrections of up to 5 percent. Lastly, we further study the impact of mass corrections when VBF selection cuts are applied and when the center-of-mass energy is increased to 100 TeV.

  11. Nutrition and parturition date effects on elk: potential implications for research and management.

    Treesearch

    John G. Cook; Bruce K. Johnson; Rachel C. Cook; Robert A. Riggs; Tim DelCurto; Larry D. Bryant; Larry L. Irwin

    2004-01-01

    Understanding and managing those mechanisms that affect population dynamics comprise, perhaps, the most fundamental aspect of wildlife management (Caughley 1977). Biologists generally categorize these mechanisms as either top-down (predator-driven) or bottom-up (habitat- or animal-density driven). Bottom-up influences involve imbalances between increasing animal...

  12. Bottom Up Succession Planning Works Better.

    ERIC Educational Resources Information Center

    Stevens, Paul

    The majority of current succession planning practices reflect the viewpoint of only a linear career direction for ambitious people. They are based on the premise that competent people have and want only one career direction--an upwardly mobile one. In today's work force, however, a "bottom-up" process works better in succession planning. This…

  13. The landscape of fear: The missing link to understand top-down and bottom-up controls of prey abundance?

    USDA-ARS?s Scientific Manuscript database

    Identifying factors that may be responsible for affecting and possibly regulating the size of animal populations is a cornerstone in understanding population ecology. The main factors that are thought to influence population size are either resources (bottom-up), predation, (top-down), or interspec...

  14. The Girlfriends Project: Evaluating a Promising Community-Based Intervention from a Bottom-Up Perspective

    ERIC Educational Resources Information Center

    Hawk, Mary

    2015-01-01

    Randomized controlled trials are the gold standard in research but may not fully explain or predict outcome variations in community-based interventions. Demonstrating efficacy of externally driven programs in well-controlled environments may not translate to community-based implementation where resources and priorities vary. A bottom-up evaluation…

  15. Challenges in Engaging Communities in Bottom-Up Literacies for Democratic Citizenship

    ERIC Educational Resources Information Center

    Torres, Myriam N.

    2010-01-01

    The purpose of this article is to examine the authors' experiences while trying to enter and engage local communities in bottom-up literacies through participatory action research (PAR) toward the community's own collective self-development. In trying to enter five different communities, I have found several challenges and roadblocks such as…

  16. Comparing Top-Down with Bottom-Up Approaches: Teaching Data Modeling

    ERIC Educational Resources Information Center

    Kung, Hsiang-Jui; Kung, LeeAnn; Gardiner, Adrian

    2013-01-01

    Conceptual database design is a difficult task for novice database designers, such as students, and is also therefore particularly challenging for database educators to teach. In the teaching of database design, two general approaches are frequently emphasized: top-down and bottom-up. In this paper, we present an empirical comparison of students'…

  17. Bottom-Up Analysis of Single-Case Research Designs

    ERIC Educational Resources Information Center

    Parker, Richard I.; Vannest, Kimberly J.

    2012-01-01

    This paper defines and promotes the qualities of a "bottom-up" approach to single-case research (SCR) data analysis. Although "top-down" models, for example, multi-level or hierarchical linear models, are gaining momentum and have much to offer, interventionists should be cautious about analyses that are not easily understood, are not governed by…

  18. Component processes in voluntary task switching.

    PubMed

    Demanet, Jelle; Liefooghe, Baptist

    2014-05-01

    The present study investigated the involvement of bottom-up and top-down control in task-switching situations in which tasks are selected on a voluntary basis. We tested for indices of both types of control in the reduction in switch cost that is observed when more time is available before executing a task. Participants had to indicate their task choice overtly prior to the actual task execution, and two time intervals were manipulated: the interval between the task-execution response of the previous trial and task-indication response of the current trial and the interval between task-indication response and task-execution response of a particular trial. In Experiment 1, the length of these intervals was manipulated orthogonally, and indices for top-down and bottom-up control were observed. Concerned with the validity of these results, Experiments 2-3 additionally discouraged participants from preparing the upcoming task before their task-indication response. Indices for bottom-up control remained, but not for top-down control. The characteristics of top-down and bottom-up control in voluntary task switching and task switching in general are discussed.

  19. [Method for environmental management in paper industry based on pollution control technology simulation].

    PubMed

    Zhang, Xue-Ying; Wen, Zong-Guo

    2014-11-01

    To evaluate the reduction potential of industrial water pollutant emissions and to study the application of technology simulation in pollutant control and environment management, an Industrial Reduction Potential Analysis and Environment Management (IRPAEM) model was developed based on coupling of "material-process-technology-product". The model integrated bottom-up modeling and scenario analysis methods, and was applied to China's paper industry. Results showed that under the CM scenario, the reduction potentials of wastewater, COD, and ammonia nitrogen would reach 7 x 10(8) t, 39 x 10(4) t, and 0.3 x 10(4) t, respectively, in 2015, and 13.8 x 10(8) t, 56 x 10(4) t, and 0.5 x 10(4) t, respectively, in 2020. Strengthening the end-treatment would still be the key method to reduce emissions during 2010-2020, while the reduction effect of structure adjustment would be more obvious during 2015-2020. Pollution production could basically reach the domestic or international advanced level of clean production in 2015 and 2020; the index of wastewater and ammonia nitrogen would basically meet the emission standards in 2015 and 2020, while COD would not.

  20. Crystal-face-selective adsorption of Au nanoparticles onto polycrystalline diamond surfaces.

    PubMed

    Kondo, Takeshi; Aoshima, Shinsuke; Hirata, Kousuke; Honda, Kensuke; Einaga, Yasuaki; Fujishima, Akira; Kawai, Takeshi

    2008-07-15

    Crystal-face-selective adsorption of Au nanoparticles (AuNPs) was achieved on a polycrystalline boron-doped diamond (BDD) surface via the self-assembly method combined with a UV/ozone treatment. To the best of our knowledge, this is the first report of crystal-face-selective adsorption on an inorganic solid surface. Hydrogen-plasma-treated BDD samples, and those subsequently given UV/ozone treatment for 2 min or longer, showed almost no adsorption of AuNPs after immersion in the AuNP solution prepared by the citrate reduction method. However, the samples treated with UV/ozone for 10 s showed selective AuNP adsorption on their (111) facets after the immersion. Moreover, the sample treated with UV/ozone for 40-60 s showed AuNP adsorption on the whole surface. These results indicate that the AuNP adsorption behavior can be controlled by UV/ozone treatment time. This phenomenon was highly reproducible and was applied to a two-step adsorption method, where AuNPs from different batches were adsorbed on the (111) and (100) surfaces, in that order. Our findings may be of great value for the fabrication of advanced nanoparticle-based functional materials via bottom-up approaches with simple macroscale procedures.

  1. Physical stress modifies top-down and bottom-up forcing on plant growth and reproduction in a coastal ecosystem.

    PubMed

    Daleo, Pedro; Alberti, Juan; Bruschetti, Carlos Martin; Pascual, Jesos; Iribarne, Oscar; Silliman, Brian R

    2015-08-01

    Bottom-up and top-down effects act together to exert strong control over plant growth and reproduction, but how physical stress modifies those interactive forces remains unclear. Even though empirical evidence is scarce, theory predicts that the importance of both top-down and bottom-up forces may decrease as physical stress increases. Here, we experimentally evaluate in the field the separate and interactive effect of salinity, nutrient availability, and crab herbivory on plant above- and belowground biomass, as well as on sexual and clonal reproduction in the salt marsh plant Spartina densiflora. Results show that the outcome of the interaction between nutrient availability and herbivory is highly context dependent, not only varying with the abiotic context (i.e., with or without increased salinity stress), but also with the dependent variable considered. Contrary to theoretical predictions, our results show that, consistently across different measured variables, salinity stress did not cancel bottom-up (i.e., nutrients) or top-down (i.e., consumers) control, but had additive effects. Our results support emerging theory by highlighting that, under many conditions, physical stress can act additively with, or even stimulate, consumer control, especially in cases where the physical stress is only experienced by basal levels of the trophic chain. Abiotic stress, as well as bottom-up and top-down factors, can affect salt marsh structure and function not only by affecting biomass production but also by having other indirect effects, such as changing patterns in plant biomass allocation and reproduction.

  2. A critical role of temporoparietal junction in the integration of top-down and bottom-up attentional control

    PubMed Central

    Wu, Qiong; Chang, Chi-Fu; Xi, Sisi; Huang, I-Wen; Liu, Zuxiang; Juan, Chi-Hung; Wu, Yanhong; Fan, Jin

    2015-01-01

    Information processing can be biased toward behaviorally relevant and salient stimuli by top-down (goal-directed) and bottom-up (stimulus-driven) attentional control processes. However, the neural basis underlying the integration of these processes is not well understood. We employed functional magnetic resonance imaging and transcranial direct-current stimulation (tDCS) in humans to examine the brain mechanisms underlying the interaction between these two processes. We manipulated the cognitive load involved in top-down processing and stimulus surprise involved in bottom-up processing in a factorial design by combining a majority function task and an oddball paradigm. We found that high cognitive load and high surprise level were associated with prolonged reaction time compared to low cognitive load and low surprise level, with a synergistic interaction effect which was accompanied by a greater deactivation of bilateral temporoparietal junction (TPJ). In addition, the TPJ displayed negative functional connectivity with right middle occipital gyrus involved in bottom-up processing (modulated by the interaction effect) and the right frontal eye field (FEF) involved in top-down control. The enhanced negative functional connectivity between the TPJ and right FEF was accompanied by a larger behavioral interaction effect across subjects. Application of cathodal tDCS over the right TPJ eliminated the interaction effect. These results suggest that the TPJ plays a critical role in processing bottom-up information for top-down control of attention. PMID:26308973

  3. Assessment of discrepancies between bottom-up and regional emission inventories in Norwegian urban areas

    NASA Astrophysics Data System (ADS)

    López-Aparicio, Susana; Guevara, Marc; Thunis, Philippe; Cuvelier, Kees; Tarrasón, Leonor

    2017-04-01

    This study shows the capabilities of a benchmarking system to identify inconsistencies in emission inventories, and to evaluate the reason behind discrepancies as a means to improve both bottom-up and downscaled emission inventories. Fine scale bottom-up emission inventories for seven urban areas in Norway are compared with three regional emission inventories, EC4MACS, TNO_MACC-II and TNO_MACC-III, downscaled to the same areas. The comparison shows discrepancies in nitrogen oxides (NOx) and particulate matter (PM2.5 and PM10) when evaluating both total and sectorial emissions. The three regional emission inventories underestimate NOx and PM10 traffic emissions by approximately 20-80% and 50-90%, respectively. The main reasons for the underestimation of PM10 emissions from traffic in the regional inventories are related to non-exhaust emissions due to resuspension, which are included in the bottom-up emission inventories but are missing in the official national emissions, and therefore in the downscaled regional inventories. The benchmarking indicates that the most probable reason behind the underestimation of NOx traffic emissions by the regional inventories is the activity data. The fine scale NOx traffic emissions from bottom-up inventories are based on the actual traffic volume at the road link and are much higher than the NOx emissions downscaled from national estimates based on fuel sales and based on population for the urban areas. We have identified important discrepancies in PM2.5 emissions from wood burning for residential heating among all the inventories. These discrepancies are associated with the assumptions made for the allocation of emissions. In the EC4MACS inventory, such assumptions imply high underestimation of PM2.5 emissions from the residential combustion sector in urban areas, which ranges from 40 to 90% compared with the bottom-up inventories. The study shows that in three of the seven Norwegian cities there is a need for further improvement of the emission inventories.

  4. Abundant Lysine Methylation and N-Terminal Acetylation in Sulfolobus islandicus Revealed by Bottom-Up and Top-Down Proteomics*

    PubMed Central

    Vorontsov, Egor A.; Rensen, Elena; Prangishvili, David; Krupovic, Mart; Chamot-Rooke, Julia

    2016-01-01

    Protein post-translational methylation has been reported to occur in archaea, including members of the genus Sulfolobus, but has never been characterized on a proteome-wide scale. Among important Sulfolobus proteins carrying such modification are the chromatin proteins that have been described to be methylated on lysine side chains, resembling eukaryotic histones in that aspect. To get more insight into the extent of this modification and its dynamics during the different growth stages of the thermoacidophilic archaeon S. islandicus LAL14/1, we performed a global and deep proteomic analysis using a combination of high-throughput bottom-up and top-down approaches on a single high-resolution mass spectrometer. A total of 1,931 methylation sites on 751 proteins were found by the bottom-up analysis, with methylation sites on 526 proteins monitored throughout three cell culture growth stages: early-exponential, mid-exponential, and stationary. The top-down analysis revealed 3,978 proteoforms arising from 681 proteins, including 292 methylated proteoforms, 85 of which were comprehensively characterized. Methylated proteoforms of the five chromatin proteins (Alba1, Alba2, Cren7, Sul7d1, Sul7d2) were fully characterized by a combination of bottom-up and top-down data. The top-down analysis also revealed an increase of methylation during cell growth for two chromatin proteins, which had not been evidenced by the bottom-up analysis. These results shed new light on the ubiquitous lysine methylation throughout the S. islandicus proteome. Furthermore, we found that S. islandicus proteins are frequently acetylated at the N terminus, following the removal of the N-terminal methionine. This study highlights the great value of combining bottom-up and top-down proteomics for obtaining an unprecedented level of accuracy in detecting differentially modified intact proteoforms. The data have been deposited to the ProteomeXchange with identifiers PXD003074 and PXD004179. PMID:27555370

  5. Long-Time Asymptotics of a Box-Type Initial Condition in a Viscous Fluid Conduit

    NASA Astrophysics Data System (ADS)

    Franco, Nevil; Webb, Emily; Maiden, Michelle; Hoefer, Mark; El, Gennady

    2017-11-01

    The initial value problem for a localized hump disturbance is fundamental to dispersive nonlinear waves, beginning with studies of the celebrated, completely integrable Korteweg-de Vries equation. However, understanding responses to similar disturbances in many realistic dispersive wave systems is more complicated because they lack the mathematical property of complete integrability. This project applies Whitham nonlinear wave modulation theory to estimate how a viscous fluid conduit evolves this classic initial value problem. Comparisons between theory, numerical simulations, and experiments are presented. The conduit system consists of a viscous fluid column (glycerol) and a diluted, dyed version of the same fluid introduced to the column through a nozzle at the bottom. Steady injection and the buoyancy of the injected fluid lead to the eventual formation of a stable fluid conduit. Within this structure, a one-hump disturbance is introduced and is observed to break up into a quantifiable number of solitons. This structure's experimental evolution is compared to Whitham theory and to numerical simulations of a long-wave interfacial model equation. The method presented is general and can be applied to other dispersive nonlinear wave systems.

  6. The Influence of a Sandy Substrate, Seagrass, or Highly Turbid Water on Albedo and Surface Heat Flux

    NASA Astrophysics Data System (ADS)

    Fogarty, M. C.; Fewings, M. R.; Paget, A. C.; Dierssen, H. M.

    2018-01-01

    Sea-surface albedo is a combination of surface-reflected and water-leaving irradiance, but water-leaving irradiance typically contributes less than 15% of the total albedo in open-ocean conditions. In coastal systems, however, the bottom substrate or suspended particulate matter can increase the amount of backscattered light, thereby increasing albedo and decreasing net shortwave surface heat flux. Here a sensitivity analysis using observations and models predicts the effect of light scattering on albedo and the net shortwave heat flux for three test cases: a bright sand bottom, a seagrass canopy, and turbid water. After scaling to the full solar shortwave spectrum, daytime average albedo for the test cases is up to 0.20 and exceeds the value of 0.05 predicted using a commonly applied parameterization. Daytime net shortwave heat flux into the water is significantly reduced, particularly for waters with bright sediments, dense horizontal seagrass canopies < 0.25 m from the sea surface, or highly turbid waters with suspended particulate matter concentration ≥ 50 g m-3. Observations of a more vertical seagrass canopy between 0.2 and 1 m of the surface indicate the increase in albedo compared to the common parameterization is negligible. Therefore, we suggest that the commonly applied albedo lookup table can be used in coastal heat flux estimates in water as shallow as 1 m unless the bottom substrate is highly reflective or the water is highly turbid. Our model results provide guidance to researchers who need to determine albedo in highly reflective or highly turbid conditions but have no direct observations.
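    The contrast drawn above between the commonly applied albedo (0.05) and the bright-bottom value (up to 0.20) maps directly onto net shortwave heat flux via Q_net = (1 - albedo) * Q_incident. A minimal sketch; the incident irradiance value is illustrative, not taken from the study:

```python
def net_shortwave_flux(incident_wm2, albedo):
    """Net shortwave flux into the water: the fraction not reflected."""
    return (1.0 - albedo) * incident_wm2

incident = 800.0  # W m^-2, illustrative midday irradiance (assumed, not from the study)
q_common = net_shortwave_flux(incident, 0.05)  # commonly applied open-ocean albedo
q_bright = net_shortwave_flux(incident, 0.20)  # bright-sand case from the abstract
print(q_common, q_bright, (q_common - q_bright) / q_common)
```

    For this illustrative irradiance, raising albedo from 0.05 to 0.20 removes roughly 16% of the heat flux into the water, which is why the abstract flags bright or turbid shallow water as the case where the standard lookup table fails.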

  7. Settlement Dynamics and Hierarchy from Agent Decision-Making: a Method Derived from Entropy Maximization.

    PubMed

    Altaweel, Mark

    2015-01-01

    This paper presents an agent-based complex system simulation of settlement structure change using methods derived from entropy maximization modeling. The approach is applied to model the movement of people and goods in urban settings to study how settlement size hierarchy develops. While entropy maximization is well known for assessing settlement structure change over different spatiotemporal settings, approaches have rarely attempted to develop and apply this methodology to understand how individual and household decisions may affect settlement size distributions. A new method developed in this paper allows individual decision-makers to choose where to settle based on social-environmental factors and to evaluate settlements based on geography and relative benefits, while retaining concepts derived from entropy maximization with settlement size affected by movement ability and site attractiveness feedbacks. To demonstrate the applicability of the theoretical and methodological approach, case study settlement patterns from the Middle Bronze Age (MBA) and Iron Age (IA) in the Iraqi North Jazirah Survey (NJS) are used. Results indicate clear differences in settlement factors and household choices in simulations that lead to settlement size hierarchies comparable to the two evaluated periods. Conflict and socio-political cohesion, both their presence and absence, are suggested to have major roles in affecting the observed settlement hierarchy. More broadly, the model is made applicable for different empirically based settings, while being generalized to incorporate data uncertainty, making the model useful for understanding urbanism from top-down and bottom-up perspectives.
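    The entropy-maximizing choice step described above can be sketched in the spirit of Wilson-style spatial interaction models, where the probability of an agent settling at site j scales with attractiveness W_j raised to a power, discounted by movement cost exp(-beta * c_j). The function name and the alpha/beta values below are illustrative assumptions, not the paper's calibration:

```python
import math

def settlement_choice_probs(attractiveness, costs, alpha=1.1, beta=0.3):
    """Entropy-maximizing (Wilson-style) choice probabilities: the chance of
    settling at site j is proportional to W_j**alpha * exp(-beta * c_j),
    balancing site attractiveness against the cost of moving there."""
    weights = [w ** alpha * math.exp(-beta * c)
               for w, c in zip(attractiveness, costs)]
    total = sum(weights)
    return [w / total for w in weights]

# Three hypothetical settlements: attractiveness (e.g. size) and travel cost
probs = settlement_choice_probs([10.0, 50.0, 25.0], [1.0, 4.0, 2.0])
print([round(p, 3) for p in probs])
```

    With alpha > 1, larger settlements attract disproportionately many agents, which is one route by which size hierarchies emerge from individual decisions.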

  8. High-strength magnetically switchable plasmonic nanorods assembled from a binary nanocrystal mixture

    DOE PAGES

    Zhang, Mingliang; Magagnosc, Daniel J.; Liberal, Iñigo; ...

    2016-11-07

    Next-generation ‘smart’ nanoparticle systems should be precisely engineered in size, shape and composition to introduce multiple functionalities, unattainable from a single material. Bottom-up chemical methods are prized for the synthesis of crystalline nanoparticles, that is, nanocrystals, with size- and shape-dependent physical properties, but they are less successful in achieving multifunctionality. Top-down lithographic methods can produce multifunctional nanoparticles with precise size and shape control, yet this becomes increasingly difficult at sizes of ~10 nm. In this paper, we report the fabrication of multifunctional, smart nanoparticle systems by combining top-down fabrication and bottom-up self-assembly methods. Specifically, we template nanorods from a mixture of superparamagnetic Zn0.2Fe2.8O4 and plasmonic Au nanocrystals. The superparamagnetism of Zn0.2Fe2.8O4 prevents these nanorods from spontaneous magnetic-dipole-induced aggregation, while their magnetic anisotropy makes them responsive to an external field. Ligand exchange drives Au nanocrystal fusion and forms a porous network, imparting the nanorods with high mechanical strength and polarization-dependent infrared surface plasmon resonances. Finally, the combined superparamagnetic and plasmonic functions enable switching of the infrared transmission of a hybrid nanorod suspension using an external magnetic field.

  9. A 'movement for improvement'? A qualitative study of the adoption of social movement strategies in the implementation of a quality improvement campaign.

    PubMed

    Waring, Justin; Crompton, Amanda

    2017-09-01

    Given the difficulties of implementing 'top-down' quality improvements, health service leaders have turned to methods that empower clinicians to co-produce 'bottom-up' improvements. This has involved the adoption of strategies and activities associated with social movements, with clinicians encouraged to participate in collective action towards the shared goal of improvement. This paper examines the adoption of social movement methods by hospital managers as a strategy for implementing a quality improvement 'campaign'. Our case study suggests that, despite the claim of empowering clinicians to develop 'bottom-up' improvements, the use of social movement methods can be more narrowly concerned with engaging clinicians in pre-determined programmes of 'top-down' change. It finds a prominent role for 'hybrid' clinical leaders and other staff representatives in the mobilisation of the campaign, especially for enrolling clinicians in change activities. The work of these 'hybrids' suggests some degree of creative mediation between clinical and managerial interests, but more often alignment with the aspirations of management. The study raises questions about the translation of social movement's theories as a strategy for managing change and re-inventing professionalism. © 2017 The Authors. Sociology of Health & Illness published by John Wiley & Sons Ltd on behalf of Foundation for SHIL.

  10. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; determination of organochlorine pesticides and polychlorinated biphenyls in bottom sediment by dual capillary-column gas chromatography with electron-capture detection

    USGS Publications Warehouse

    Foreman, William T.; Connor, Brooke F.; Furlong, Edward T.; Vaught, Deborah G.; Merten, Leslie M.

    1995-01-01

    A method for the determination of 30 individual organochlorine pesticides, total toxaphene, and total polychlorinated biphenyls (PCBs) in bottom sediment is described. The method isolates the pesticides and PCBs by solvent extraction with dichlorobenzene, removes inorganic sulfur, large naturally occurring molecules, and other unwanted interferences by gel permeation chromatography, and further cleans up and class fractionates the extract using adsorption chromatography. The compounds then are instrumentally determined using dual capillary-column gas chromatography with electron-capture detection. Reporting limits range from 1 to 5 micrograms per kilogram for 30 individual pesticides, 50 micrograms per kilogram for total PCBs, and 200 micrograms per kilogram for total toxaphene. The method also is designed to allow the simultaneous isolation of 79 other semivolatile organic compounds from the sediment, which are separately quantified using gas chromatography with mass spectrometric detection. The method was developed in support of the U.S. Geological Survey's National Water-Quality Assessment program.

  11. Bottom-Up and Top-Down Solid-State NMR Approaches for Bacterial Biofilm Matrix Composition

    PubMed Central

    Cegelski, Lynette

    2015-01-01

    The genomics and proteomics revolutions have been enormously successful in providing crucial “parts lists” for biological systems. Yet, formidable challenges exist in generating complete descriptions of how the parts function and assemble into macromolecular complexes and whole-cell assemblies. Bacterial biofilms are complex multicellular bacterial communities protected by a slime-like extracellular matrix that confers protection to environmental stress and enhances resistance to antibiotics and host defenses. As a non-crystalline, insoluble, heterogeneous assembly, the biofilm extracellular matrix poses a challenge to compositional analysis by conventional methods. In this Perspective, bottom-up and top-down solid-state NMR approaches are described for defining chemical composition in complex macrosystems. The “sum-of-the-parts” bottom-up approach was introduced to examine the amyloid-integrated biofilms formed by E. coli and permitted the first determination of the composition of the intact extracellular matrix from a bacterial biofilm. An alternative top-down approach was developed to define composition in V. cholerae biofilms and relied on an extensive panel of NMR measurements to tease out specific carbon pools from a single sample of the intact extracellular matrix. These two approaches are widely applicable to other heterogeneous assemblies. For bacterial biofilms, quantitative parameters of matrix composition are needed to understand how biofilms are assembled, to improve the development of biofilm inhibitors, and to dissect inhibitor modes of action. Solid-state NMR approaches will also be invaluable in obtaining parameters of matrix architecture. PMID:25797008

  12. Bottom-up and top-down solid-state NMR approaches for bacterial biofilm matrix composition.

    PubMed

    Cegelski, Lynette

    2015-04-01

    The genomics and proteomics revolutions have been enormously successful in providing crucial "parts lists" for biological systems. Yet, formidable challenges exist in generating complete descriptions of how the parts function and assemble into macromolecular complexes and whole-cell assemblies. Bacterial biofilms are complex multicellular bacterial communities protected by a slime-like extracellular matrix that confers protection to environmental stress and enhances resistance to antibiotics and host defenses. As a non-crystalline, insoluble, heterogeneous assembly, the biofilm extracellular matrix poses a challenge to compositional analysis by conventional methods. In this perspective, bottom-up and top-down solid-state NMR approaches are described for defining chemical composition in complex macrosystems. The "sum-of-the-parts" bottom-up approach was introduced to examine the amyloid-integrated biofilms formed by Escherichia coli and permitted the first determination of the composition of the intact extracellular matrix from a bacterial biofilm. An alternative top-down approach was developed to define composition in Vibrio cholerae biofilms and relied on an extensive panel of NMR measurements to tease out specific carbon pools from a single sample of the intact extracellular matrix. These two approaches are widely applicable to other heterogeneous assemblies. For bacterial biofilms, quantitative parameters of matrix composition are needed to understand how biofilms are assembled, to improve the development of biofilm inhibitors, and to dissect inhibitor modes of action. Solid-state NMR approaches will also be invaluable in obtaining parameters of matrix architecture. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Altered emotional interference processing in affective and cognitive-control brain circuitry in major depression

    PubMed Central

    Fales, Christina L.; Barch, Deanna M.; Rundle, Melissa M.; Mintun, Mark A.; Snyder, Abraham Z.; Cohen, Jonathan D.; Mathews, Jose; Sheline, Yvette I.

    2008-01-01

    Background Major depression is characterized by a negativity bias: an enhanced responsiveness to, and memory for, affectively negative stimuli. However, it is not yet clear whether this bias represents (1) impaired top-down cognitive control over affective responses, potentially linked to deficits in dorsolateral prefrontal cortex function; or (2) enhanced bottom-up responses to affectively-laden stimuli that dysregulate cognitive control mechanisms, potentially linked to deficits in amygdala and anterior cingulate function. Methods We used an attentional interference task with emotional distracters to test for top-down versus bottom-up dysfunction in the interaction of cognitive-control circuitry and emotion-processing circuitry. A total of 27 patients with major depression and 24 controls were tested. Event-related functional magnetic resonance imaging was carried out as participants directly attended to, or attempted to ignore, fear-related stimuli. Results Compared to controls, patients with depression showed an enhanced amygdala response to unattended fear-related stimuli (relative to unattended neutral). By contrast, control participants showed increased activity in right dorsolateral prefrontal cortex (Brodmann areas 46/9) when ignoring fear stimuli (relative to neutral), which the patients with depression did not. In addition, the depressed participants failed to show evidence of error-related cognitive adjustments (increased activity in bilateral dorsolateral prefrontal cortex on post-error trials), but the control group did show them. Conclusions These results suggest multiple sources of dysregulation in emotional and cognitive control circuitry in depression, implicating both top-down and bottom-up dysfunction. PMID:17719567

  14. Bottom-up and top-down solid-state NMR approaches for bacterial biofilm matrix composition

    NASA Astrophysics Data System (ADS)

    Cegelski, Lynette

    2015-04-01

    The genomics and proteomics revolutions have been enormously successful in providing crucial "parts lists" for biological systems. Yet, formidable challenges exist in generating complete descriptions of how the parts function and assemble into macromolecular complexes and whole-cell assemblies. Bacterial biofilms are complex multicellular bacterial communities protected by a slime-like extracellular matrix that confers protection to environmental stress and enhances resistance to antibiotics and host defenses. As a non-crystalline, insoluble, heterogeneous assembly, the biofilm extracellular matrix poses a challenge to compositional analysis by conventional methods. In this perspective, bottom-up and top-down solid-state NMR approaches are described for defining chemical composition in complex macrosystems. The "sum-of-the-parts" bottom-up approach was introduced to examine the amyloid-integrated biofilms formed by Escherichia coli and permitted the first determination of the composition of the intact extracellular matrix from a bacterial biofilm. An alternative top-down approach was developed to define composition in Vibrio cholerae biofilms and relied on an extensive panel of NMR measurements to tease out specific carbon pools from a single sample of the intact extracellular matrix. These two approaches are widely applicable to other heterogeneous assemblies. For bacterial biofilms, quantitative parameters of matrix composition are needed to understand how biofilms are assembled, to improve the development of biofilm inhibitors, and to dissect inhibitor modes of action. Solid-state NMR approaches will also be invaluable in obtaining parameters of matrix architecture.

  15. Studies of Visual Attention in Physics Problem Solving

    ERIC Educational Resources Information Center

    Madsen, Adrian M.

    2013-01-01

    The work described here represents an effort to understand and influence visual attention while solving physics problems containing a diagram. Our visual system is guided by two types of processes--top-down and bottom-up. The top-down processes are internal and determined by one's prior knowledge and goals. The bottom-up processes are external and…

  16. Two-dimensional combinatorial screening enables the bottom-up design of a microRNA-10b inhibitor.

    PubMed

    Velagapudi, Sai Pradeep; Disney, Matthew D

    2014-03-21

    The RNA motifs that bind guanidinylated kanamycin A (G Kan A) and guanidinylated neomycin B (G Neo B) were identified via two-dimensional combinatorial screening (2DCS). The results of these studies enabled the "bottom-up" design of a small molecule inhibitor of oncogenic microRNA-10b.

  17. Initial Clinician Reports of the Bottom-Up Dissemination of an Evidence-Based Intervention for Early Childhood Trauma

    ERIC Educational Resources Information Center

    David, Paula; Schiff, Miriam

    2018-01-01

    Background: Bottom-up dissemination (BUD) of evidence based treatments (EBT), entailing the spread of an intervention through a peer network in a decentralized manner, is an under-reported phenomenon in the professional literature. Objective: This paper presents findings from a study researching the feasibility of BUD of an evidence-based…

  18. Unleashing the Creative Potential of Faculty to Create Blended Learning

    ERIC Educational Resources Information Center

    Carbonell, Katerina Bohle; Dailey-Hebert, Amber; Gijselaers, Wim

    2013-01-01

    Bottom-up managed change processes offer the advantage to use the creative power of faculty to design and implement blended learning programs. This article proposes four factors as crucial elements for a successful bottom-up change process: the macro and micro contexts, the project leader and the project members. Interviews were conducted with 5…

  19. Reading Nature from a "Bottom-Up" Perspective

    ERIC Educational Resources Information Center

    Magntorn, Ola; Hellden, Gustav

    2007-01-01

    This paper reports on a study of ecology teaching and learning in a Swedish primary school class (age 10-11 yrs). A teaching sequence was designed to help students read nature in a river ecosystem. The teaching sequence had a "bottom up" approach, taking as its starting point a common key organism--the freshwater shrimp. From this…

  20. Teacher-Led Professional Development: A Proposal for a Bottom-up Structure Approach

    ERIC Educational Resources Information Center

    Macias, Angela

    2017-01-01

    This article uses current research recommendations for teacher-led professional development as well as qualitative data from a set of grassroots conferences to propose a new model for bottom-up teacher-led professional development. This article argues that by providing a neutral space and recruiting expertise of local experts, a public sphere can…

  1. Pedagogical Perspectives and Practices Reflected in Metaphors of Learning and Digital Learning of ICT Leaders

    ERIC Educational Resources Information Center

    Blau, Ina; Grinberg, Ronen; Shamir-Inbal, Tamar

    2018-01-01

    This study examines the meaning attributed to the contribution of technology to pedagogical practices from the perspective of school ICT leaders. While previous studies use metaphors for bottom-up exploration, this study employs an innovative combination of bottom-up and top-down metaphor analysis based on two frameworks: (a) metaphors of general…

  2. Thinking about the Weather: How Display Salience and Knowledge Affect Performance in a Graphic Inference Task

    ERIC Educational Resources Information Center

    Hegarty, Mary; Canham, Matt S.; Fabrikant, Sara I.

    2010-01-01

    Three experiments examined how bottom-up and top-down processes interact when people view and make inferences from complex visual displays (weather maps). Bottom-up effects of display design were investigated by manipulating the relative visual salience of task-relevant and task-irrelevant information across different maps. Top-down effects of…

  3. How Adolescents Comprehend Unfamiliar Proverbs: The Role of Top-Down and Bottom-Up Processes.

    ERIC Educational Resources Information Center

    Nippold, Marilyn A.; Allen, Melissa M.; Kirsch, Dixon I.

    2000-01-01

    Relationships between word knowledge and proverb comprehension were examined in 150 typically achieving adolescents (ages 12, 15, and 18). Word knowledge was associated with proverb comprehension in all groups, particularly in the case of abstract proverbs. Results support a model of proverb comprehension in adolescents that includes bottom-up in…

  4. Coupling 2D Finite Element Models and Circuit Equations Using a Bottom-Up Methodology

    DTIC Science & Technology

    2002-11-01

    EQUATIONS USING A BOTTOM-UP METHODOLOGY. E. Gómez(1), J. Roger-Folch(2), A. Gabaldón(1) and A. Molina(1). (1) Dpto. de Ingeniería Eléctrica, Universidad Polit… (2) Dpto. de Ingeniería Eléctrica, ETSII, Universidad Politécnica de Valencia, PO Box 22012, 46071 Valencia, Spain. E-mail: jroger@die.upv.es. ABSTRACT: The

  5. A balance of bottom-up and top-down in linking climate policies

    NASA Astrophysics Data System (ADS)

    Green, Jessica F.; Sterner, Thomas; Wagner, Gernot

    2014-12-01

    Top-down climate negotiations embodied by the Kyoto Protocol have all but stalled, chiefly because of disagreements over targets and objections to financial transfers. To avoid those problems, many have shifted their focus to linkage of bottom-up climate policies such as regional carbon markets. This approach is appealing, but we identify four obstacles to successful linkage: different levels of ambition; competing domestic policy objectives; objections to financial transfers; and the difficulty of close regulatory coordination. Even with a more decentralized approach, overcoming the 'global warming gridlock' of the intergovernmental negotiations will require close international coordination. We demonstrate how a balance of bottom-up and top-down elements can create a path toward an effective global climate architecture.

  6. Frost Susceptibility of Soil, Review of Index Tests,

    DTIC Science & Technology

    1981-12-01

    …contained in the CAA (1948) specifications for the construction of airports … used for both subgrade and base/subbase materials. An additional … Texas. Details of the Texas method for determining FS were reported by Carothers (1948) … material. Samples are frozen from the bottom up at a constant rate of heat flow, similar to the method suggested by Penner and Ueda (1978

  7. Estimating Anthropogenic Emissions of Hydrogen Chloride and Fine Particulate Chloride in China

    NASA Astrophysics Data System (ADS)

    Fu, X.; Wang, T.; Wang, S.; Zhang, L.

    2017-12-01

    Nitryl chloride (ClNO2) can significantly impact the atmospheric photochemistry via photolysis and subsequent reactions of chlorine radical with other gases. The formation of ClNO2 in the atmosphere is sensitive to the emissions of chlorine-containing particulates from oceanic and anthropogenic sources. For China, the only available anthropogenic chlorine emission inventory was compiled for the year 1990 with a coarse resolution of 1 degree. In this study, we developed an up-to-date anthropogenic inventory of hydrogen chloride (HCl) and fine particulate chloride (Cl-) emissions in China for the year 2014, including coal burning, industrial processes, biomass burning and waste burning. Bottom-up and top-down methodologies were combined. Detailed local data (e.g. Cl content in coal, control technologies, etc.) were collected and applied. In order to improve the spatial resolution of emissions, detailed point source information was collected for coal-fired power plants, cement factories, iron & steel factories and waste incineration factories. Uncertainties of this emission inventory and their major causes were analyzed using the Monte Carlo method. This work enables better quantification of the ClNO2 production and impact over China.
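    The Monte Carlo uncertainty analysis mentioned above can be illustrated for a single source category, where an emission is the product of activity data and an emission factor. The distributions, parameter values, and units below are illustrative assumptions for the sketch, not figures from the inventory:

```python
import random
import statistics

def mc_emission_uncertainty(activity_mu, activity_sd, ef_mu, ef_sd,
                            n=50_000, seed=42):
    """Propagate input uncertainty through E = activity * emission_factor
    by Monte Carlo sampling; returns the mean and a 95% interval.
    Inputs are modeled as normal distributions truncated at zero."""
    rng = random.Random(seed)
    samples = sorted(
        max(rng.gauss(activity_mu, activity_sd), 0.0) *
        max(rng.gauss(ef_mu, ef_sd), 0.0)
        for _ in range(n)
    )
    return (statistics.mean(samples),
            samples[int(0.025 * n)],   # 2.5th percentile
            samples[int(0.975 * n)])   # 97.5th percentile

# Hypothetical category: 1000 kt of coal burned (sd 10%), emission factor
# 0.05 kg HCl per tonne (sd 20%); values assumed for illustration only.
mean, lo, hi = mc_emission_uncertainty(1000.0, 100.0, 0.05, 0.01)
print(f"{mean:.1f} [{lo:.1f}, {hi:.1f}]")
```

    Repeating this per category and summing the samples before taking percentiles yields the inventory-level uncertainty bounds.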

  8. Overweight and obesity on the island of Ireland: an estimation of costs

    PubMed Central

    Dee, Anne; Callinan, Aoife; Doherty, Edel; O'Neill, Ciaran; McVeigh, Treasa; Sweeney, Mary Rose; Staines, Anthony; Kearns, Karen; Fitzgerald, Sarah; Sharp, Linda; Kee, Frank; Hughes, John; Balanda, Kevin; Perry, Ivan J

    2015-01-01

    Objectives The increasing prevalence of overweight and obesity worldwide continues to compromise population health and creates a wider societal cost in terms of productivity loss and premature mortality. Despite extensive international literature on the cost of overweight and obesity, findings are inconsistent between Europe and the USA, and particularly within Europe. Studies vary on issues of focus, specific costs and methods. This study aims to estimate the healthcare and productivity costs of overweight and obesity for the island of Ireland in 2009, using both top-down and bottom-up approaches. Methods Costs were estimated across four categories: healthcare utilisation, drug costs, work absenteeism and premature mortality. Healthcare costs were estimated using Population Attributable Fractions (PAFs). PAFs were applied to national cost data for hospital care and drug prescribing. PAFs were also applied to social welfare and national mortality data to estimate productivity costs due to absenteeism and premature mortality. Results The healthcare costs of overweight and obesity in 2009 were estimated at €437 million for the Republic of Ireland (ROI) and €127.41 million for NI. Productivity loss due to overweight and obesity was up to €865 million for ROI and €362 million for NI. The main drivers of healthcare costs are cardiovascular disease, type II diabetes, colon cancer, stroke and gallbladder disease. In terms of absenteeism, low back pain is the main driver in both jurisdictions, and for productivity loss due to premature mortality the primary driver of cost is coronary heart disease. Conclusions The costs are substantial, and urgent public health action is required in Ireland to address the problem of increasing prevalence of overweight and obesity, which if left unchecked will lead to unsustainable cost escalation within the health service and unacceptable societal costs. PMID:25776042
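    The PAF-based costing described in the Methods can be sketched with Levin's formula, PAF = p(RR - 1) / (1 + p(RR - 1)), applied to a total cost. The prevalence, relative risk, and cost figures below are purely illustrative, not the study's estimates:

```python
def population_attributable_fraction(prevalence, relative_risk):
    """Levin's formula: PAF = p(RR - 1) / (1 + p(RR - 1)), the share of
    disease burden attributable to the exposure in the population."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

def attributable_cost(total_cost, prevalence, relative_risk):
    """Share of a total disease cost attributable to the exposure."""
    return total_cost * population_attributable_fraction(prevalence, relative_risk)

# Hypothetical figures (not the study's): 25% of adults exposed, RR = 2.0
# for the disease, and a total disease cost of 100 million euro.
paf = population_attributable_fraction(0.25, 2.0)   # 0.2
cost = attributable_cost(100e6, 0.25, 2.0)          # about 20 million euro
print(paf, cost)
```

    In the study, such fractions are computed per disease (cardiovascular disease, type II diabetes, etc.) and applied to national hospital, drug, and absenteeism cost data.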

  9. An Adaptive Cross-Architecture Combination Method for Graph Traversal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    You, Yang; Song, Shuaiwen; Kerbyson, Darren J.

    2014-06-18

    Breadth-First Search (BFS) is widely used in many real-world applications including computational biology, social networks, and electronic design automation. The combination method, using both top-down and bottom-up techniques, is the most effective BFS approach. However, current combination methods rely on trial-and-error and exhaustive search to locate the optimal switching point, which may cause significant runtime overhead. To solve this problem, we design an adaptive method based on regression analysis to predict an optimal switching point for the combination method at runtime within less than 0.1% of the BFS execution time.
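    The combination (direction-optimizing) BFS idea above can be sketched as follows. The fixed frontier-size threshold here is a stand-in for the paper's regression-predicted switching point, which their adaptive method learns at runtime:

```python
def hybrid_bfs(adj, source, switch_fraction=0.05):
    """Direction-optimizing BFS sketch: run top-down steps while the
    frontier is small, and switch to bottom-up once the frontier exceeds
    a fixed fraction of the vertex count (hypothetical threshold)."""
    n = len(adj)
    dist = [-1] * n
    dist[source] = 0
    frontier, level = [source], 0
    while frontier:
        next_frontier = []
        if len(frontier) < switch_fraction * n:
            # Top-down: scan outgoing edges of the (small) frontier.
            for u in frontier:
                for v in adj[u]:
                    if dist[v] == -1:
                        dist[v] = level + 1
                        next_frontier.append(v)
        else:
            # Bottom-up: each unvisited vertex looks for a frontier parent
            # (adjacency treated as undirected, so parents are neighbors).
            in_frontier = set(frontier)
            for v in range(n):
                if dist[v] == -1:
                    for u in adj[v]:
                        if u in in_frontier:
                            dist[v] = level + 1
                            next_frontier.append(v)
                            break
        frontier = next_frontier
        level += 1
    return dist

# 4-cycle: edges 0-1, 0-2, 1-3, 2-3
adj = [[1, 2], [0, 3], [0, 3], [1, 2]]
print(hybrid_bfs(adj, 0))  # BFS distances from vertex 0
```

    Bottom-up steps win on large frontiers because each unvisited vertex can stop at its first frontier parent; the hard part, which the paper addresses with regression, is predicting where the crossover lies without trial-and-error.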

  10. Mechanisms of wave‐driven water level variability on reef‐fringed coastlines

    USGS Publications Warehouse

    Buckley, Mark L.; Lowe, Ryan J.; Hansen, Jeff E; van Dongeren, Ap R.; Storlazzi, Curt

    2018-01-01

    Wave‐driven water level variability (and runup at the shoreline) is a significant cause of coastal flooding induced by storms. Wave runup is challenging to predict, particularly along tropical coral reef‐fringed coastlines due to the steep bathymetric profiles and large bottom roughness generated by reef organisms, which can violate assumptions in conventional models applied to open sandy coastlines. To investigate the mechanisms of wave‐driven water level variability on a reef‐fringed coastline, we performed a set of laboratory flume experiments on an along‐shore uniform bathymetric profile with and without bottom roughness. Wave setup and waves at frequencies lower than the incident sea‐swell forcing (infragravity waves) were found to be the dominant components of runup. These infragravity waves were positively correlated with offshore wave groups, signifying they were generated in the surf zone by the oscillation of the breakpoint. On the reef flat and at the shoreline, the low‐frequency waves formed a standing wave pattern with energy concentrated at the natural frequencies of the reef flat, indicating resonant amplification. Roughness elements used in the flume to mimic large reef bottom roughness reduced low‐frequency motions on the reef flat and reduced wave runup by 30% on average, compared to the runs over a smooth bed. These results provide insight into sea‐swell and infragravity wave transformation and wave setup dynamics on steep‐sloped coastlines, and the effect that future losses of reef bottom roughness may have on coastal flooding along reef‐fringed coasts.

  11. Improving Heat Transfer at the Bottom of Vials for Consistent Freeze Drying with Unidirectional Structured Ice.

    PubMed

    Rosa, Mónica; Tiago, João M; Singh, Satish K; Geraldes, Vítor; Rodrigues, Miguel A

    2016-10-01

    The quality of lyophilized products depends on the ice structure formed during the freezing step. Herein, we evaluate the importance of the air gap at the bottom of lyophilization vials for consistent nucleation, ice structure, and cake appearance. The bottom of lyophilization vials was modified by attaching a rectified aluminum disc with an adhesive material. Freezing was studied for normal and converted vials, with different volumes of solution, varying initial solution temperature (from 5°C to 20°C) and shelf temperature (from -20°C to -40°C). The impact of the air gap on the overall heat transfer was interpreted with the assistance of a computational fluid dynamics model. Converted vials caused nucleation at the bottom and decreased the nucleation time by up to one order of magnitude. The formation of ice crystals unidirectionally structured from bottom to top led to a honeycomb-structured cake after lyophilization of a solution with 4% mannitol. The primary drying time was reduced by approximately 35%. Converted vials that were frozen radially instead of bottom-up showed similar improvements compared with normal vials but very poor cake quality. Overall, the curvature of the bottom of glass vials presents a considerable threat to consistency by delaying nucleation and causing radial ice growth. Rectifying the vial bottom with an adhesive material proved to be a relatively simple alternative to overcome this inconsistency.

  12. Dissociable effects of top-down and bottom-up attention during episodic encoding

    PubMed Central

    Uncapher, Melina R.; Hutchinson, J. Benjamin; Wagner, Anthony D.

    2011-01-01

    It is well established that the formation of memories for life’s experiences—episodic memory—is influenced by how we attend to those experiences, yet the neural mechanisms by which attention shapes episodic encoding are still unclear. We investigated how top-down and bottom-up attention contribute to memory encoding of visual objects in humans by manipulating both types of attention during functional magnetic resonance imaging (fMRI) of episodic memory formation. We show that dorsal parietal cortex—specifically, intraparietal sulcus (IPS)—was engaged during top-down attention and was also recruited during the successful formation of episodic memories. By contrast, bottom-up attention engaged ventral parietal cortex—specifically, temporoparietal junction (TPJ)—and was also more active during encoding failure. Functional connectivity analyses revealed further dissociations in how top-down and bottom-up attention influenced encoding: while both IPS and TPJ influenced activity in perceptual cortices thought to represent the information being encoded (fusiform/lateral occipital cortex), they each exerted opposite effects on memory encoding. Specifically, during a preparatory period preceding stimulus presentation, a stronger drive from IPS was associated with a higher likelihood that the subsequently attended stimulus would be encoded. By contrast, during stimulus processing, stronger connectivity with TPJ was associated with a lower likelihood the stimulus would be successfully encoded. These findings suggest that during encoding of visual objects into episodic memory, top-down and bottom-up attention can have opposite influences on perceptual areas that subserve visual object representation, suggesting that one manner in which attention modulates memory is by altering the perceptual processing of to-be-encoded stimuli. PMID:21880922

  13. Selective Activation of the Deep Layers of the Human Primary Visual Cortex by Top-Down Feedback.

    PubMed

    Kok, Peter; Bains, Lauren J; van Mourik, Tim; Norris, David G; de Lange, Floris P

    2016-02-08

    In addition to bottom-up input, the visual cortex receives large amounts of feedback from other cortical areas [1-3]. One compelling example of feedback activation of early visual neurons in the absence of bottom-up input occurs during the famous Kanizsa illusion, where a triangular shape is perceived, even in regions of the image where there is no bottom-up visual evidence for it. This illusion increases the firing activity of neurons in the primary visual cortex with a receptive field on the illusory contour [4]. Feedback signals are largely segregated from feedforward signals within each cortical area, with feedforward signals arriving in the middle layer, while top-down feedback avoids the middle layers and predominantly targets deep and superficial layers [1, 2, 5, 6]. Therefore, the feedback-mediated activity increase in V1 during the perception of illusory shapes should lead to a specific laminar activity profile that is distinct from the activity elicited by bottom-up stimulation. Here, we used fMRI at high field (7 T) to empirically test this hypothesis, by probing the cortical response to illusory figures in human V1 at different cortical depths [7-14]. We found that, whereas bottom-up stimulation activated all cortical layers, feedback activity induced by illusory figures led to a selective activation of the deep layers of V1. These results demonstrate the potential for non-invasive recordings of neural activity with laminar specificity in humans and elucidate the role of top-down signals during perceptual processing. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. A computer vision approach for solar radiation nowcasting using MSG images

    NASA Astrophysics Data System (ADS)

    Álvarez, L.; Castaño Moraga, C. A.; Martín, J.

    2010-09-01

Cloud structures and haze are the two main atmospheric phenomena that reduce the performance of solar power plants, since they attenuate the solar energy reaching the terrestrial surface. Accurate forecasting of solar radiation is therefore a challenging research area that involves both precise localization of cloud structures and haze and estimation of the attenuation they introduce. Our work presents a novel approach for nowcasting services based on image processing techniques applied to MSG satellite images provided by the EUMETSAT Rapid Scan Service (RSS). These data are a valuable source of information for our purposes, since a new snapshot of the atmospheric state arrives every 5 minutes, in near real time. Forecasting solar radiation, however, requires an additional step: we synthesize forecast MSG images from past images, applying computer vision techniques adapted to fluid flows in order to evolve the atmospheric state. First, we classify cloud structures into two layers, corresponding to top and bottom clouds, the latter including haze. This two-level classification reflects the dominant climate conditions in our region of interest, the Canary Islands archipelago, regulated by the Gulf Stream and the Trade Winds. The vertical structure of the Trade Winds consists of two layers: a bottom layer, which is cool and humid, and a top layer, which is warm and dry. Between these two layers a thermal inversion appears that prevents bottom clouds from rising and naturally divides the clouds into the two layers. Top clouds can be obtained directly from satellite images by means of a segmentation algorithm on height histograms. However, bottom clouds are usually occluded by the former, so an inpainting algorithm is used to recover the occluded areas of bottom clouds. 
For each layer, cloud motion is estimated with a correlation-based optic flow algorithm that yields a vector field describing the displacement in that layer between two consecutive images in the sequence. Since the EUMETSAT RSS provides images every 5 minutes (Δt), the cloud motion vector field between the images at times t0 and (t0 - Δt) is very similar to that between (t0 - Δt) and (t0 - 2Δt). Under this assumption, we infer the motion vector field for the next image in order to build a synthetic version of the image at time (t0 + Δt). The computation of this future motion vector field takes terrain orography into account to produce more realistic forecasts. We are also working on integrating information from NWP outputs in order to introduce other atmospheric phenomena. Applying this algorithm repeatedly, we are able to produce short-term forecasts of up to 6 hours with encouraging performance. To validate the results, we use both comparison of the synthetically generated images with the corresponding actual images and direct solar radiation measurements from the set of meteorological stations located at several points of the Canarian archipelago.
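The persistence-based extrapolation described above (reuse the most recent motion vector field to advect the current image one Δt forward, then iterate) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the helper names `warp_forward` and `nowcast`, the nearest-neighbour remapping, and the reuse of a single fixed flow field are all assumptions of the sketch.

```python
import numpy as np

def warp_forward(image, flow):
    """Advect an image one time step along a per-pixel motion field.

    image: 2D array (H, W); flow: (H, W, 2) array of (dy, dx) pixel
    displacements per time step. Nearest-neighbour scatter; a real
    nowcasting system would use sub-pixel interpolation instead.
    """
    h, w = image.shape
    out = np.zeros_like(image)
    ys, xs = np.mgrid[0:h, 0:w]
    ty = np.clip(np.round(ys + flow[..., 0]).astype(int), 0, h - 1)
    tx = np.clip(np.round(xs + flow[..., 1]).astype(int), 0, w - 1)
    out[ty, tx] = image[ys, xs]
    return out

def nowcast(image_t0, flow_prev, steps):
    """Persistence assumption: the motion field estimated between the two
    most recent images is reused unchanged for every future step."""
    frames = []
    frame = image_t0
    for _ in range(steps):
        frame = warp_forward(frame, flow_prev)
        frames.append(frame)
    return frames
```

In the paper's setting each layer (top clouds, and bottom clouds with haze) would be advected separately with its own flow field, with orography corrections applied to the extrapolated field.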

  15. Bayesian analogy with relational transformations.

    PubMed

    Lu, Hongjing; Chen, Dawn; Holyoak, Keith J

    2012-07-01

How can humans acquire relational representations that enable analogical inference and other forms of high-level reasoning? Using comparative relations as a model domain, we explore the possibility that bottom-up learning mechanisms applied to objects coded as feature vectors can yield representations of relations sufficient to solve analogy problems. We introduce Bayesian analogy with relational transformations (BART) and apply the model to the task of learning first-order comparative relations (e.g., larger, smaller, fiercer, meeker) from a set of animal pairs. Inputs are coded by vectors of continuous-valued features, based on human magnitude ratings, normed feature ratings (De Deyne et al., 2008), or outputs of the topics model (Griffiths, Steyvers, & Tenenbaum, 2007). Bootstrapping from empirical priors, the model is able to induce first-order relations represented as probabilistic weight distributions, even when given positive examples only. These learned representations allow classification of novel instantiations of the relations and yield a symbolic distance effect of the sort obtained with both humans and other primates. BART then transforms its learned weight distributions by importance-guided mapping, thereby placing distinct dimensions into correspondence. These transformed representations allow BART to reliably solve 4-term analogies (e.g., larger:smaller::fiercer:meeker), a type of reasoning that is arguably specific to humans. Our results provide a proof of concept that structured analogies can be solved with representations induced from unstructured feature vectors by mechanisms that operate in a largely bottom-up fashion. We discuss potential implications for algorithmic and neural models of relational thinking, as well as for the evolution of abstract thought. Copyright 2012 APA, all rights reserved.
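As rough intuition for how a comparative relation might be induced from feature vectors and then reused for a 4-term analogy, here is a toy sketch. It is emphatically not BART: BART learns probabilistic weight distributions bootstrapped from empirical priors, whereas this sketch reduces each relation to a mean difference vector; every function name and the dimension-copying mapping scheme are hypothetical.

```python
import numpy as np

def learn_relation(pairs):
    """Toy stand-in for relation learning: summarize a comparative
    relation (e.g. 'larger') by the mean feature-difference vector of
    positive example pairs (a, b) for which the relation holds."""
    return np.mean([np.asarray(a, float) - np.asarray(b, float)
                    for a, b in pairs], axis=0)

def holds(w, a, b):
    """Judge a novel pair by projecting its feature difference onto the
    learned weights: positive projection means the relation holds."""
    return float(np.dot(w, np.asarray(a, float) - np.asarray(b, float))) > 0.0

def solve_analogy(w_source, source_dim, target_dim, c, d, n_dims):
    """Sketch of the mapping step behind analogies such as
    larger:smaller::fiercer:meeker: copy the weight learned on the
    source dimension (size) onto the target dimension (fierceness),
    then test whether the mapped relation holds for pair (c, d)."""
    w_target = np.zeros(n_dims)
    w_target[target_dim] = w_source[source_dim]
    return holds(w_target, c, d)
```

With features [size, fierceness], learning "larger" from pairs such as ([3, 0], [1, 0]) gives a weight concentrated on the size dimension; copying it onto the fierceness dimension then accepts "fiercer" pairs.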

  16. The Roles of Feature-Specific Task Set and Bottom-Up Salience in Attentional Capture: An ERP Study

    ERIC Educational Resources Information Center

    Eimer, Martin; Kiss, Monika; Press, Clare; Sauter, Disa

    2009-01-01

    We investigated the roles of top-down task set and bottom-up stimulus salience for feature-specific attentional capture. Spatially nonpredictive cues preceded search arrays that included a color-defined target. For target-color singleton cues, behavioral spatial cueing effects were accompanied by cue-induced N2pc components, indicative of…

  17. Bottom-up processes influence the demography and life-cycle phenology of Hawaiian bird communities

    Treesearch

    Jared D. Wolfe; C. John Ralph; Andrew Wiegardt

    2017-01-01

    Changes in climate can indirectly regulate populations at higher trophic levels by influencing the availability of food resources in the lower reaches of the food web. As such, species that rely on fruit and nectar food resources may be particularly sensitive to these bottom-up perturbations due to the strength of their trophic linkages with climatically-...

  18. The Role of Bottom-Up Processing in Perceptual Categorization by 3- to 4-Month-Old Infants: Simulations and Data

    ERIC Educational Resources Information Center

    French, Robert M.; Mareschal, Denis; Mermillod, Martial; Quinn, Paul C.

    2004-01-01

    Disentangling bottom-up and top-down processing in adult category learning is notoriously difficult. Studying category learning in infancy provides a simple way of exploring category learning while minimizing the contribution of top-down information. Three- to 4-month-old infants presented with cat or dog images will form a perceptual category…

  19. Second Language Listening Instruction: Comparing a Strategies-Based Approach with an Interactive, Strategies/Bottom-Up Skills Approach

    ERIC Educational Resources Information Center

    Yeldham, Michael

    2016-01-01

    This quasi-experimental study compared a strategies approach to second language listening instruction with an interactive approach, one combining a roughly equal balance of strategies and bottom-up skills. The participants were lower-intermediate-level Taiwanese university EFL learners, who were taught for 22 hours over one and a half semesters.…

  20. On the Temporal Relation of Top-Down and Bottom-Up Mechanisms during Guidance of Attention

    ERIC Educational Resources Information Center

    Wykowska, Agnieszka; Schubo, Anna

    2010-01-01

    Two mechanisms are said to be responsible for guiding focal attention in visual selection: bottom-up, saliency-driven capture and top-down control. These mechanisms were examined with a paradigm that combined a visual search task with postdisplay probe detection. Two SOAs between the search display and probe onsets were introduced to investigate…

  1. Citizenship Policy from the Bottom-Up: The Linguistic and Semiotic Landscape of a Naturalization Field Office

    ERIC Educational Resources Information Center

    Loring, Ariel

    2015-01-01

    This article follows a bottom-up approach to language policy (Ramanathan, 2005; Wodak, 2006) in an analysis of citizenship in policy and practice. It compares representations of citizenship in and around a regional branch of the United States Citizenship and Immigration Services (USCIS), with a focus on citizenship swearing-in ceremonies for…

  2. Micelle-templated, poly(lactic-co-glycolic acid) nanoparticles for hydrophobic drug delivery

    PubMed Central

    Nabar, Gauri M; Mahajan, Kalpesh D; Calhoun, Mark A; Duong, Anthony D; Souva, Matthew S; Xu, Jihong; Czeisler, Catherine; Puduvalli, Vinay K; Otero, José Javier; Wyslouzil, Barbara E; Winter, Jessica O

    2018-01-01

Purpose: Poly(lactic-co-glycolic acid) (PLGA) is widely used for drug delivery because of its biocompatibility, ability to solubilize a wide variety of drugs, and tunable degradation. However, achieving sub-100 nm nanoparticles (NPs), as might be desired for delivery via the enhanced permeability and retention effect, is extremely difficult via typical top-down emulsion approaches. Methods: Here, we present a bottom-up synthesis method yielding PLGA/block copolymer hybrids (ie, “PolyDots”), consisting of hydrophobic PLGA chains entrapped within self-assembling poly(styrene-b-ethylene oxide) (PS-b-PEO) micelles. Results: PolyDots exhibit average diameters <50 nm and lower polydispersity than conventional PLGA NPs. Drug encapsulation efficiencies of PolyDots match conventional PLGA NPs (ie, ~30%) and are greater than those obtained from PS-b-PEO micelles (ie, ~7%). Increasing the PLGA:PS-b-PEO weight ratio alters the drug release mechanism from chain relaxation to erosion controlled. PolyDots are taken up by model glioma cells via endocytotic mechanisms within 24 hours, providing a potential means for delivery to cytoplasm. PolyDots can be lyophilized with minimal change in morphology and encapsulant functionality, and can be produced at scale using electrospray. Conclusion: Encapsulation of PLGA within micelles provides a bottom-up route for the synthesis of sub-100 nm PLGA-based nanocarriers with enhanced stability and drug-loading capacity, and tunable drug release, suitable for potential clinical applications. PMID:29391794

  3. Bottom-up and top-down fabrication of nanowire-based electronic devices: In situ doping of vapor liquid solid grown silicon nanowires and etch-dependent leakage current in InGaAs tunnel junctions

    NASA Astrophysics Data System (ADS)

    Kuo, Meng-Wei

Semiconductor nanowires are important components in future nanoelectronic and optoelectronic device applications. These nanowires can be fabricated using either bottom-up or top-down methods. While bottom-up techniques can achieve higher aspect ratios at reduced dimensions without surface and sub-surface damage, uniform doping distributions with abrupt junction profiles are easier to achieve with top-down methods. In this dissertation, nanowires fabricated by both methods were systematically investigated to understand: (1) the in situ incorporation of boron (B) dopants in Si nanowires grown by the bottom-up vapor-liquid-solid (VLS) technique, and (2) the impact of plasma-induced etch damage on InGaAs p+-i-n+ nanowire junctions for tunnel field-effect transistor (TFET) applications. In Chapters 2 and 3, the in situ incorporation of B in Si nanowires grown using silane (SiH4) or silicon tetrachloride (SiCl4) as the Si precursor and trimethylboron (TMB) as the p-type dopant source is investigated by I-V measurements of individual nanowires. The results from measurements using a global-back-gated test structure reveal nonuniform B doping profiles in nanowires grown from SiH4, which is due to simultaneous incorporation of B through the nanowire surface and the catalyst during VLS growth. In contrast, a uniform B doping profile in both the axial and radial directions is achieved for TMB-doped Si nanowires grown using SiCl4 at high substrate temperatures. In Chapter 4, the I-V characteristics of wet- and dry-etched InGaAs p+-i-n+ junctions with different mesa geometries, orientations, and perimeter-to-area ratios are compared to evaluate the impact of the dry etch process on the junction leakage current properties. Different post-dry etch treatments, including wet etching and thermal annealing, are performed and the effectiveness of each is assessed by temperature-dependent I-V measurements. 
As compared to wet-etched control devices, dry-etched junctions have a significantly higher leakage current and a current kink in the reverse bias regime, which is likely due to additional trap states created by plasma-induced damage during the Cl2/Ar/H2 mesa isolation step. These states extend more than 60 nm from the mesa surface and can only be partially passivated after a thermal anneal at 350°C for 20 minutes. The evolution of the electrical properties with post-dry etch treatments indicates that the shallow and deep-level trap states resulting from ion-induced point defects, arsenic vacancies, and hydrogen-dopant complexes are the primary cause of degradation in the electrical properties of the dry-etched junctions.

  4. Wide Differences in the Estimation of Cost in Endovenous Laser Therapy for Varicose Veins

    NASA Astrophysics Data System (ADS)

    Lattimer, Christopher R.; Piper, Stephen; Kalodiki, Evi; Trueman, Paul; Geroulakos, George

    2011-08-01

PURPOSE: To investigate differences in the cost of endovenous laser therapy (EVLT) using a top-down approach derived from the Annual Report versus a clinically orientated, bottom-up approach at a single hospital. METHODS: Information was obtained from: the day-case activity Service Line Report (SLR) income statement for general surgery, comparative data from the National Audit Commission, reference costs from the hospital finance department on 69 patients, and calculations on individual treatment times and session slots (2 EVLTs per 3.5 hr session) on 37 consecutive patients. Duration of treatment, consumables (over £3), and staff pay were also recorded. Overheads were estimated at 15%, and adjustments were made based on location and length of stay. RESULTS: Using a top-down approach with SLR data, the total cost of EVLT was estimated at £963.78 per treatment after adjustments for services and consumables. This compares with £1,073.34 using national data. The hospital reference costs per treatment ranged from £767.56 overall by local procedure code (HRG-QZ10B) to £2,353.79 with individual samples. In the bottom-up approach, the median duration of EVLT was 86 min (95% CI: 82-95, IQR: 26). With timed treatments, the median cost per individual treatment was £597.68 (95% CI: 587.65-621.25, IQR: 67.87), compared to £647.28 per session slot. CONCLUSION: Cost estimations of EVLT demonstrate an up to 4-fold difference. Lack of clinical engagement in the top-down approach leads to overestimations. Overheads are underestimated with a bottom-up approach. This variability should be accounted for when comparing treatments or interpreting cost-effectiveness analyses.
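The contrast between the two costing approaches can be made concrete with a small sketch. The helper functions and the numbers in the usage lines are illustrative assumptions, not the paper's data; only the 15% overhead rate is taken from the abstract.

```python
def top_down_unit_cost(total_department_cost, procedure_share, n_treatments):
    """Top-down: start from an aggregate income statement (e.g. a Service
    Line Report), apportion a share of it to the procedure, and divide by
    activity volume. No direct clinical observation is involved."""
    return total_department_cost * procedure_share / n_treatments

def bottom_up_unit_cost(staff_minutes, staff_cost_per_minute,
                        consumables_cost, overhead_rate=0.15):
    """Bottom-up: price the resources actually observed per treatment
    (staff time, consumables), then add a percentage for overheads (the
    study estimated overheads at 15%)."""
    direct = staff_minutes * staff_cost_per_minute + consumables_cost
    return direct * (1.0 + overhead_rate)

# Illustrative (hypothetical) inputs, not the paper's figures:
td = top_down_unit_cost(200_000.0, 0.25, 50)   # apportioned aggregate cost
bu = bottom_up_unit_cost(86, 2.0, 300.0)       # timed treatment, priced inputs
```

The gap between `td` and `bu` mirrors the paper's finding: the top-down figure absorbs whatever the aggregate statement contains, while the bottom-up figure only captures the resource use that was explicitly measured.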

  5. Top-Down-Assisted Bottom-Up Method for Homologous Protein Sequencing: Hemoglobin from 33 Bird Species

    NASA Astrophysics Data System (ADS)

    Song, Yang; Laskay, Ünige A.; Vilcins, Inger-Marie E.; Barbour, Alan G.; Wysocki, Vicki H.

    2015-11-01

    Ticks are vectors for disease transmission because they are indiscriminant in their feeding on multiple vertebrate hosts, transmitting pathogens between their hosts. Identifying the hosts on which ticks have fed is important for disease prevention and intervention. We have previously shown that hemoglobin (Hb) remnants from a host on which a tick fed can be used to reveal the host's identity. For the present research, blood was collected from 33 bird species that are common in the U.S. as hosts for ticks but that have unknown Hb sequences. A top-down-assisted bottom-up mass spectrometry approach with a customized searching database, based on variability in known bird hemoglobin sequences, has been devised to facilitate fast and complete sequencing of hemoglobin from birds with unknown sequences. These hemoglobin sequences will be added to a hemoglobin database and used for tick host identification. The general approach has the potential to sequence any set of homologous proteins completely in a rapid manner.

  6. 2D FT-ICR MS of Calmodulin: A Top-Down and Bottom-Up Approach.

    PubMed

    Floris, Federico; van Agthoven, Maria; Chiron, Lionel; Soulby, Andrew J; Wootton, Christopher A; Lam, Yuko P Y; Barrow, Mark P; Delsuc, Marc-André; O'Connor, Peter B

    2016-09-01

Two-dimensional Fourier transform ion cyclotron resonance mass spectrometry (2D FT-ICR MS) allows data-independent fragmentation of all ions in a sample and correlation of fragment ions to their precursors through the modulation of precursor ion cyclotron radii prior to fragmentation. Previous results show that implementation of 2D FT-ICR MS with infrared multi-photon dissociation (IRMPD) and electron capture dissociation (ECD) has turned this method into a useful analytical tool. In this work, IRMPD tandem mass spectrometry of calmodulin (CaM) has been performed both in one-dimensional and two-dimensional FT-ICR MS using top-down and bottom-up approaches. 2D IRMPD FT-ICR MS is used to achieve extensive inter-residue bond cleavage and assignment for CaM, exploiting its unique features for fragment identification in an experiment that consumes less time and sample than sequential MS/MS experiments.

  7. Bottom-up photonic crystal cavities formed by patterned III-V nanopillars.

    PubMed

    Scofield, Adam C; Shapiro, Joshua N; Lin, Andrew; Williams, Alex D; Wong, Ping-Show; Liang, Baolai L; Huffaker, Diana L

    2011-06-08

    We report on the formation and optical properties of bottom-up photonic crystal (PC) cavities formed by III-V nanopillars (NPs) via catalyst-free selective-area metal-organic chemical vapor deposition on masked GaAs substrates. This method of NP synthesis allows for precise lithographic control of NP position and diameter enabling simultaneous formation of both the photonic band gap (PBG) region and active gain region. The PBG and cavity resonance are determined by independently tuning the NP radius r, pitch a, and height h in the respective masked areas. Near-infrared emission at 970 nm is achieved from axial GaAs/InGaAs heterostructures with in situ passivation by laterally grown InGaP shells. To achieve out-of-plane optical confinement, the PC cavities are embedded in polydimethylsiloxane (PDMS) and removed from the growth substrate. Spatially and spectrally resolved 77 K photoluminescence demonstrates a strong influence of the PBG resonance on device emission. Resonant peaks are observed in the emission spectra of PC cavities embedded in PDMS.

  8. Comparing top-down and bottom-up costing approaches for economic evaluation within social welfare.

    PubMed

    Olsson, Tina M

    2011-10-01

This study compares two approaches to the estimation of social welfare intervention costs, one "top-down" and the other "bottom-up", for a group of social welfare clients with severe problem behavior participating in a randomized trial. Intervention costs incurred over a two-year period were compared by intervention category (foster care placement, institutional placement, mentorship services, individual support services, and structured support services), estimation method (price, micro-costing, average cost), and treatment group (intervention, control). Analyses are based upon 2007 costs for 156 individuals receiving 404 interventions. Overall, both approaches were found to produce reliable estimates of intervention costs at the group level but not at the individual level. As the choice of approach can greatly affect the estimate of mean difference, adjustment based on estimation approach should be incorporated into sensitivity analyses. Analysts must take care in assessing the purpose and perspective of the analysis when choosing a costing approach for use within economic evaluation.

  9. Comparison of Collisional and Electron-Based Dissociation Modes for Middle-Down Analysis of Multiply Glycosylated Peptides

    NASA Astrophysics Data System (ADS)

    Khatri, Kshitij; Pu, Yi; Klein, Joshua A.; Wei, Juan; Costello, Catherine E.; Lin, Cheng; Zaia, Joseph

    2018-04-01

Analysis of singly glycosylated peptides has evolved to a point where large-scale LC-MS analyses can be performed at almost the same scale as proteomics experiments. While collisionally activated dissociation (CAD) remains the mainstay of bottom-up analyses, it performs poorly for the middle-down analysis of multiply glycosylated peptides. With improvements in instrumentation, electron-activated dissociation (ExD) modes are becoming increasingly prevalent for proteomics experiments and for the analysis of fragile modifications such as glycosylation. While these methods have been applied for glycopeptide analysis in isolated studies, an organized effort to compare their efficiencies, particularly for analysis of multiply glycosylated peptides (termed here middle-down glycoproteomics), has not been made. We therefore compared the performance of different ExD modes for middle-down glycopeptide analyses. We identified key features among the different dissociation modes and show that increased electron energy and supplemental activation provide the most useful data for middle-down glycopeptide analysis.

  10. Scaled CMOS Reliability and Considerations for Spacecraft Systems: Bottom-Up and Top-Down Perspective

    NASA Technical Reports Server (NTRS)

    White, Mark

    2012-01-01

New space missions will increasingly rely on more advanced technologies because of system requirements for higher performance, particularly in instruments and high-speed processing. Component-level reliability challenges with scaled CMOS in spacecraft systems are presented from a bottom-up perspective. Fundamental front-end and back-end processing reliability issues with more aggressively scaled parts are discussed. Effective thermal management from the system level down to the component level (top-down) is a key element in the overall design of reliable systems. Thermal management in space systems must consider a wide range of issues, including the thermal loading of many different components and the frequent temperature cycling of some systems. Both perspectives (top-down and bottom-up) play a large role in robust, reliable spacecraft system design.

  11. Top-down and bottom-up: Front to back. Comment on "Move me, astonish me... delight my eyes and brain: The Vienna Integrated Model of top-down and bottom-up processes in Art Perception (VIMAP) and corresponding affective, evaluative, and neurophysiological correlates" by Matthew Pelowski et al.

    NASA Astrophysics Data System (ADS)

    Nadal, Marcos; Skov, Martin

    2017-07-01

The model presented here [1] is the latest in an evolving series of psychological models aimed at explaining the experience of art, first proposed by Leder and colleagues [2]. The aim of this new version is to "explicitly connect early bottom-up, artwork-derived processing sequence and outputs to top-down, viewer-derived contribution to the processing sequence" [1, p. 5f & 6]. The "meeting" of these two processing sequences, the authors contend, is crucial to the understanding of people's responses to art [sections 3.6ff & 4], and is therefore the new model's principal motivation.

  12. Considering the normative, systemic and procedural dimensions in indicator-based sustainability assessments in agriculture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binder, Claudia R., E-mail: claudia.binder@geo.uzh.c; Institute for System Science, Innovation and Sustainability Research, University of Graz; Feola, Giuseppe

This paper develops a framework for evaluating sustainability assessment methods by separately analyzing their normative, systemic and procedural dimensions as suggested by Wiek and Binder [Wiek, A, Binder, C. Solution spaces for decision-making - a sustainability assessment tool for city-regions. Environ Impact Asses Rev 2005, 25: 589-608.]. The framework is then used to characterize indicator-based sustainability assessment methods in agriculture. For a long time, sustainability assessment in agriculture has focused mostly on environmental and technical issues, thus neglecting the economic and, above all, the social aspects of sustainability, the multi-functionality of agriculture and the applicability of the results. In response to these shortcomings, several integrative sustainability assessment methods have been developed for the agricultural sector. This paper reviews seven of these that represent the diversity of tools developed in this area. The reviewed assessment methods can be categorized into three types: (i) top-down farm assessment methods; (ii) top-down regional assessment methods with some stakeholder participation; (iii) bottom-up, integrated participatory or transdisciplinary methods with stakeholder participation throughout the process. The results readily show the trade-offs encountered when selecting an assessment method. A clear, standardized, top-down procedure potentially allows benchmarking and comparison of results across regions and sites. However, this comes at the cost of system specificity. As the top-down methods often have low stakeholder involvement, the application and implementation of the results might be difficult. Our analysis suggests that to include the aspects mentioned above in agricultural sustainability assessment, the bottom-up, integrated participatory or transdisciplinary methods are the most suitable ones.

  13. Split-Channel Ballistic Transport in an InSb Nanowire

    NASA Astrophysics Data System (ADS)

    Estrada Saldaña, Juan Carlos; Niquet, Yann-Michel; Cleuziou, Jean-Pierre; Lee, Eduardo J. H.; Car, Diana; Plissard, Sébastien R.; Bakkers, Erik P. A. M.; De Franceschi, Silvano

    2018-04-01

We report an experimental study of one-dimensional (1D) electronic transport in an InSb semiconducting nanowire. Three bottom gates are used to locally deplete the nanowire, creating a ballistic quantum point contact with only a few conducting channels. In a magnetic field, the Zeeman splitting of the corresponding 1D subbands is revealed by the emergence of conductance plateaus at multiples of e^2/h, yet we find a quantized conductance pattern largely dependent on the configuration of voltages applied to the bottom gates. In particular, we can make the first plateau disappear, leaving a first conductance step of 2e^2/h, which is indicative of a remarkable two-fold subband degeneracy that can persist up to several tesla. For certain gate voltage settings, we also observe the presence of discrete resonant states producing conductance features that can resemble those expected from the opening of a helical gap in the subband structure. We explain our experimental findings through the formation of two spatially separated 1D conduction channels.

  14. An AC electroosmotic micropump for circular chromatographic applications.

    PubMed

    Debesset, S; Hayden, C J; Dalton, C; Eijkel, J C T; Manz, A

    2004-08-01

Flow rates of up to 50 μm/s have been successfully achieved in a closed-loop channel using an AC electroosmotic pump. The AC electroosmotic pump is made of an interdigitated array of unequal width electrodes located at the bottom of a channel, with an AC voltage applied between the small and the large electrodes. The flow rate was found to increase linearly with the applied voltage and to decrease linearly with the applied frequency. The pump is expected to be suitable for circular chromatography for the following reasons: the driving forces are distributed over the channel length and the pumping direction is set by the direction of the interdigitated electrodes. Pumping in a closed-loop channel can be achieved by arranging the electrode pattern in a circle. In addition, the inherent working principle of AC electroosmotic pumping enables the independent optimisation of the channel height or the flow velocity.

  15. When does fishing lead to more fish? Community consequences of bottom trawl fisheries in demersal food webs

    PubMed Central

    van Denderen, P. Daniel; van Kooten, Tobias; Rijnsdorp, Adriaan D.

    2013-01-01

    Bottom trawls are a globally used fishing gear that physically disturb the seabed and kill non-target organisms, including those that are food for the targeted fish species. There are indications that ensuing changes to the benthic invertebrate community may increase the availability of food and promote growth and even fisheries yield of target fish species. If and how this occurs is the subject of ongoing debate, with evidence both in favour and against. We model the effects of trawling on a simple ecosystem of benthivorous fish and two food populations (benthos), susceptible and resistant to trawling. We show that the ecosystem response to trawling depends on whether the abundance of benthos is top-down or bottom-up controlled. Fishing may result in higher fish abundance, higher (maximum sustainable) yield and increased persistence of fish when the benthos which is the best-quality fish food is also more resistant to trawling. These positive effects occur in bottom-up controlled systems and systems with limited impact of fish feeding on benthos, resembling bottom-up control. Fishing leads to lower yields and fish persistence in all configurations where susceptible benthos are more profitable prey. Our results highlight the importance of mechanistic ecosystem knowledge as a requirement for successful management. PMID:24004941
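The kind of model described, a benthivorous fish feeding on trawl-susceptible and trawl-resistant benthos with trawling imposing mortality on both the fish and the susceptible benthos, can be sketched minimally as a consumer-resource system. This is a hypothetical illustration in the spirit of the abstract, not the authors' model: all equations, parameter values, and function names are assumptions of the sketch.

```python
def step(F, Bs, Br, trawl, dt=0.01):
    """One Euler step of a minimal, hypothetical consumer-resource sketch:
    fish F feed on trawl-susceptible benthos Bs and trawl-resistant
    benthos Br. Parameter values are illustrative, not a calibration."""
    r, K = 1.0, 10.0        # benthos growth rate and shared carrying capacity
    a_s, a_r = 0.3, 0.2     # attack rates (susceptible benthos more profitable here)
    eps, m = 0.5, 0.2       # conversion efficiency and fish background mortality
    q_s = 0.5               # scaling of trawl mortality on susceptible benthos
    dBs = r * Bs * (1 - (Bs + Br) / K) - a_s * Bs * F - q_s * trawl * Bs
    dBr = r * Br * (1 - (Bs + Br) / K) - a_r * Br * F
    dF = eps * (a_s * Bs + a_r * Br) * F - m * F - trawl * F
    return F + dt * dF, Bs + dt * dBs, Br + dt * dBr

def run(trawl, steps=2000, state=(1.0, 4.0, 4.0)):
    """Iterate the sketch for a fixed trawling intensity."""
    F, Bs, Br = state
    for _ in range(steps):
        F, Bs, Br = step(F, Bs, Br, trawl)
    return F, Bs, Br
```

Comparing `run(0.0)` with `run(t)` for various trawling intensities `t` is the kind of experiment the study performs; whether fishing raises or lowers fish abundance then hinges on which benthos is more profitable and on whether the benthos is top-down or bottom-up controlled.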

  16. Top-down and bottom-up modeling in system pharmacology to understand clinical efficacy: An example with NRTIs of HIV-1.

    PubMed

    Duwal, Sulav; von Kleist, Max

    2016-10-30

    A major aim of Systems Pharmacology is to understand the clinically relevant mechanisms of action (MOA) of drugs and to use this knowledge to optimize therapy. Achieving this requires knowing how insights testable in vitro translate into clinical efficacy. Mathematical modeling and data integration are essential components for reaching this goal. Two modeling philosophies are prevalent, neither of which is sufficient in isolation: In a 'top-down' approach, a minimal pharmacokinetic-pharmacodynamic (PK-PD) model is derived from, and fitted to, available clinical data. Such a model may lack mechanistic interpretability and may only be predictive for scenarios already covered by the data used to derive it. A 'bottom-up' approach builds on mechanistic insights derived from in vitro/ex vivo experiments, which can be conducted under controlled conditions but may not be fully representative of the in vivo/clinical situation. In this work, we employ both approaches side-by-side to predict the clinical potency (IC50 values) of the nucleoside reverse transcriptase inhibitors (NRTIs) lamivudine, emtricitabine and tenofovir. In the 'top-down' approach, this requires establishing the dynamic link between the intracellularly active NRTI-triphosphates (which exert the effect) and the plasma prodrug pharmacokinetics, and subsequently linking this composite PK model to viral kinetics. The 'bottom-up' approach assesses inhibition of reverse transcriptase-mediated viral DNA polymerization by the intracellular, active NRTI-triphosphates, which has to be placed in the context of target cell infection. By using entirely disparate sets of data to derive and parameterize the respective models, our approach serves as a means to assess the clinical relevance of the 'bottom-up' approach. We obtain very good qualitative and quantitative agreement between 'top-down' and 'bottom-up' predicted IC50 values, arguing for the validity of the 'bottom-up' approach. We noted, however, that the 'top-down' approach is strongly dependent on the sparse and noisy intracellular pharmacokinetic data. All in all, our work provides confidence that in vitro parameters can be translated into measures of clinical efficacy using the 'bottom-up' approach. This may make it possible to infer the potency of various NRTIs in inhibiting e.g. mutant viruses, to distinguish sources of interaction in NRTI combinations and to assess the efficacy of different NRTIs for repurposing, e.g. for pre-exposure prophylaxis. Copyright © 2016 Elsevier B.V. All rights reserved.
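    The potency metric compared in the study, the IC50, is the drug concentration at which the targeted process (here, reverse transcription) is inhibited by 50%. A minimal sketch of the idea, using a generic Emax/Hill dose-response rather than the paper's PK-PD or enzymatic models:

```python
import numpy as np

def hill_inhibition(conc, ic50, h=1.0):
    """Fraction of target activity inhibited at drug concentration `conc`
    (generic Emax/Hill model; parameters are illustrative, not the paper's)."""
    return conc**h / (ic50**h + conc**h)

def estimate_ic50(conc, inhibition):
    """Recover IC50 as the concentration giving 50% inhibition,
    by linear interpolation on a log-concentration axis."""
    return np.exp(np.interp(0.5, inhibition, np.log(conc)))

conc = np.logspace(-2, 2, 200)          # concentration grid (arbitrary units)
resp = hill_inhibition(conc, ic50=1.5)  # synthetic dose-response with known IC50
recovered = estimate_ic50(conc, resp)   # recovers ~1.5
```

    In the paper the two approaches arrive at this number from opposite directions: 'top-down' from fitted clinical viral kinetics, 'bottom-up' from in vitro polymerization inhibition.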

  17. Inverse Estimation of California Methane Emissions and Their Uncertainties using FLEXPART-WRF

    NASA Astrophysics Data System (ADS)

    Cui, Y.; Brioude, J. F.; Angevine, W. M.; McKeen, S. A.; Peischl, J.; Nowak, J. B.; Henze, D. K.; Bousserez, N.; Fischer, M. L.; Jeong, S.; Liu, Z.; Michelsen, H. A.; Santoni, G.; Daube, B. C.; Kort, E. A.; Frost, G. J.; Ryerson, T. B.; Wofsy, S. C.; Trainer, M.

    2015-12-01

    Methane (CH4) has a large global warming potential and mediates global tropospheric chemistry. In California, CH4 emission estimates derived from "top-down" methods based on atmospheric observations have been found to be greater than expected from "bottom-up" population-apportioned national and state inventories. Differences between bottom-up and top-down estimates suggest that the understanding of California's CH4 sources is incomplete, leading to uncertainty in the application of regulations to mitigate regional CH4 emissions. In this study, we use airborne measurements from the California research at the Nexus of Air Quality and Climate Change (CalNex) campaign in 2010 to estimate CH4 emissions in the South Coast Air Basin (SoCAB), which includes California's largest metropolitan area (Los Angeles), and in the Central Valley, California's main agricultural and livestock management area. Measurements from 12 daytime flights, prior information from national and regional official inventories (e.g. the US EPA's National Emission Inventory, the California Air Resources Board inventories, the Liu et al. Hybrid Inventory, and the California Greenhouse Gas Emissions Measurement dataset), and the FLEXPART-WRF transport model are used in our mesoscale Bayesian inverse system. We compare our optimized posterior CH4 inventory to the prior bottom-up inventories in terms of total emissions (Mg CH4/hr) and the spatial distribution of the emissions (at 0.1° resolution), and quantify uncertainties in our posterior estimates. Our inversions show that the oil and natural gas industry (extraction, processing and distribution) is the main source accounting for the gap between top-down and bottom-up inventories over the SoCAB, while dairy farms are the largest CH4 source in the Central Valley. CH4 emissions from dairy farms in the San Joaquin Valley and variations of CH4 emissions in the rice-growing regions of the Sacramento Valley are quantified and discussed. 
We also estimate CO and NH3 surface fluxes and use their observed correlation with CH4 mixing ratio to further evaluate our CH4 total emission estimates, and understand the spatial distribution of CH4 emissions.
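    The core of a mesoscale Bayesian inverse system of this kind is a linear-Gaussian update: prior (bottom-up) emissions are adjusted so that transported emissions match the airborne observations, weighted by the prior and observation uncertainties. A textbook sketch with a toy footprint matrix standing in for FLEXPART-WRF output (all numbers illustrative):

```python
import numpy as np

def bayesian_inversion(x_prior, S_prior, H, y, S_obs):
    """Analytic linear-Gaussian Bayesian update: scale prior emissions x_prior
    to match observations y given a transport sensitivity (footprint) matrix H.
    Textbook formulation, not FLEXPART-WRF-specific code."""
    G = S_prior @ H.T @ np.linalg.inv(H @ S_prior @ H.T + S_obs)  # gain matrix
    x_post = x_prior + G @ (y - H @ x_prior)                      # posterior mean
    S_post = S_prior - G @ H @ S_prior                            # posterior covariance
    return x_post, S_post

# Toy example: 3 source regions, 4 airborne observations.
rng = np.random.default_rng(0)
x_true = np.array([2.0, 5.0, 1.0])        # "true" emissions (illustrative units)
H = rng.uniform(0.1, 1.0, size=(4, 3))    # footprints: obs sensitivity to each source
y = H @ x_true + rng.normal(0, 0.05, 4)   # synthetic observations with noise
x_prior = np.array([1.0, 1.0, 1.0])       # bottom-up prior underestimates sources
x_post, S_post = bayesian_inversion(x_prior, np.eye(3) * 4.0, H, y, np.eye(4) * 0.01)
```

    The posterior mean moves toward the true emissions and the posterior variances shrink relative to the prior, which is exactly the sense in which the inversion "closes the gap" between bottom-up and top-down estimates.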

  18. Dual-gate polysilicon nanoribbon biosensors enable high sensitivity detection of proteins.

    PubMed

    Zeimpekis, I; Sun, K; Hu, C; Ditshego, N M J; Thomas, O; de Planque, M R R; Chong, H M H; Morgan, H; Ashburn, P

    2016-04-22

    We demonstrate the advantages of dual-gate polysilicon nanoribbon biosensors with a comprehensive evaluation of different measurement schemes for pH and protein sensing. In particular, we compare the detection of voltage and current changes when top- and bottom-gate bias is applied. Measurements of pH show that a large voltage shift of 491 mV pH(-1) is obtained in the subthreshold region when the top-gate is kept at a fixed potential and the bottom-gate is varied (voltage sweep). This is an improvement of 16 times over the 30 mV pH(-1) measured using a top-gate sweep with the bottom-gate at a fixed potential. A similarly large voltage shift of 175 mV is obtained when the protein avidin is sensed using a bottom-gate sweep, an improvement of 20 times compared with the 8.8 mV achieved with a top-gate sweep. Current measurements using bottom-gate sweeps do not deliver the same signal amplification as voltage-shift measurements made with bottom-gate sweeps. Thus, for detecting a small signal change on protein binding, it is advantageous to employ a dual-gate transistor and to measure a voltage shift using a bottom-gate sweep. For top-gate sweeps, the use of a dual-gate transistor enables the current sensitivity to be enhanced by applying a negative bias to the bottom-gate to reduce the carrier concentration in the nanoribbon. For pH measurements, the current sensitivity increases from 65% to 149%, and for avidin sensing it increases from 1.4% to 2.5%.
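    The quoted improvement factors follow directly from the reported voltage shifts; a quick arithmetic check:

```python
# Reported subthreshold voltage shifts from the abstract.
ph_bottom, ph_top = 491.0, 30.0  # pH response (mV/pH): bottom-gate vs top-gate sweep
av_bottom, av_top = 175.0, 8.8   # avidin response (mV): bottom-gate vs top-gate sweep

ph_gain = ph_bottom / ph_top     # ~16x improvement for pH sensing
av_gain = av_bottom / av_top     # ~20x improvement for avidin sensing
```

    Both ratios match the rounded factors quoted in the abstract.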

  19. Dual-gate polysilicon nanoribbon biosensors enable high sensitivity detection of proteins

    NASA Astrophysics Data System (ADS)

    Zeimpekis, I.; Sun, K.; Hu, C.; Ditshego, N. M. J.; Thomas, O.; de Planque, M. R. R.; Chong, H. M. H.; Morgan, H.; Ashburn, P.

    2016-04-01

    We demonstrate the advantages of dual-gate polysilicon nanoribbon biosensors with a comprehensive evaluation of different measurement schemes for pH and protein sensing. In particular, we compare the detection of voltage and current changes when top- and bottom-gate bias is applied. Measurements of pH show that a large voltage shift of 491 mV pH-1 is obtained in the subthreshold region when the top-gate is kept at a fixed potential and the bottom-gate is varied (voltage sweep). This is an improvement of 16 times over the 30 mV pH-1 measured using a top-gate sweep with the bottom-gate at a fixed potential. A similarly large voltage shift of 175 mV is obtained when the protein avidin is sensed using a bottom-gate sweep, an improvement of 20 times compared with the 8.8 mV achieved with a top-gate sweep. Current measurements using bottom-gate sweeps do not deliver the same signal amplification as voltage-shift measurements made with bottom-gate sweeps. Thus, for detecting a small signal change on protein binding, it is advantageous to employ a dual-gate transistor and to measure a voltage shift using a bottom-gate sweep. For top-gate sweeps, the use of a dual-gate transistor enables the current sensitivity to be enhanced by applying a negative bias to the bottom-gate to reduce the carrier concentration in the nanoribbon. For pH measurements, the current sensitivity increases from 65% to 149%, and for avidin sensing it increases from 1.4% to 2.5%.

  20. A Clash of Bottom-Up and Top-Down Processes in Visual Search: The Reversed Letter Effect Revisited

    ERIC Educational Resources Information Center

    Zhaoping, Li; Frith, Uta

    2011-01-01

    It is harder to find the letter "N" among its mirror reversals than vice versa, an inconvenient finding for bottom-up saliency accounts based on primary visual cortex (V1) mechanisms. However, in line with this account, we found that in dense search arrays, gaze first landed on either target equally fast. Remarkably, after first landing,…

  1. Bottom-Up Mechanisms Are Involved in the Relation between Accuracy in Timing Tasks and Intelligence--Further Evidence Using Manipulations of State Motivation

    ERIC Educational Resources Information Center

    Ullen, Fredrik; Soderlund, Therese; Kaaria, Lenita; Madison, Guy

    2012-01-01

    Intelligence correlates with accuracy in various timing tasks. Such correlations could be due to both bottom-up mechanisms, e.g. neural properties that influence both temporal accuracy and cognitive processing, and differences in top-down control. We have investigated the timing-intelligence relation using a simple temporal motor task, isochronous…

  2. Cost of Illness of Multiple Sclerosis - A Systematic Review

    PubMed Central

    Ernstsson, Olivia; Gyllensten, Hanna; Alexanderson, Kristina; Tinghög, Petter; Friberg, Emilie; Norlund, Anders

    2016-01-01

    Background: Cost-of-illness (COI) studies of Multiple Sclerosis (MS) are vital components for describing the economic burden of MS, and are frequently used in model studies of MS interventions. We conducted a systematic review of studies estimating the COI of MS, to compare costs between studies and examine cost drivers, with emphasis on generalizability and methodological choices. Material and method: A literature search for studies published in English on the COI of MS was performed in PubMed for the period January 1969 to January 2014, resulting in 1,326 publications. A mapping of studies using a bottom-up or a top-down approach, respectively, was conducted for the 48 studies assessed as relevant. In a second analysis, cost estimates were compared between the 29 studies that used a societal perspective on costs and the human capital approach for indirect costs, and that reported the number of patients included, the time period studied, and the price-level year used. Results: The mapping showed that bottom-up studies and prevalence approaches were most common. The cost ratios between different severity levels within studies were relatively stable, at roughly 1:2:3 across disability-level categories. Drugs were the main cost drivers for MS patients with low disease severity, representing 29% to 82% of all costs in this patient group, while the main cost components for groups with more advanced MS symptoms were production losses due to MS and informal care, together representing 17% to 67% of costs in those groups. Conclusion: The bottom-up method and the prevalence approach dominated in studies of the COI of MS. Our findings show that there are difficulties in comparing absolute costs across studies; nevertheless, the relative costs, expressed as cost ratios comparing different severity levels, showed closer resemblance. Drug costs were the main cost drivers for less severe MS, and informal care and production losses for the most severe MS. PMID:27411042

  3. How to Assess Vulnerabilities of Water Policies to Global Change?

    NASA Astrophysics Data System (ADS)

    Kumar, A.; Haasnoot, M.; Weijs, S.

    2017-12-01

    Water managers are confronted with uncertainties arising from hydrological, societal, economic and political drivers. To manage these uncertainties, two paradigms have been identified: top-down and bottom-up approaches. Top-down or prediction-based approaches use socio-economic scenarios together with a discrete set of GCM projections (often downscaled) to assess the expected impact of drivers and policies on the water resource system through various hydrological and social-systems models. Adaptation strategies to alleviate these impacts are then identified and tested against the scenarios. To address GCM and downscaling uncertainties, these approaches put more focus on climate predictions than on the decision problem itself. Triggered by the wish for a more scenario-neutral approach that addresses downscaling uncertainties, recent analyses have shifted towards vulnerability-based (bottom-up, or decision-centric) approaches. These begin at the local scale by addressing socio-economic responses to climate, often involving stakeholders' input; identify vulnerabilities under a larger sample of plausible futures; and evaluate the sensitivity and robustness of possible adaptation options. Several bottom-up approaches have emerged so far and are increasingly recommended. Fundamentally they share several core ideas; however, subtle differences exist in vulnerability assessment, in visualization tools for exploring vulnerabilities, and in the computational methods used for identifying robust water policies. Through this study, we try to identify how these approaches are progressing, how climate and non-climate uncertainties are being confronted, and how to integrate existing and new tools. We find that the choice of method may depend on the number of vulnerability drivers identified and the type of threshold levels (environmental conditions or policy objectives) defined. 
Certain approaches are suited well for assessing adaptive capacities, tipping points and sequencing of decisions. However, visualization of the vulnerability domain is still challenging if multiple drivers are present. New emerging tools are focused on generating synthetic scenarios, addressing multiple objectives, linking decision-making frameworks to adaptation pathways and communicating risks to the stakeholders.

  4. Top-down and bottom-up approaches to greenhouse gas inventory methods—a comparison between national- and forest-scale reporting methods

    Treesearch

    David Nicholls; Frank Barnes; Felicia Acrea; Chinling Chen; Lara Y. Buluç; Michele M. Parker

    2015-01-01

    Federal agencies are mandated to measure, manage, and reduce greenhouse gas (GHG) emissions. The General Services Administration (GSA) Carbon Footprint Tool (CFT) is an online tool built to utilize measured GHG inventories to help Forest Service units streamline reporting and make informed decisions about operational efficiency. In fiscal year 2013, the Forest Service...

  5. Design of a Bottom Impermeable Barrier in Conjunction with A contaminated Site Containment Structure.

    DTIC Science & Technology

    1994-05-01

    utilizes drill bits and tubing to cut through the soil. Unlike the auger method, a slurry mixture is used to keep the drill bit clean and assist in...is applied. In the sleeve pipe method, also called tube-à-manchette, the sleeve pipe is installed in the grout hole and sealed in place with a...acts as a one-way valve, allowing grout out of the pipe, but not back into the sleeve. A grouting tube with a double packer is used to inject the grout

  6. Systematic Review of Methods in Low-Consensus Fields: Supporting Commensuration through `Construct-Centered Methods Aggregation' in the Case of Climate Change Vulnerability Research.

    PubMed

    Delaney, Aogán; Tamás, Peter A; Crane, Todd A; Chesterman, Sabrina

    2016-01-01

    There is increasing interest in using systematic review to synthesize evidence on the social and environmental effects of and adaptations to climate change. Use of systematic review for evidence in this field is complicated by the heterogeneity of methods used and by uneven reporting. In order to facilitate synthesis of results and design of subsequent research, a method, construct-centered methods aggregation, was designed to 1) provide a transparent, valid and reliable description of research methods, 2) support comparability of primary studies and 3) contribute to a shared empirical basis for improving research practice. Rather than taking research reports at face value, research designs are reviewed through inductive analysis. This involves bottom-up identification of constructs, definitions and operationalizations; assessment of concepts' commensurability through comparison of definitions; identification of theoretical frameworks through patterns of construct use; and integration of transparently reported and valid operationalizations into ideal-type research frameworks. Through the integration of reliable bottom-up inductive coding from operationalizations and top-down coding driven from stated theory with expert interpretation, construct-centered methods aggregation enabled both resolution of heterogeneity within identically named constructs and merging of differently labeled but identical constructs. These two processes allowed transparent, rigorous and contextually sensitive synthesis of the research presented in an uneven set of reports undertaken in a heterogeneous field. If adopted more broadly, construct-centered methods aggregation may contribute to the emergence of a valid, empirically-grounded description of methods used in primary research. 
These descriptions may function as a set of expectations that improves the transparency of reporting and as an evolving comprehensive framework that supports both interpretation of existing and design of future research.

  7. Systematic Review of Methods in Low-Consensus Fields: Supporting Commensuration through `Construct-Centered Methods Aggregation’ in the Case of Climate Change Vulnerability Research

    PubMed Central

    Crane, Todd A.; Chesterman, Sabrina

    2016-01-01

    There is increasing interest in using systematic review to synthesize evidence on the social and environmental effects of and adaptations to climate change. Use of systematic review for evidence in this field is complicated by the heterogeneity of methods used and by uneven reporting. In order to facilitate synthesis of results and design of subsequent research, a method, construct-centered methods aggregation, was designed to 1) provide a transparent, valid and reliable description of research methods, 2) support comparability of primary studies and 3) contribute to a shared empirical basis for improving research practice. Rather than taking research reports at face value, research designs are reviewed through inductive analysis. This involves bottom-up identification of constructs, definitions and operationalizations; assessment of concepts’ commensurability through comparison of definitions; identification of theoretical frameworks through patterns of construct use; and integration of transparently reported and valid operationalizations into ideal-type research frameworks. Through the integration of reliable bottom-up inductive coding from operationalizations and top-down coding driven from stated theory with expert interpretation, construct-centered methods aggregation enabled both resolution of heterogeneity within identically named constructs and merging of differently labeled but identical constructs. These two processes allowed transparent, rigorous and contextually sensitive synthesis of the research presented in an uneven set of reports undertaken in a heterogeneous field. If adopted more broadly, construct-centered methods aggregation may contribute to the emergence of a valid, empirically-grounded description of methods used in primary research. 
These descriptions may function as a set of expectations that improves the transparency of reporting and as an evolving comprehensive framework that supports both interpretation of existing and design of future research. PMID:26901409

  8. Approach for ochratoxin A fast screening in spices using clean-up tandem immunoassay columns with confirmation by high performance liquid chromatography-tandem mass spectrometry (HPLC-MS/MS).

    PubMed

    Goryacheva, I Yu; De Saeger, S; Lobeau, M; Eremin, S A; Barna-Vetró, I; Van Peteghem, C

    2006-09-01

    An approach for fast, cost-effective screening of ochratoxin A (OTA), based on clean-up tandem immunoassay columns, was developed and optimized for OTA detection with a cut-off level of 10 microg kg(-1) in spices. Two procedures were tested and applied for OTA detection. A column with a bottom detection immunolayer was optimized for OTA determination in Capsicum spp. spices. A modified clean-up tandem immunoassay procedure with a top detection immunolayer was successfully applied to all tested spices. Its main advantages were a reduced number of analysis steps, a lower antibody requirement and minimized matrix effects. The total duration of the extraction and analysis was about 40 min for six samples. Chilli, red pepper, pili-pili, cayenne, paprika, nutmeg, ginger, white pepper and black pepper samples were analyzed for OTA contamination by the proposed clean-up tandem immunoassay procedures. Clean-up tandem immunoassay results were confirmed by HPLC-MS/MS with immunoaffinity column clean-up. Among 17 tested Capsicum spp. spices, 6 samples (35%) contained OTA at a concentration exceeding the 10 microg kg(-1) limit discussed by the European Commission. None of the tested nutmeg (n=8), ginger (n=5), white pepper (n=7) or black pepper (n=6) samples contained OTA above this action level.

  9. Pairing top-down and bottom-up approaches to analyze catchment scale management of water quality and quantity

    NASA Astrophysics Data System (ADS)

    Lovette, J. P.; Duncan, J. M.; Band, L. E.

    2016-12-01

    Watershed management requires information on the hydrologic impacts of local to regional land use, land cover and infrastructure conditions. Management of runoff volumes, storm flows, and water quality can benefit from large-scale, "top-down" screening tools using readily available information, as well as from more detailed, "bottom-up" process-based models that explicitly track local runoff production and routing from sources to receiving water bodies. Regional-scale data, available nationwide through the NHD+, and top-down models based on aggregated catchment information provide useful tools for estimating regional patterns of peak flows, volumes and nutrient loads at the catchment level. Management impacts can be estimated with these models, but the models have limited ability to resolve impacts beyond simple changes to land cover proportions. Alternatively, distributed process-based models provide more flexibility in modeling management impacts by resolving spatial patterns of nutrient sources, runoff generation, and uptake. This bottom-up approach can incorporate explicit patterns of land cover, drainage connectivity, and vegetation extent, but is typically applied over smaller areas. Here, we first model peak flood flows and nitrogen loads across North Carolina's 70,000 NHD+ catchments using USGS regional streamflow regression equations and the SPARROW model. We also estimate management impacts by altering aggregated sources in each of these models. To address the missing spatial implications of the top-down approach, we further explore the demand for riparian buffers as a management strategy, simulating the accumulation of nutrient sources along flow paths and the potential mitigation of these sources through forested buffers. We use the Regional Hydro-Ecological Simulation System (RHESSys) to model changes across several basins in North Carolina's Piedmont and Blue Ridge regions, ranging in size from 15 to 1,130 km2. The two approaches provide a complementary set of tools for large-area screening, followed by smaller-scale, more process-based assessment and design tools.

  10. Meso-scale on-road vehicle emission inventory approach: a study on Dhaka City of Bangladesh supporting the 'cause-effect' analysis of the transport system.

    PubMed

    Iqbal, Asif; Allan, Andrew; Zito, Rocco

    2016-03-01

    The study aims to develop an emission inventory (EI) approach and conduct an inventory for vehicular sources in Dhaka City, Bangladesh. A meso-scale modelling approach was adopted for the inventory; the factors that influence emissions and the magnitude of emission variation were identified and reported on, an innovative way of accounting for emissions compared with conventional inventory approaches. Two inventory techniques were applied, viz. (i) a combined top-down and bottom-up approach that considered the total vehicle population and the average diurnal on-road vehicle speed profile in the city, and (ii) a bottom-up approach that accounted for road-link-specific emissions of the city considering diurnal traffic volume and speed profiles of the respective roads. For the bottom-up approach, road-link-specific detailed data were obtained through a field survey in 2012, in which mid-block traffic counts of the day, vehicle speed profiles, road network and congestion data were principally collected. The variation of emissions with changes in transport system characteristics (such as change in fuel type, AC usage pattern, increased speed and reduced congestion/stopping) was predicted and analysed in this study; congestion-influenced average vehicle speed and vehicle fuel type were identified as the major stressors. The study performance was considered reasonable when compared with the limited number of similar studies conducted earlier. Given the increasing number of private vehicles each year coupled with increasing traffic congestion, the city is under threat of increased vehicular emissions unless a good management strategy is implemented. Although the inventory was conducted for Dhaka and the results may be important locally, the approach adopted in this research is innovative and can be followed in research on other urban transport systems.
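    A bottom-up, link-specific inventory of the kind described reduces to summing traffic volume × link length × speed-dependent emission factor over all road links. A minimal sketch with hypothetical links and factors (not the Dhaka survey data):

```python
def emission_factor(speed_kmh):
    """Toy speed-dependent CO emission factor (g/veh-km): congestion
    (low speed) raises per-km emissions. Values are illustrative only."""
    if speed_kmh < 15:
        return 20.0   # heavily congested
    if speed_kmh < 40:
        return 8.0    # moderate flow
    return 5.0        # free flow

links = [
    # (daily traffic volume [veh], length [km], mean speed [km/h])
    (50_000, 2.5, 12.0),   # congested arterial
    (30_000, 4.0, 35.0),   # moderate flow
    (10_000, 6.0, 55.0),   # free-flowing corridor
]

# Bottom-up total: sum link emissions, convert g/day to kg/day.
total_kg = sum(vol * length * emission_factor(v) for vol, length, v in links) / 1e3
```

    A top-down cross-check would instead multiply the total vehicle population by an average emission factor derived from the city-wide diurnal speed profile, as in technique (i) of the abstract.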

  11. Integral recycling of municipal solid waste incineration (MSWI) bottom ash fines (0-2mm) and industrial powder wastes by cold-bonding pelletization.

    PubMed

    Tang, P; Brouwers, H J H

    2017-04-01

    The cold-bonding pelletizing technique is applied in this study as an integrated method to recycle municipal solid waste incineration (MSWI) bottom ash fines (BAF, 0-2 mm) and several other industrial powder wastes. Artificial lightweight aggregates are produced successfully from combinations of these solid wastes, and the properties of these artificial aggregates are investigated and compared with results reported in the literature. Additionally, methods for improving the aggregate properties are suggested, and the corresponding experimental results show that increasing the BAF amount, a higher binder content and the addition of polypropylene fibres can improve the pellet properties (bulk density, crushing resistance, etc.). The mechanisms underlying the improvement of the pellet properties are discussed. Furthermore, the leaching behaviours of contaminants from the produced aggregates are investigated and compared with Dutch environmental legislation. Applications of the produced artificial lightweight aggregates are proposed according to their properties. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Rigorous evaluation of chemical measurement uncertainty: liquid chromatographic analysis methods using detector response factor calibration

    NASA Astrophysics Data System (ADS)

    Toman, Blaza; Nelson, Michael A.; Bedner, Mary

    2017-06-01

    Chemical measurement methods are designed to promote accurate knowledge of a measurand or system. As such, these methods often allow elicitation of latent sources of variability and correlation in experimental data. They typically implement measurement equations that support quantification of effects associated with calibration standards and other known or observed parametric variables. Additionally, multiple samples and calibrants are usually analyzed to assess the accuracy of the measurement procedure and its repeatability by the analyst. Thus, a realistic assessment of uncertainty for most chemical measurement methods is not purely bottom-up (based on the measurement equation) or top-down (based on the experimental design), but inherently contains elements of both. Confidence in results must be rigorously evaluated for the sources of variability in all of the bottom-up and top-down elements. This type of analysis presents unique challenges due to various statistical correlations among the outputs of measurement equations. One approach is to use a Bayesian hierarchical (BH) model, which is intrinsically rigorous and thus a straightforward method for use with complex experimental designs, particularly when correlations among data are numerous and difficult to elucidate or explicitly quantify. In simpler cases, careful analysis using GUM Supplement 1 Monte Carlo (MC) methods augmented with random-effects meta-analysis yields results similar to a full BH model analysis. In this article we describe both approaches to rigorous uncertainty evaluation, using as examples measurements of 25-hydroxyvitamin D3 in solution reference materials via liquid chromatography with UV absorbance detection (LC-UV) and liquid chromatography with isotope-dilution mass spectrometric detection (LC-IDMS).
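    The GUM Supplement 1 (Monte Carlo) element of such an analysis propagates input distributions through the measurement equation by sampling. For a response-factor calibration the measurement equation is c_sample = (A_sample / A_std) · c_std; a minimal sketch with illustrative values, not the certified 25-hydroxyvitamin D3 quantities:

```python
import numpy as np

# GUM Supplement 1-style Monte Carlo propagation through a response-factor
# calibration equation. All means and standard uncertainties are illustrative.
rng = np.random.default_rng(42)
N = 200_000

A_sample = rng.normal(0.85, 0.010, N)   # sample peak area (arbitrary units)
A_std = rng.normal(1.00, 0.012, N)      # calibrant peak area
c_std = rng.normal(25.0, 0.20, N)       # calibrant concentration (ug/mL)

c_sample = A_sample / A_std * c_std     # propagate through the measurement equation
mean, u = c_sample.mean(), c_sample.std(ddof=1)
lo, hi = np.percentile(c_sample, [2.5, 97.5])  # 95% coverage interval
```

    The top-down element enters when results from repeated runs or analysts are pooled, e.g. via a random-effects model on the per-run means; the BH model the authors describe treats both layers in one inference.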

  13. Neural correlates of context-dependent feature conjunction learning in visual search tasks.

    PubMed

    Reavis, Eric A; Frank, Sebastian M; Greenlee, Mark W; Tse, Peter U

    2016-06-01

    Many perceptual learning experiments show that repeated exposure to a basic visual feature such as a specific orientation or spatial frequency can modify perception of that feature, and that those perceptual changes are associated with changes in neural tuning early in visual processing. Such perceptual learning effects thus exert a bottom-up influence on subsequent stimulus processing, independent of task demands or endogenous influences (e.g., volitional attention). However, it is unclear whether such bottom-up changes in perception can occur as more complex stimuli such as conjunctions of visual features are learned. It is not known whether changes in the efficiency with which people learn to process feature conjunctions in a task (e.g., visual search) reflect true bottom-up perceptual learning versus top-down, task-related learning (e.g., learning better control of endogenous attention). Here we show that feature conjunction learning in visual search leads to bottom-up changes in stimulus processing. First, using fMRI, we demonstrate that conjunction learning in visual search has a distinct neural signature: an increase in target-evoked activity relative to distractor-evoked activity (i.e., a relative increase in target salience). Second, we demonstrate that after learning, this neural signature is still evident even when participants passively view learned stimuli while performing an unrelated, attention-demanding task. This suggests that conjunction learning results in altered bottom-up perceptual processing of the learned conjunction stimuli (i.e., a perceptual change independent of the task). We further show that the acquired change in target-evoked activity is contextually dependent on the presence of distractors, suggesting that search array Gestalts are learned. Hum Brain Mapp 37:2319-2330, 2016. © 2016 Wiley Periodicals, Inc.

  14. Full mass dependence in Higgs boson production in association with jets at the LHC and FCC

    DOE PAGES

    Greiner, Nicolas; Höche, Stefan; Luisoni, Gionata; ...

    2017-01-23

The first computation of Higgs production in association with three jets at NLO in QCD has recently been performed using the effective theory, where the top quark is treated as an infinitely heavy particle and integrated out. This approach is restricted to the regions in phase space where the typical scales are not larger than the top quark mass. Here we investigate this statement at a quantitative level by calculating the leading-order contributions to the production of a Standard Model Higgs boson in association with up to three jets, taking full top-quark and bottom-quark mass dependence into account. We find that the transverse momentum of the hardest particle or jet plays a key role in the breakdown of the effective theory predictions, and that discrepancies can easily reach an order of magnitude for transverse momenta of about 1 TeV. The impact of bottom-quark loops is found to be visible in the small transverse momentum region, leading to corrections of up to 5 percent. Finally, we study the impact of mass corrections when VBF selection cuts are applied and when the center-of-mass energy is increased to 100 TeV.

  15. Evaluating trophic cascades as drivers of regime shifts in different ocean ecosystems

    PubMed Central

    Pershing, Andrew J.; Mills, Katherine E.; Record, Nicholas R.; Stamieszkin, Karen; Wurtzell, Katharine V.; Byron, Carrie J.; Fitzpatrick, Dominic; Golet, Walter J.; Koob, Elise

    2015-01-01

In ecosystems that are strongly structured by predation, reducing top predator abundance can alter several lower trophic levels—a process known as a trophic cascade. A persistent trophic cascade also fits the definition of a regime shift. Such 'trophic cascade regime shifts' have been reported in a few pelagic marine systems—notably the Black Sea, Baltic Sea and eastern Scotian Shelf—raising the question of how common this phenomenon is in the marine environment. We provide a general methodology for distinguishing top-down and bottom-up effects and apply this methodology to time series from these three ecosystems. We found evidence for top-down forcing in the Black Sea due primarily to gelatinous zooplankton. Changes in the Baltic Sea are primarily bottom-up, strongly structured by salinity, but top-down forcing related to changes in cod abundance also shapes the ecosystem. Changes in the eastern Scotian Shelf that were originally attributed to declines in groundfish are better explained by changes in stratification. Our review suggests that trophic cascade regime shifts are rare in open ocean ecosystems and that their likelihood increases as the residence time of water in the system increases. Our work challenges the assumption that negative correlation between consecutive trophic levels implies top-down forcing.
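The conventional screen that this review challenges — reading a strong negative correlation between abundance anomalies at consecutive trophic levels as evidence of top-down forcing — can be sketched as follows. The data and function names here are illustrative assumptions, not the authors' time series or methodology.

```python
# Sketch of the conventional "top-down" screen: correlate abundance anomalies
# at consecutive trophic levels and read a strong negative correlation as
# top-down forcing. Data below are synthetic, for illustration only.
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Synthetic predator/prey anomalies in which prey mirror the predator decline.
predators = [1.0, 0.8, 0.5, 0.2, -0.1, -0.4, -0.6, -0.9]
prey = [-0.9, -0.7, -0.6, -0.1, 0.2, 0.3, 0.7, 0.8]

r = pearson_r(predators, prey)
print(f"r = {r:.2f}")  # strongly negative here; the review cautions that
# bottom-up drivers (salinity, stratification) can produce the same sign.
```

As the abstract's last sentence notes, a negative correlation alone is not sufficient: a bottom-up driver such as stratification can generate the same pattern, which is why the authors' methodology considers both hypotheses against environmental covariates.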

  16. Learning from bottom-up dissemination: Importing an evidence-based trauma intervention for infants and young children to Israel.

    PubMed

    David, Paula; Schiff, Miriam

    2015-12-01

This article describes a pilot study of a "bottom-up" dissemination process of a new evidence-based intervention for treating early childhood trauma. Clinicians applied to learn Child-Parent Psychotherapy (CPP), imported to Israel from the U.S. A focus group of six graduates of a CPP training program responded to questions concerning their experiences learning and using CPP. All 39 CPP graduates from two cohorts also completed a cross-sectional survey related to their use of CPP. Within the focus group, the openness of the workplace and the intervention's characteristics were considered major factors impacting CPP use; the training program was perceived to promote CPP implementation, while lack of supervision and secondary traumatic stress were the major inhibiting factors. Using CPP-informed therapy, as opposed to CPP with fidelity, was perceived to be one of the main outcomes of the training. Survey results showed that 53% of graduates were using CPP in more than three cases, and almost all intended to use CPP within the next year. Ninety-five percent were using CPP principles in their therapeutic work. The implications of importing a new evidence-based intervention to a foreign country that utilizes a different dissemination system within a different professional culture are discussed. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. Two-stage bottom-up tiered approach combining several alternatives for identification of eye irritation potential of chemicals including insoluble or volatile substances.

    PubMed

    Hayashi, Kazuhiko; Mori, Taeko; Abo, Takayuki; Ooshima, Kenichi; Hayashi, Takumi; Komano, Tomoko; Takahashi, Yutaka; Sakaguchi, Hitoshi; Takatsu, Akihiko; Nishiyama, Naohiro

    2012-10-01

For the assessment of eye irritation, one alternative test may not completely replace the rabbit Draize test. In the present study, we examined the predictive potential of a tiered approach analyzing the results from several alternatives (i.e., the Short Time Exposure (STE) test, the EpiOcular assay, the Hen's Egg Test-Chorioallantoic Membrane (HET-CAM) assay and the Bovine Corneal Opacity and Permeability (BCOP) assay) for assessing Globally Harmonized System (GHS) eye irritation categories. Fifty-six chemicals, including alcohols, surfactants, and esters, were selected with balanced GHS categories and a wide range of chemical classes. From the standpoint of both the number of assessable samples and predictive accuracy, the more favorable tiered approach was considered to be the two-stage bottom-up tiered approach combining the STE test and the EpiOcular assay, followed by the BCOP assay (accuracy 69.6%, under-prediction rate 8.9%). Moreover, a more favorable predictive capacity (accuracy 71.4%, under-prediction rate 3.6%) was obtained when highly volatile alcohols/esters with vapor pressures >6 kilopascal (kPa) at 25°C were evaluated with the EpiOcular assay instead of the STE test. From these results, the two-stage bottom-up tiered approach combining the STE test and the EpiOcular assay, followed by the BCOP assay, might be a promising method for classification into GHS eye irritation categories (Not classified (NC), Category 2 (Cat. 2), and Category 1 (Cat. 1)) for a wide range of test chemicals regardless of solubility. Copyright © 2012 Elsevier Ltd. All rights reserved.
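The two-stage bottom-up decision logic described above can be sketched as a simple branching function. The field names, boolean result conventions, and routing rule below are illustrative assumptions for exposition; they are not the authors' implementation, which is based on the measured assay endpoints.

```python
# Illustrative sketch of the two-stage bottom-up tiered approach described
# above: STE test (or EpiOcular for highly volatile alcohols/esters) first,
# BCOP assay second. Field names and pass/fail conventions are assumptions.

VOLATILE_VP_KPA = 6.0  # vapor pressure threshold at 25 degC, from the abstract

def first_stage_irritant(chemical):
    """Stage 1: STE test by default; the EpiOcular assay is used instead for
    highly volatile alcohols/esters (vapor pressure > 6 kPa at 25 degC)."""
    if (chemical["vapor_pressure_kpa"] > VOLATILE_VP_KPA
            and chemical["chemical_class"] in ("alcohol", "ester")):
        return chemical["epiocular_irritant"]
    return chemical["ste_irritant"]

def classify_ghs(chemical):
    """Bottom-up tiering: a negative first-stage result means Not classified;
    otherwise the BCOP assay separates Category 1 from Category 2."""
    if not first_stage_irritant(chemical):
        return "NC"       # Not classified
    if chemical["bcop_severe"]:
        return "Cat. 1"   # serious eye damage
    return "Cat. 2"       # eye irritation

if __name__ == "__main__":
    sample = {"chemical_class": "ester", "vapor_pressure_kpa": 8.2,
              "epiocular_irritant": True, "ste_irritant": False,
              "bcop_severe": False}
    print(classify_ghs(sample))  # volatile ester routed to EpiOcular -> "Cat. 2"
```

The bottom-up ordering matters: cheap, non-severe assays run first to clear non-irritants, and the BCOP assay is reserved for chemicals the first stage flags, which is what keeps the under-prediction rate low.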

  18. Controlled Growth of Ceria Nanoarrays on Anatase Titania Powder: A Bottom-up Physical Picture.

    PubMed

    Kim, Hyun You; Hybertsen, Mark S; Liu, Ping

    2017-01-11

    The leading edge of catalysis research motivates physical understanding of the growth of nanoscale oxide structures on different supporting oxide materials that are themselves also nanostructured. This research opens up for consideration a diverse range of facets on the support material, versus the single facet typically involved in wide-area growth of thin films. Here, we study the growth of ceria nanoarchitectures on practical anatase titania powders as a showcase inspired by recent experiments. Density functional theory (DFT)-based methods are employed to characterize and rationalize the broad array of low energy nanostructures that emerge. Using a bottom-up approach, we are able to identify and characterize the underlying mechanisms for the facet-dependent growth of various ceria motifs on anatase titania based on formation energy. These motifs include 0D clusters, 1D chains, 2D plates, and 3D nanoparticles. The ceria growth mode and morphology are determined by the interplay of several factors including the role of the common cation valence, the interface template effect for different facets of the anatase support, enhanced ionic binding for more compact ceria motifs, and the local structural flexibility of oxygen ions in bridging the interface between anatase and ceria structures.

  19. Non-intrusive cooling system

    DOEpatents

    Morrison, Edward F.; Bergman, John W.

    2001-05-22

A readily replaceable heat exchange cooling jacket for applying fluid to a system conduit pipe. The cooling jacket comprises at least two members, separable into upper and lower portions. A chamber is formed between the conduit pipe and cooling jacket once the members are positioned about the pipe. The upper portion includes a fluid spray means positioned above the pipe and the bottom portion includes a fluid removal means. The heat exchange cooling jacket is adaptable with a drain tank, a heat exchanger, a pump and other standard equipment to provide a system for removing heat from a pipe. A method to remove heat from a pipe includes the steps of enclosing a portion of the pipe with a jacket to form a chamber between an outside surface of the pipe and the cooling jacket; spraying cooling fluid at low pressure from an upper portion of the cooling jacket, allowing the fluid to flow downwardly by gravity along the surface of the pipe toward a bottom portion of the chamber; and removing the fluid at the bottom portion of the chamber.

  20. Rapid identification of Chinese Sauce liquor from different fermentation positions with FT-IR spectroscopy

    NASA Astrophysics Data System (ADS)

    Li, Changwen; Wei, Jiping; Zhou, Qun; Sun, Suqin

    2008-07-01

FT-IR and two-dimensional correlation spectroscopy (2D-IR) technology were applied to discriminate Chinese Sauce liquor from different fermentation positions (top, middle and bottom of the fermentation cellar) for the first time. The liquors from the top, middle and bottom of the fermentation cellar possessed characteristic peaks at 1731 cm⁻¹, 1733 cm⁻¹ and 1602 cm⁻¹, respectively. In the 2D correlation infrared spectra, the differences were amplified. A strong auto-peak at 1725 cm⁻¹ appeared in the 2D spectra of the Top Liquor, which indicated that the liquor might contain some ester compounds. Different from the Top Liquor, three auto-peaks at 1695, 1590 and 1480 cm⁻¹ were identified in the 2D spectra of the Middle Liquor, which are characteristic absorptions of acids such as lactate. In the 2D spectra of the Bottom Liquor, two auto-peaks at 1570 and 1485 cm⁻¹ indicated that lactate was the major component. As a result, FT-IR and 2D-IR correlation spectroscopy provide a rapid and effective method for the quality analysis of Sauce liquor.

  1. A cost-effective method to prepare curcumin nanosuspensions with enhanced oral bioavailability.

    PubMed

    Wang, Yutong; Wang, Changyuan; Zhao, Jing; Ding, Yanfang; Li, Lei

    2017-01-01

Nanosuspension is one of the most promising strategies to improve the oral bioavailability of insoluble drugs. The existing techniques applied to produce nanosuspensions are classified as "bottom-up" or "top-down" methods, or a combination of both. Curcumin (CUR), a Biopharmaceutics Classification System (BCS) class IV substance, is a promising drug candidate in view of its good bioactivity, but its use is limited due to its poor solubility and permeability. In the present study, CUR nanosuspensions were developed to enhance CUR oral bioavailability using a cost-effective method different from conventional techniques. The physicochemical properties of the CUR nanosuspensions were characterized by dynamic light scattering (DLS) and transmission electron microscopy (TEM). The crystalline state of CUR in different nanosuspensions, analyzed using differential scanning calorimetry (DSC) and powder X-ray diffraction (PXRD), confirmed its amorphous state. The in vitro dissolution of the prepared CUR nanosuspensions using TPGS or Brij78 as stabilizer was greatly increased. Pharmacokinetic studies demonstrated that the oral bioavailability of CUR was increased 3.18-fold and 3.7-fold after administration of CUR/TPGS or CUR/Brij78 nanosuspensions, respectively, compared with administration of a CUR suspension. CUR nanosuspensions produced by our cost-effective method could thus improve its oral bioavailability. In addition, the low-cost and time-saving method reported here is highly suitable for fast and inexpensive preparation. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Search for scalar bottom quark pair production with the ATLAS detector in pp collisions at sqrt[s]=7  TeV.

    PubMed

    Aad, G; Abbott, B; Abdallah, J; Abdelalim, A A; Abdesselam, A; Abdinov, O; Abi, B; Abolins, M; AbouZeid, O S; Abramowicz, H; Abreu, H; Acerbi, E; Acharya, B S; Adamczyk, L; Adams, D L; Addy, T N; Adelman, J; Aderholz, M; Adomeit, S; Adragna, P; Adye, T; Aefsky, S; Aguilar-Saavedra, J A; Aharrouche, M; Ahlen, S P; Ahles, F; Ahmad, A; Ahsan, M; Aielli, G; Akdogan, T; Åkesson, T P A; Akimoto, G; Akimov, A V; Akiyama, A; Alam, M S; Alam, M A; Albert, J; Albrand, S; Aleksa, M; Aleksandrov, I N; Alessandria, F; Alexa, C; Alexander, G; Alexandre, G; Alexopoulos, T; Alhroob, M; Aliev, M; Alimonti, G; Alison, J; Aliyev, M; Allport, P P; Allwood-Spiers, S E; Almond, J; Aloisio, A; Alon, R; Alonso, A; Alvarez Gonzalez, B; Alviggi, M G; Amako, K; Amaral, P; Amelung, C; Ammosov, V V; Amorim, A; Amorós, G; Amram, N; Anastopoulos, C; Ancu, L S; Andari, N; Andeen, T; Anders, C F; Anders, G; Anderson, K J; Andreazza, A; Andrei, V; Andrieux, M-L; Anduaga, X S; Angerami, A; Anghinolfi, F; Anisenkov, A; Anjos, N; Annovi, A; Antonaki, A; Antonelli, M; Antonov, A; Antos, J; Anulli, F; Aoun, S; Aperio Bella, L; Apolle, R; Arabidze, G; Aracena, I; Arai, Y; Arce, A T H; Archambault, J P; Arfaoui, S; Arguin, J-F; Arik, E; Arik, M; Armbruster, A J; Arnaez, O; Arnault, C; Artamonov, A; Artoni, G; Arutinov, D; Asai, S; Asfandiyarov, R; Ask, S; Åsman, B; Asquith, L; Assamagan, K; Astbury, A; Astvatsatourov, A; Aubert, B; Auge, E; Augsten, K; Aurousseau, M; Avolio, G; Avramidou, R; Axen, D; Ay, C; Azuelos, G; Azuma, Y; Baak, M A; Baccaglioni, G; Bacci, C; Bach, A M; Bachacou, H; Bachas, K; Bachy, G; Backes, M; Backhaus, M; Badescu, E; Bagnaia, P; Bahinipati, S; Bai, Y; Bailey, D C; Bain, T; Baines, J T; Baker, O K; Baker, M D; Baker, S; Banas, E; Banerjee, P; Banerjee, Sw; Banfi, D; Bangert, A; Bansal, V; Bansil, H S; Barak, L; Baranov, S P; Barashkou, A; Barbaro Galtieri, A; Barber, T; Barberio, E L; Barberis, D; Barbero, M; Bardin, D Y; Barillari, T; Barisonzi, M; Barklow, T; Barlow, N; 
Barnett, B M; Barnett, R M; Baroncelli, A; Barone, G; Barr, A J; Barreiro, F; Barreiro Guimarães da Costa, J; Barrillon, P; Bartoldus, R; Barton, A E; Bartsch, V; Bates, R L; Batkova, L; Batley, J R; Battaglia, A; Battistin, M; Bauer, F; Bawa, H S; Beale, S; Beare, B; Beau, T; Beauchemin, P H; Beccherle, R; Bechtle, P; Beck, H P; Becker, S; Beckingham, M; Becks, K H; Beddall, A J; Beddall, A; Bedikian, S; Bednyakov, V A; Bee, C P; Begel, M; Behar Harpaz, S; Behera, P K; Beimforde, M; Belanger-Champagne, C; Bell, P J; Bell, W H; Bella, G; Bellagamba, L; Bellina, F; Bellomo, M; Belloni, A; Beloborodova, O; Belotskiy, K; Beltramello, O; Ben Ami, S; Benary, O; Benchekroun, D; Benchouk, C; Bendel, M; Benekos, N; Benhammou, Y; Benhar Noccioli, E; Benitez Garcia, J A; Benjamin, D P; Benoit, M; Bensinger, J R; Benslama, K; Bentvelsen, S; Berge, D; Bergeaas Kuutmann, E; Berger, N; Berghaus, F; Berglund, E; Beringer, J; Bernat, P; Bernhard, R; Bernius, C; Berry, T; Bertella, C; Bertin, A; Bertinelli, F; Bertolucci, F; Besana, M I; Besson, N; Bethke, S; Bhimji, W; Bianchi, R M; Bianco, M; Biebel, O; Bieniek, S P; Bierwagen, K; Biesiada, J; Biglietti, M; Bilokon, H; Bindi, M; Binet, S; Bingul, A; Bini, C; Biscarat, C; Bitenc, U; Black, K M; Blair, R E; Blanchard, J-B; Blanchot, G; Blazek, T; Blocker, C; Blocki, J; Blondel, A; Blum, W; Blumenschein, U; Bobbink, G J; Bobrovnikov, V B; Bocchetta, S S; Bocci, A; Boddy, C R; Boehler, M; Boek, J; Boelaert, N; Böser, S; Bogaerts, J A; Bogdanchikov, A; Bogouch, A; Bohm, C; Boisvert, V; Bold, T; Boldea, V; Bolnet, N M; Bona, M; Bondarenko, V G; Bondioli, M; Boonekamp, M; Boorman, G; Booth, C N; Bordoni, S; Borer, C; Borisov, A; Borissov, G; Borjanovic, I; Borroni, S; Bos, K; Boscherini, D; Bosman, M; Boterenbrood, H; Botterill, D; Bouchami, J; Boudreau, J; Bouhova-Thacker, E V; Boumediene, D; Bourdarios, C; Bousson, N; Boveia, A; Boyd, J; Boyko, I R; Bozhko, N I; Bozovic-Jelisavcic, I; Bracinik, J; Braem, A; Branchini, P; Brandenburg, 
G W; Brandt, A; Brandt, G; Brandt, O; Bratzler, U; Brau, B; Brau, J E; Braun, H M; Brelier, B; Bremer, J; Brenner, R; Bressler, S; Breton, D; Britton, D; Brochu, F M; Brock, I; Brock, R; Brodbeck, T J; Brodet, E; Broggi, F; Bromberg, C; Bronner, J; Brooijmans, G; Brooks, W K; Brown, G; Brown, H; Bruckman de Renstrom, P A; Bruncko, D; Bruneliere, R; Brunet, S; Bruni, A; Bruni, G; Bruschi, M; Buanes, T; Buat, Q; Bucci, F; Buchanan, J; Buchanan, N J; Buchholz, P; Buckingham, R M; Buckley, A G; Buda, S I; Budagov, I A; Budick, B; Büscher, V; Bugge, L; Bulekov, O; Bunse, M; Buran, T; Burckhart, H; Burdin, S; Burgess, T; Burke, S; Busato, E; Bussey, P; Buszello, C P; Butin, F; Butler, B; Butler, J M; Buttar, C M; Butterworth, J M; Buttinger, W; Cabrera Urbán, S; Caforio, D; Cakir, O; Calafiura, P; Calderini, G; Calfayan, P; Calkins, R; Caloba, L P; Caloi, R; Calvet, D; Calvet, S; Camacho Toro, R; Camarri, P; Cambiaghi, M; Cameron, D; Caminada, L M; Campana, S; Campanelli, M; Canale, V; Canelli, F; Canepa, A; Cantero, J; Capasso, L; Capeans Garrido, M D M; Caprini, I; Caprini, M; Capriotti, D; Capua, M; Caputo, R; Caramarcu, C; Cardarelli, R; Carli, T; Carlino, G; Carminati, L; Caron, B; Caron, S; Carrillo Montoya, G D; Carter, A A; Carter, J R; Carvalho, J; Casadei, D; Casado, M P; Cascella, M; Caso, C; Castaneda Hernandez, A M; Castaneda-Miranda, E; Castillo Gimenez, V; Castro, N F; Cataldi, G; Cataneo, F; Catinaccio, A; Catmore, J R; Cattai, A; Cattani, G; Caughron, S; Cauz, D; Cavalleri, P; Cavalli, D; Cavalli-Sforza, M; Cavasinni, V; Ceradini, F; Cerqueira, A S; Cerri, A; Cerrito, L; Cerutti, F; Cetin, S A; Cevenini, F; Chafaq, A; Chakraborty, D; Chan, K; Chapleau, B; Chapman, J D; Chapman, J W; Chareyre, E; Charlton, D G; Chavda, V; Chavez Barajas, C A; Cheatham, S; Chekanov, S; Chekulaev, S V; Chelkov, G A; Chelstowska, M A; Chen, C; Chen, H; Chen, S; Chen, T; Chen, X; Cheng, S; Cheplakov, A; Chepurnov, V F; Cherkaoui El Moursli, R; Chernyatin, V; Cheu, E; Cheung, 
S L; Chevalier, L; Chiefari, G; Chikovani, L; Childers, J T; Chilingarov, A; Chiodini, G; Chizhov, M V; Choudalakis, G; Chouridou, S; Christidi, I A; Christov, A; Chromek-Burckhart, D; Chu, M L; Chudoba, J; Ciapetti, G; Ciba, K; Ciftci, A K; Ciftci, R; Cinca, D; Cindro, V; Ciobotaru, M D; Ciocca, C; Ciocio, A; Cirilli, M; Citterio, M; Ciubancan, M; Clark, A; Clark, P J; Cleland, W; Clemens, J C; Clement, B; Clement, C; Clifft, R W; Coadou, Y; Cobal, M; Coccaro, A; Cochran, J; Coe, P; Cogan, J G; Coggeshall, J; Cogneras, E; Colas, J; Colijn, A P; Collins, N J; Collins-Tooth, C; Collot, J; Colon, G; Conde Muiño, P; Coniavitis, E; Conidi, M C; Consonni, M; Consorti, V; Constantinescu, S; Conta, C; Conventi, F; Cook, J; Cooke, M; Cooper, B D; Cooper-Sarkar, A M; Copic, K; Cornelissen, T; Corradi, M; Corriveau, F; Cortes-Gonzalez, A; Cortiana, G; Costa, G; Costa, M J; Costanzo, D; Costin, T; Côté, D; Coura Torres, R; Courneyea, L; Cowan, G; Cowden, C; Cox, B E; Cranmer, K; Crescioli, F; Cristinziani, M; Crosetti, G; Crupi, R; Crépé-Renaudin, S; Cuciuc, C-M; Cuenca Almenar, C; Cuhadar Donszelmann, T; Curatolo, M; Curtis, C J; Cuthbert, C; Cwetanski, P; Czirr, H; Czodrowski, P; Czyczula, Z; D'Auria, S; D'Onofrio, M; D'Orazio, A; Da Silva, P V M; Da Via, C; Dabrowski, W; Dai, T; Dallapiccola, C; Dam, M; Dameri, M; Damiani, D S; Danielsson, H O; Dannheim, D; Dao, V; Darbo, G; Darlea, G L; Daum, C; Davey, W; Davidek, T; Davidson, N; Davidson, R; Davies, E; Davies, M; Davison, A R; Davygora, Y; Dawe, E; Dawson, I; Dawson, J W; Daya-Ishmukhametova, R K; De, K; de Asmundis, R; De Castro, S; De Castro Faria Salgado, P E; De Cecco, S; de Graat, J; De Groot, N; de Jong, P; De la Taille, C; De la Torre, H; De Lotto, B; de Mora, L; De Nooij, L; De Pedis, D; De Salvo, A; De Sanctis, U; De Santo, A; De Vivie De Regie, J B; Dean, S; Dearnaley, W J; Debbe, R; Debenedetti, C; Dedovich, D V; Degenhardt, J; Dehchar, M; Del Papa, C; Del Peso, J; Del Prete, T; Delemontex, T; Deliyergiyev, M; 
Dell'Acqua, A; Dell'Asta, L; Della Pietra, M; della Volpe, D; Delmastro, M; Delruelle, N; Delsart, P A; Deluca, C; Demers, S; Demichev, M; Demirkoz, B; Deng, J; Denisov, S P; Derendarz, D; Derkaoui, J E; Derue, F; Dervan, P; Desch, K; Devetak, E; Deviveiros, P O; Dewhurst, A; DeWilde, B; Dhaliwal, S; Dhullipudi, R; Di Ciaccio, A; Di Ciaccio, L; Di Girolamo, A; Di Girolamo, B; Di Luise, S; Di Mattia, A; Di Micco, B; Di Nardo, R; Di Simone, A; Di Sipio, R; Diaz, M A; Diblen, F; Diehl, E B; Dietrich, J; Dietzsch, T A; Diglio, S; Dindar Yagci, K; Dingfelder, J; Dionisi, C; Dita, P; Dita, S; Dittus, F; Djama, F; Djobava, T; do Vale, M A B; Do Valle Wemans, A; Doan, T K O; Dobbs, M; Dobinson, R; Dobos, D; Dobson, E; Dodd, J; Doglioni, C; Doherty, T; Doi, Y; Dolejsi, J; Dolenc, I; Dolezal, Z; Dolgoshein, B A; Dohmae, T; Donadelli, M; Donega, M; Donini, J; Dopke, J; Doria, A; Dos Anjos, A; Dosil, M; Dotti, A; Dova, M T; Dowell, J D; Doxiadis, A D; Doyle, A T; Drasal, Z; Drees, J; Dressnandt, N; Drevermann, H; Driouichi, C; Dris, M; Dubbert, J; Dube, S; Duchovni, E; Duckeck, G; Dudarev, A; Dudziak, F; Dührssen, M; Duerdoth, I P; Duflot, L; Dufour, M-A; Dunford, M; Duran Yildiz, H; Duxfield, R; Dwuznik, M; Dydak, F; Düren, M; Ebenstein, W L; Ebke, J; Eckweiler, S; Edmonds, K; Edwards, C A; Edwards, N C; Ehrenfeld, W; Ehrich, T; Eifert, T; Eigen, G; Einsweiler, K; Eisenhandler, E; Ekelof, T; El Kacimi, M; Ellert, M; Elles, S; Ellinghaus, F; Ellis, K; Ellis, N; Elmsheuser, J; Elsing, M; Emeliyanov, D; Engelmann, R; Engl, A; Epp, B; Eppig, A; Erdmann, J; Ereditato, A; Eriksson, D; Ernst, J; Ernst, M; Ernwein, J; Errede, D; Errede, S; Ertel, E; Escalier, M; Escobar, C; Espinal Curull, X; Esposito, B; Etienne, F; Etienvre, A I; Etzion, E; Evangelakou, D; Evans, H; Fabbri, L; Fabre, C; Fakhrutdinov, R M; Falciano, S; Fang, Y; Fanti, M; Farbin, A; Farilla, A; Farley, J; Farooque, T; Farrington, S M; Farthouat, P; Fassnacht, P; Fassouliotis, D; Fatholahzadeh, B; Favareto, A; Fayard, 
L; Fazio, S; Febbraro, R; Federic, P; Fedin, O L; Fedorko, W; Fehling-Kaschek, M; Feligioni, L; Fellmann, D; Feng, C; Feng, E J; Fenyuk, A B; Ferencei, J; Ferland, J; Fernando, W; Ferrag, S; Ferrando, J; Ferrara, V; Ferrari, A; Ferrari, P; Ferrari, R; Ferrer, A; Ferrer, M L; Ferrere, D; Ferretti, C; Ferretto Parodi, A; Fiascaris, M; Fiedler, F; Filipčič, A; Filippas, A; Filthaut, F; Fincke-Keeler, M; Fiolhais, M C N; Fiorini, L; Firan, A; Fischer, G; Fischer, P; Fisher, M J; Flechl, M; Fleck, I; Fleckner, J; Fleischmann, P; Fleischmann, S; Flick, T; Flores Castillo, L R; Flowerdew, M J; Fokitis, M; Fonseca Martin, T; Forbush, D A; Formica, A; Forti, A; Fortin, D; Foster, J M; Fournier, D; Foussat, A; Fowler, A J; Fowler, K; Fox, H; Francavilla, P; Franchino, S; Francis, D; Frank, T; Franklin, M; Franz, S; Fraternali, M; Fratina, S; French, S T; Friedrich, F; Froeschl, R; Froidevaux, D; Frost, J A; Fukunaga, C; Fullana Torregrosa, E; Fuster, J; Gabaldon, C; Gabizon, O; Gadfort, T; Gadomski, S; Gagliardi, G; Gagnon, P; Galea, C; Gallas, E J; Gallo, V; Gallop, B J; Gallus, P; Gan, K K; Gao, Y S; Gapienko, V A; Gaponenko, A; Garberson, F; Garcia-Sciveres, M; García, C; García Navarro, J E; Gardner, R W; Garelli, N; Garitaonandia, H; Garonne, V; Garvey, J; Gatti, C; Gaudio, G; Gaumer, O; Gaur, B; Gauthier, L; Gavrilenko, I L; Gay, C; Gaycken, G; Gayde, J-C; Gazis, E N; Ge, P; Gee, C N P; Geerts, D A A; Geich-Gimbel, Ch; Gellerstedt, K; Gemme, C; Gemmell, A; Genest, M H; Gentile, S; George, M; George, S; Gerlach, P; Gershon, A; Geweniger, C; Ghazlane, H; Ghodbane, N; Giacobbe, B; Giagu, S; Giakoumopoulou, V; Giangiobbe, V; Gianotti, F; Gibbard, B; Gibson, A; Gibson, S M; Gilbert, L M; Gilewsky, V; Gillberg, D; Gillman, A R; Gingrich, D M; Ginzburg, J; Giokaris, N; Giordani, M P; Giordano, R; Giorgi, F M; Giovannini, P; Giraud, P F; Giugni, D; Giunta, M; Giusti, P; Gjelsten, B K; Gladilin, L K; Glasman, C; Glatzer, J; Glazov, A; Glitza, K W; Glonti, G L; Goddard, J R; 
Godfrey, J; Godlewski, J; Goebel, M; Göpfert, T; Goeringer, C; Gössling, C; Göttfert, T; Goldfarb, S; Golling, T; Golovnia, S N; Gomes, A; Gomez Fajardo, L S; Gonçalo, R; Goncalves Pinto Firmino Da Costa, J; Gonella, L; Gonidec, A; Gonzalez, S; González de la Hoz, S; Gonzalez Parra, G; Gonzalez Silva, M L; Gonzalez-Sevilla, S; Goodson, J J; Goossens, L; Gorbounov, P A; Gordon, H A; Gorelov, I; Gorfine, G; Gorini, B; Gorini, E; Gorišek, A; Gornicki, E; Gorokhov, S A; Goryachev, V N; Gosdzik, B; Gosselink, M; Gostkin, M I; Gough Eschrich, I; Gouighri, M; Goujdami, D; Goulette, M P; Goussiou, A G; Goy, C; Gozpinar, S; Grabowska-Bold, I; Grafström, P; Grahn, K-J; Grancagnolo, F; Grancagnolo, S; Grassi, V; Gratchev, V; Grau, N; Gray, H M; Gray, J A; Graziani, E; Grebenyuk, O G; Greenshaw, T; Greenwood, Z D; Gregersen, K; Gregor, I M; Grenier, P; Griffiths, J; Grigalashvili, N; Grillo, A A; Grinstein, S; Grishkevich, Y V; Grivaz, J-F; Groh, M; Gross, E; Grosse-Knetter, J; Groth-Jensen, J; Grybel, K; Guarino, V J; Guest, D; Guicheney, C; Guida, A; Guindon, S; Guler, H; Gunther, J; Guo, B; Guo, J; Gupta, A; Gusakov, Y; Gushchin, V N; Gutierrez, A; Gutierrez, P; Guttman, N; Gutzwiller, O; Guyot, C; Gwenlan, C; Gwilliam, C B; Haas, A; Haas, S; Haber, C; Hadavand, H K; Hadley, D R; Haefner, P; Hahn, F; Haider, S; Hajduk, Z; Hakobyan, H; Hall, D; Haller, J; Hamacher, K; Hamal, P; Hamer, M; Hamilton, A; Hamilton, S; Han, H; Han, L; Hanagaki, K; Hanawa, K; Hance, M; Handel, C; Hanke, P; Hansen, J R; Hansen, J B; Hansen, J D; Hansen, P H; Hansson, P; Hara, K; Hare, G A; Harenberg, T; Harkusha, S; Harper, D; Harrington, R D; Harris, O M; Harrison, K; Hartert, J; Hartjes, F; Haruyama, T; Harvey, A; Hasegawa, S; Hasegawa, Y; Hassani, S; Hatch, M; Hauff, D; Haug, S; Hauschild, M; Hauser, R; Havranek, M; Hawes, B M; Hawkes, C M; Hawkings, R J; Hawkins, A D; Hawkins, D; Hayakawa, T; Hayashi, T; Hayden, D; Hayward, H S; Haywood, S J; Hazen, E; He, M; Head, S J; Hedberg, V; Heelan, L; 
Heim, S; Heinemann, B; Heisterkamp, S; Helary, L; Heller, C; Heller, M; Hellman, S; Hellmich, D; Helsens, C; Henderson, R C W; Henke, M; Henrichs, A; Henriques Correia, A M; Henrot-Versille, S; Henry-Couannier, F; Hensel, C; Henß, T; Hernandez, C M; Hernández Jiménez, Y; Herrberg, R; Hershenhorn, A D; Herten, G; Hertenberger, R; Hervas, L; Hessey, N P; Higón-Rodriguez, E; Hill, D; Hill, J C; Hill, N; Hiller, K H; Hillert, S; Hillier, S J; Hinchliffe, I; Hines, E; Hirose, M; Hirsch, F; Hirschbuehl, D; Hobbs, J; Hod, N; Hodgkinson, M C; Hodgson, P; Hoecker, A; Hoeferkamp, M R; Hoffman, J; Hoffmann, D; Hohlfeld, M; Holder, M; Holmgren, S O; Holy, T; Holzbauer, J L; Homma, Y; Hong, T M; Hooft van Huysduynen, L; Horazdovsky, T; Horn, C; Horner, S; Hostachy, J-Y; Hou, S; Houlden, M A; Hoummada, A; Howarth, J; Howell, D F; Hristova, I; Hrivnac, J; Hruska, I; Hryn'ova, T; Hsu, P J; Hsu, S-C; Huang, G S; Hubacek, Z; Hubaut, F; Huegging, F; Huettmann, A; Huffman, T B; Hughes, E W; Hughes, G; Hughes-Jones, R E; Huhtinen, M; Hurst, P; Hurwitz, M; Husemann, U; Huseynov, N; Huston, J; Huth, J; Iacobucci, G; Iakovidis, G; Ibbotson, M; Ibragimov, I; Ichimiya, R; Iconomidou-Fayard, L; Idarraga, J; Iengo, P; Igonkina, O; Ikegami, Y; Ikeno, M; Ilchenko, Y; Iliadis, D; Ilic, N; Imbault, D; Imori, M; Ince, T; Inigo-Golfin, J; Ioannou, P; Iodice, M; Ippolito, V; Irles Quiles, A; Isaksson, C; Ishikawa, A; Ishino, M; Ishmukhametov, R; Issever, C; Istin, S; Ivashin, A V; Iwanski, W; Iwasaki, H; Izen, J M; Izzo, V; Jackson, B; Jackson, J N; Jackson, P; Jaekel, M R; Jain, V; Jakobs, K; Jakobsen, S; Jakubek, J; Jana, D K; Jankowski, E; Jansen, E; Jansen, H; Jantsch, A; Janus, M; Jarlskog, G; Jeanty, L; Jelen, K; Jen-La Plante, I; Jenni, P; Jeremie, A; Jež, P; Jézéquel, S; Jha, M K; Ji, H; Ji, W; Jia, J; Jiang, Y; Jimenez Belenguer, M; Jin, G; Jin, S; Jinnouchi, O; Joergensen, M D; Joffe, D; Johansen, L G; Johansen, M; Johansson, K E; Johansson, P; Johnert, S; Johns, K A; Jon-And, K; Jones, G; 
Jones, R W L; Jones, T W; Jones, T J; Jonsson, O; Joram, C; Jorge, P M; Joseph, J; Jovin, T; Ju, X; Jung, C A; Jungst, R M; Juranek, V; Jussel, P; Juste Rozas, A; Kabachenko, V V; Kabana, S; Kaci, M; Kaczmarska, A; Kadlecik, P; Kado, M; Kagan, H; Kagan, M; Kaiser, S; Kajomovitz, E; Kalinin, S; Kalinovskaya, L V; Kama, S; Kanaya, N; Kaneda, M; Kaneti, S; Kanno, T; Kantserov, V A; Kanzaki, J; Kaplan, B; Kapliy, A; Kaplon, J; Kar, D; Karagounis, M; Karagoz, M; Karnevskiy, M; Karr, K; Kartvelishvili, V; Karyukhin, A N; Kashif, L; Kasieczka, G; Kass, R D; Kastanas, A; Kataoka, M; Kataoka, Y; Katsoufis, E; Katzy, J; Kaushik, V; Kawagoe, K; Kawamoto, T; Kawamura, G; Kayl, M S; Kazanin, V A; Kazarinov, M Y; Keeler, R; Kehoe, R; Keil, M; Kekelidze, G D; Kennedy, J; Kenney, C J; Kenyon, M; Kepka, O; Kerschen, N; Kerševan, B P; Kersten, S; Kessoku, K; Keung, J; Khalil-zada, F; Khandanyan, H; Khanov, A; Kharchenko, D; Khodinov, A; Kholodenko, A G; Khomich, A; Khoo, T J; Khoriauli, G; Khoroshilov, A; Khovanskiy, N; Khovanskiy, V; Khramov, E; Khubua, J; Kim, H; Kim, M S; Kim, P C; Kim, S H; Kimura, N; Kind, O; King, B T; King, M; King, R S B; Kirk, J; Kirsch, L E; Kiryunin, A E; Kishimoto, T; Kisielewska, D; Kittelmann, T; Kiver, A M; Kladiva, E; Klaiber-Lodewigs, J; Klein, M; Klein, U; Kleinknecht, K; Klemetti, M; Klier, A; Klimek, P; Klimentov, A; Klingenberg, R; Klinkby, E B; Klioutchnikova, T; Klok, P F; Klous, S; Kluge, E-E; Kluge, T; Kluit, P; Kluth, S; Knecht, N S; Kneringer, E; Knobloch, J; Knoops, E B F G; Knue, A; Ko, B R; Kobayashi, T; Kobel, M; Kocian, M; Kodys, P; Köneke, K; König, A C; Koenig, S; Köpke, L; Koetsveld, F; Koevesarki, P; Koffas, T; Koffeman, E; Kogan, L A; Kohn, F; Kohout, Z; Kohriki, T; Koi, T; Kokott, T; Kolachev, G M; Kolanoski, H; Kolesnikov, V; Koletsou, I; Koll, J; Kollar, D; Kollefrath, M; Kolya, S D; Komar, A A; Komori, Y; Kondo, T; Kono, T; Kononov, A I; Konoplich, R; Konstantinidis, N; Kootz, A; Koperny, S; Korcyl, K; Kordas, K; Koreshev, V; 
Korn, A; Korol, A; Korolkov, I; Korolkova, E V; Korotkov, V A; Kortner, O; Kortner, S; Kostyukhin, V V; Kotamäki, M J; Kotov, S; Kotov, V M; Kotwal, A; Kourkoumelis, C; Kouskoura, V; Koutsman, A; Kowalewski, R; Kowalski, T Z; Kozanecki, W; Kozhin, A S; Kral, V; Kramarenko, V A; Kramberger, G; Krasny, M W; Krasznahorkay, A; Kraus, J; Kraus, J K; Kreisel, A; Krejci, F; Kretzschmar, J; Krieger, N; Krieger, P; Kroeninger, K; Kroha, H; Kroll, J; Kroseberg, J; Krstic, J; Kruchonak, U; Krüger, H; Kruker, T; Krumnack, N; Krumshteyn, Z V; Kruth, A; Kubota, T; Kuehn, S; Kugel, A; Kuhl, T; Kuhn, D; Kukhtin, V; Kulchitsky, Y; Kuleshov, S; Kummer, C; Kuna, M; Kundu, N; Kunkle, J; Kupco, A; Kurashige, H; Kurata, M; Kurochkin, Y A; Kus, V; Kuwertz, E S; Kuze, M; Kvita, J; Kwee, R; La Rosa, A; La Rotonda, L; Labarga, L; Labbe, J; Lablak, S; Lacasta, C; Lacava, F; Lacker, H; Lacour, D; Lacuesta, V R; Ladygin, E; Lafaye, R; Laforge, B; Lagouri, T; Lai, S; Laisne, E; Lamanna, M; Lampen, C L; Lampl, W; Lancon, E; Landgraf, U; Landon, M P J; Landsman, H; Lane, J L; Lange, C; Lankford, A J; Lanni, F; Lantzsch, K; Laplace, S; Lapoire, C; Laporte, J F; Lari, T; Larionov, A V; Larner, A; Lasseur, C; Lassnig, M; Laurelli, P; Lavrijsen, W; Laycock, P; Lazarev, A B; Le Dortz, O; Le Guirriec, E; Le Maner, C; Le Menedeu, E; Lebel, C; LeCompte, T; Ledroit-Guillon, F; Lee, H; Lee, J S H; Lee, S C; Lee, L; Lefebvre, M; Legendre, M; Leger, A; LeGeyt, B C; Legger, F; Leggett, C; Lehmacher, M; Lehmann Miotto, G; Lei, X; Leite, M A L; Leitner, R; Lellouch, D; Leltchouk, M; Lemmer, B; Lendermann, V; Leney, K J C; Lenz, T; Lenzen, G; Lenzi, B; Leonhardt, K; Leontsinis, S; Leroy, C; Lessard, J-R; Lesser, J; Lester, C G; Leung Fook Cheong, A; Levêque, J; Levin, D; Levinson, L J; Levitski, M S; Lewis, A; Lewis, G H; Leyko, A M; Leyton, M; Li, B; Li, H; Li, S; Li, X; Liang, Z; Liao, H; Liberti, B; Lichard, P; Lichtnecker, M; Lie, K; Liebig, W; Lifshitz, R; Limbach, C; Limosani, A; Limper, M; Lin, S C; 
Linde, F; Linnemann, J T; Lipeles, E; Lipinsky, L; Lipniacka, A; Liss, T M; Lissauer, D; Lister, A; Litke, A M; Liu, C; Liu, D; Liu, H; Liu, J B; Liu, M; Liu, S; Liu, Y; Livan, M; Livermore, S S A; Lleres, A; Llorente Merino, J; Lloyd, S L; Lobodzinska, E; Loch, P; Lockman, W S; Loddenkoetter, T; Loebinger, F K; Loginov, A; Loh, C W; Lohse, T; Lohwasser, K; Lokajicek, M; Loken, J; Lombardo, V P; Long, R E; Lopes, L; Lopez Mateos, D; Lorenz, J; Losada, M; Loscutoff, P; Lo Sterzo, F; Losty, M J; Lou, X; Lounis, A; Loureiro, K F; Love, J; Love, P A; Lowe, A J; Lu, F; Lubatti, H J; Luci, C; Lucotte, A; Ludwig, A; Ludwig, D; Ludwig, I; Ludwig, J; Luehring, F; Luijckx, G; Lumb, D; Luminari, L; Lund, E; Lund-Jensen, B; Lundberg, B; Lundberg, J; Lundquist, J; Lungwitz, M; Lutz, G; Lynn, D; Lys, J; Lytken, E; Ma, H; Ma, L L; Macana Goia, J A; Maccarrone, G; Macchiolo, A; Maček, B; Machado Miguens, J; Mackeprang, R; Madaras, R J; Mader, W F; Maenner, R; Maeno, T; Mättig, P; Mättig, S; Magnoni, L; Magradze, E; Mahalalel, Y; Mahboubi, K; Mahout, G; Maiani, C; Maidantchik, C; Maio, A; Majewski, S; Makida, Y; Makovec, N; Mal, P; Malaescu, B; Malecki, Pa; Malecki, P; Maleev, V P; Malek, F; Mallik, U; Malon, D; Malone, C; Maltezos, S; Malyshev, V; Malyukov, S; Mameghani, R; Mamuzic, J; Manabe, A; Mandelli, L; Mandić, I; Mandrysch, R; Maneira, J; Mangeard, P S; Manhaes de Andrade Filho, L; Manjavidze, I D; Mann, A; Manning, P M; Manousakis-Katsikakis, A; Mansoulie, B; Manz, A; Mapelli, A; Mapelli, L; March, L; Marchand, J F; Marchese, F; Marchiori, G; Marcisovsky, M; Marin, A; Marino, C P; Marroquim, F; Marshall, R; Marshall, Z; Martens, F K; Marti-Garcia, S; Martin, A J; Martin, B; Martin, B; Martin, F F; Martin, J P; Martin, Ph; Martin, T A; Martin, V J; Martin dit Latour, B; Martin-Haugh, S; Martinez, M; Martinez Outschoorn, V; Martyniuk, A C; Marx, M; Marzano, F; Marzin, A; Masetti, L; Mashimo, T; Mashinistov, R; Masik, J; Maslennikov, A L; Massa, I; Massaro, G; Massol, N; 
Mastrandrea, P; Mastroberardino, A; Masubuchi, T; Mathes, M; Matricon, P; Matsumoto, H; Matsunaga, H; Matsushita, T; Mattravers, C; Maugain, J M; Maurer, J; Maxfield, S J; Maximov, D A; May, E N; Mayne, A; Mazini, R; Mazur, M; Mazzanti, M; Mazzoni, E; Mc Kee, S P; McCarn, A; McCarthy, R L; McCarthy, T G; McCubbin, N A; McFarlane, K W; Mcfayden, J A; McGlone, H; Mchedlidze, G; McLaren, R A; Mclaughlan, T; McMahon, S J; McPherson, R A; Meade, A; Mechnich, J; Mechtel, M; Medinnis, M; Meera-Lebbai, R; Meguro, T; Mehdiyev, R; Mehlhase, S; Mehta, A; Meier, K; Meirose, B; Melachrinos, C; Mellado Garcia, B R; Mendoza Navas, L; Meng, Z; Mengarelli, A; Menke, S; Menot, C; Meoni, E; Mercurio, K M; Mermod, P; Merola, L; Meroni, C; Merritt, F S; Messina, A; Metcalfe, J; Mete, A S; Meyer, C; Meyer, C; Meyer, J-P; Meyer, J; Meyer, J; Meyer, T C; Meyer, W T; Miao, J; Michal, S; Micu, L; Middleton, R P; Migas, S; Mijović, L; Mikenberg, G; Mikestikova, M; Mikuž, M; Miller, D W; Miller, R J; Mills, W J; Mills, C; Milov, A; Milstead, D A; Milstein, D; Minaenko, A A; Miñano Moya, M; Minashvili, I A; Mincer, A I; Mindur, B; Mineev, M; Ming, Y; Mir, L M; Mirabelli, G; Miralles Verge, L; Misiejuk, A; Mitrevski, J; Mitrofanov, G Y; Mitsou, V A; Mitsui, S; Miyagawa, P S; Miyazaki, K; Mjörnmark, J U; Moa, T; Mockett, P; Moed, S; Moeller, V; Mönig, K; Möser, N; Mohapatra, S; Mohr, W; Mohrdieck-Möck, S; Moisseev, A M; Moles-Valls, R; Molina-Perez, J; Monk, J; Monnier, E; Montesano, S; Monticelli, F; Monzani, S; Moore, R W; Moorhead, G F; Mora Herrera, C; Moraes, A; Morange, N; Morel, J; Morello, G; Moreno, D; Moreno Llácer, M; Morettini, P; Morii, M; Morin, J; Morley, A K; Mornacchi, G; Morozov, S V; Morris, J D; Morvaj, L; Moser, H G; Mosidze, M; Moss, J; Mount, R; Mountricha, E; Mouraviev, S V; Moyse, E J W; Mudrinic, M; Mueller, F; Mueller, J; Mueller, K; Müller, T A; Mueller, T; Muenstermann, D; Muir, A; Munwes, Y; Murray, W J; Mussche, I; Musto, E; Myagkov, A G; Myska, M; Nadal, J; Nagai, 
K; Nagano, K; Nagasaka, Y; Nagel, M; Nairz, A M; Nakahama, Y; Nakamura, K; Nakamura, T; Nakano, I; Nanava, G; Napier, A; Narayan, R; Nash, M; Nation, N R; Nattermann, T; Naumann, T; Navarro, G; Neal, H A; Nebot, E; Nechaeva, P Yu; Negri, A; Negri, G; Nektarijevic, S; Nelson, A; Nelson, S; Nelson, T K; Nemecek, S; Nemethy, P; Nepomuceno, A A; Nessi, M; Neubauer, M S; Neusiedl, A; Neves, R M; Nevski, P; Newman, P R; Nguyen Thi Hong, V; Nickerson, R B; Nicolaidou, R; Nicolas, L; Nicquevert, B; Niedercorn, F; Nielsen, J; Niinikoski, T; Nikiforou, N; Nikiforov, A; Nikolaenko, V; Nikolaev, K; Nikolic-Audit, I; Nikolics, K; Nikolopoulos, K; Nilsen, H; Nilsson, P; Ninomiya, Y; Nisati, A; Nishiyama, T; Nisius, R; Nodulman, L; Nomachi, M; Nomidis, I; Nordberg, M; Nordkvist, B; Norton, P R; Novakova, J; Nozaki, M; Nozka, L; Nugent, I M; Nuncio-Quiroz, A-E; Nunes Hanninger, G; Nunnemann, T; Nurse, E; Nyman, T; O'Brien, B J; O'Neale, S W; O'Neil, D C; O'Shea, V; Oakes, L B; Oakham, F G; Oberlack, H; Ocariz, J; Ochi, A; Oda, S; Odaka, S; Odier, J; Ogren, H; Oh, A; Oh, S H; Ohm, C C; Ohshima, T; Ohshita, H; Ohsugi, T; Okada, S; Okawa, H; Okumura, Y; Okuyama, T; Olariu, A; Olcese, M; Olchevski, A G; Oliveira, M; Oliveira Damazio, D; Oliver Garcia, E; Olivito, D; Olszewski, A; Olszowska, J; Omachi, C; Onofre, A; Onyisi, P U E; Oram, C J; Oreglia, M J; Oren, Y; Orestano, D; Orlov, I; Oropeza Barrera, C; Orr, R S; Osculati, B; Ospanov, R; Osuna, C; Otero y Garzon, G; Ottersbach, J P; Ouchrif, M; Ouellette, E A; Ould-Saada, F; Ouraou, A; Ouyang, Q; Ovcharova, A; Owen, M; Owen, S; Ozcan, V E; Ozturk, N; Pacheco Pages, A; Padilla Aranda, C; Pagan Griso, S; Paganis, E; Paige, F; Pais, P; Pajchel, K; Palacino, G; Paleari, C P; Palestini, S; Pallin, D; Palma, A; Palmer, J D; Pan, Y B; Panagiotopoulou, E; Panes, B; Panikashvili, N; Panitkin, S; Pantea, D; Panuskova, M; Paolone, V; Papadelis, A; Papadopoulou, Th D; Paramonov, A; Park, W; Parker, M A; Parodi, F; Parsons, J A; Parzefall, U; 
Pasqualucci, E; Passaggio, S; Passeri, A; Pastore, F; Pastore, Fr; Pásztor, G; Pataraia, S; Patel, N; Pater, J R; Patricelli, S; Pauly, T; Pecsy, M; Pedraza Morales, M I; Peleganchuk, S V; Peng, H; Pengo, R; Penson, A; Penwell, J; Perantoni, M; Perez, K; Perez Cavalcanti, T; Perez Codina, E; Pérez García-Estañ, M T; Perez Reale, V; Perini, L; Pernegger, H; Perrino, R; Perrodo, P; Persembe, S; Perus, A; Peshekhonov, V D; Peters, K; Petersen, B A; Petersen, J; Petersen, T C; Petit, E; Petridis, A; Petridou, C; Petrolo, E; Petrucci, F; Petschull, D; Petteni, M; Pezoa, R; Phan, A; Phillips, P W; Piacquadio, G; Piccaro, E; Piccinini, M; Piec, S M; Piegaia, R; Pignotti, D T; Pilcher, J E; Pilkington, A D; Pina, J; Pinamonti, M; Pinder, A; Pinfold, J L; Ping, J; Pinto, B; Pirotte, O; Pizio, C; Plamondon, M; Pleier, M-A; Pleskach, A V; Poblaguev, A; Poddar, S; Podlyski, F; Poggioli, L; Poghosyan, T; Pohl, M; Polci, F; Polesello, G; Policicchio, A; Polini, A; Poll, J; Polychronakos, V; Pomarede, D M; Pomeroy, D; Pommès, K; Pontecorvo, L; Pope, B G; Popeneciu, G A; Popovic, D S; Poppleton, A; Portell Bueso, X; Posch, C; Pospelov, G E; Pospisil, S; Potrap, I N; Potter, C J; Potter, C T; Poulard, G; Poveda, J; Prabhu, R; Pralavorio, P; Pranko, A; Prasad, S; Pravahan, R; Prell, S; Pretzl, K; Pribyl, L; Price, D; Price, J; Price, L E; Price, M J; Prieur, D; Primavera, M; Prokofiev, K; Prokoshin, F; Protopopescu, S; Proudfoot, J; Prudent, X; Przybycien, M; Przysiezniak, H; Psoroulas, S; Ptacek, E; Pueschel, E; Purdham, J; Purohit, M; Puzo, P; Pylypchenko, Y; Qian, J; Qian, Z; Qin, Z; Quadt, A; Quarrie, D R; Quayle, W B; Quinonez, F; Raas, M; Radescu, V; Radics, B; Radloff, P; Rador, T; Ragusa, F; Rahal, G; Rahimi, A M; Rahm, D; Rajagopalan, S; Rammensee, M; Rammes, M; Randle-Conde, A S; Randrianarivony, K; Ratoff, P N; Rauscher, F; Raymond, M; Read, A L; Rebuzzi, D M; Redelbach, A; Redlinger, G; Reece, R; Reeves, K; Reichold, A; Reinherz-Aronis, E; Reinsch, A; Reisinger, I; 
Reljic, D; Rembser, C; Ren, Z L; Renaud, A; Renkel, P; Rescigno, M; Resconi, S; Resende, B; Reznicek, P; Rezvani, R; Richards, A; Richter, R; Richter-Was, E; Ridel, M; Rijpstra, M; Rijssenbeek, M; Rimoldi, A; Rinaldi, L; Rios, R R; Riu, I; Rivoltella, G; Rizatdinova, F; Rizvi, E; Robertson, S H; Robichaud-Veronneau, A; Robinson, D; Robinson, J E M; Robinson, M; Robson, A; Rocha de Lima, J G; Roda, C; Roda Dos Santos, D; Rodriguez, D; Roe, A; Roe, S; Røhne, O; Rojo, V; Rolli, S; Romaniouk, A; Romano, M; Romanov, V M; Romeo, G; Romero Adam, E; Roos, L; Ros, E; Rosati, S; Rosbach, K; Rose, A; Rose, M; Rosenbaum, G A; Rosenberg, E I; Rosendahl, P L; Rosenthal, O; Rosselet, L; Rossetti, V; Rossi, E; Rossi, L P; Rotaru, M; Roth, I; Rothberg, J; Rousseau, D; Royon, C R; Rozanov, A; Rozen, Y; Ruan, X; Rubinskiy, I; Ruckert, B; Ruckstuhl, N; Rud, V I; Rudolph, C; Rudolph, G; Rühr, F; Ruggieri, F; Ruiz-Martinez, A; Rumiantsev, V; Rumyantsev, L; Runge, K; Rurikova, Z; Rusakovich, N A; Rust, D R; Rutherfoord, J P; Ruwiedel, C; Ruzicka, P; Ryabov, Y F; Ryadovikov, V; Ryan, P; Rybar, M; Rybkin, G; Ryder, N C; Rzaeva, S; Saavedra, A F; Sadeh, I; Sadrozinski, H F-W; Sadykov, R; Safai Tehrani, F; Sakamoto, H; Salamanna, G; Salamon, A; Saleem, M; Salihagic, D; Salnikov, A; Salt, J; Salvachua Ferrando, B M; Salvatore, D; Salvatore, F; Salvucci, A; Salzburger, A; Sampsonidis, D; Samset, B H; Sanchez, A; Sandaker, H; Sander, H G; Sanders, M P; Sandhoff, M; Sandoval, T; Sandoval, C; Sandstroem, R; Sandvoss, S; Sankey, D P C; Sansoni, A; Santamarina Rios, C; Santoni, C; Santonico, R; Santos, H; Saraiva, J G; Sarangi, T; Sarkisyan-Grinbaum, E; Sarri, F; Sartisohn, G; Sasaki, O; Sasao, N; Satsounkevitch, I; Sauvage, G; Sauvan, E; Sauvan, J B; Savard, P; Savinov, V; Savu, D O; Sawyer, L; Saxon, D H; Says, L P; Sbarra, C; Sbrizzi, A; Scallon, O; Scannicchio, D A; Scarcella, M; Schaarschmidt, J; Schacht, P; Schäfer, U; Schaepe, S; Schaetzel, S; Schaffer, A C; Schaile, D; Schamberger, R D; 
Schamov, A G; Scharf, V; Schegelsky, V A; Scheirich, D; Schernau, M; Scherzer, M I; Schiavi, C; Schieck, J; Schioppa, M; Schlenker, S; Schlereth, J L; Schmidt, E; Schmieden, K; Schmitt, C; Schmitt, S; Schmitz, M; Schöning, A; Schott, M; Schouten, D; Schovancova, J; Schram, M; Schroeder, C; Schroer, N; Schuh, S; Schuler, G; Schultes, J; Schultz-Coulon, H-C; Schulz, H; Schumacher, J W; Schumacher, M; Schumm, B A; Schune, Ph; Schwanenberger, C; Schwartzman, A; Schwemling, Ph; Schwienhorst, R; Schwierz, R; Schwindling, J; Schwindt, T; Schwoerer, M; Scott, W G; Searcy, J; Sedov, G; Sedykh, E; Segura, E; Seidel, S C; Seiden, A; Seifert, F; Seixas, J M; Sekhniaidze, G; Selbach, K E; Seliverstov, D M; Sellden, B; Sellers, G; Seman, M; Semprini-Cesari, N; Serfon, C; Serin, L; Seuster, R; Severini, H; Sevior, M E; Sfyrla, A; Shabalina, E; Shamim, M; Shan, L Y; Shank, J T; Shao, Q T; Shapiro, M; Shatalov, P B; Shaver, L; Shaw, K; Sherman, D; Sherwood, P; Shibata, A; Shichi, H; Shimizu, S; Shimojima, M; Shin, T; Shiyakova, M; Shmeleva, A; Shochet, M J; Short, D; Shrestha, S; Shupe, M A; Sicho, P; Sidoti, A; Siegert, F; Sijacki, Dj; Silbert, O; Silva, J; Silver, Y; Silverstein, D; Silverstein, S B; Simak, V; Simard, O; Simic, Lj; Simion, S; Simmons, B; Simonyan, M; Sinervo, P; Sinev, N B; Sipica, V; Siragusa, G; Sircar, A; Sisakyan, A N; Sivoklokov, S Yu; Sjölin, J; Sjursen, T B; Skinnari, L A; Skottowe, H P; Skovpen, K; Skubic, P; Skvorodnev, N; Slater, M; Slavicek, T; Sliwa, K; Sloper, J; Smakhtin, V; Smirnov, S Yu; Smirnova, L N; Smirnova, O; Smith, B C; Smith, D; Smith, K M; Smizanska, M; Smolek, K; Snesarev, A A; Snow, S W; Snow, J; Snuverink, J; Snyder, S; Soares, M; Sobie, R; Sodomka, J; Soffer, A; Solans, C A; Solar, M; Solc, J; Soldatov, E; Soldevila, U; Solfaroli Camillocci, E; Solodkov, A A; Solovyanov, O V; Soni, N; Sopko, V; Sopko, B; Sosebee, M; Soualah, R; Soukharev, A; Spagnolo, S; Spanò, F; Spighi, R; Spigo, G; Spila, F; Spiwoks, R; Spousta, M; Spreitzer, T; 
Spurlock, B; St Denis, R D; Stahl, T; Stahlman, J; Stamen, R; Stanecka, E; Stanek, R W; Stanescu, C; Stapnes, S; Starchenko, E A; Stark, J; Staroba, P; Starovoitov, P; Staude, A; Stavina, P; Stavropoulos, G; Steele, G; Steinbach, P; Steinberg, P; Stekl, I; Stelzer, B; Stelzer, H J; Stelzer-Chilton, O; Stenzel, H; Stern, S; Stevenson, K; Stewart, G A; Stillings, J A; Stockton, M C; Stoerig, K; Stoicea, G; Stonjek, S; Strachota, P; Stradling, A R; Straessner, A; Strandberg, J; Strandberg, S; Strandlie, A; Strang, M; Strauss, E; Strauss, M; Strizenec, P; Ströhmer, R; Strom, D M; Strong, J A; Stroynowski, R; Strube, J; Stugu, B; Stumer, I; Stupak, J; Sturm, P; Styles, N A; Soh, D A; Su, D; Subramania, H S; Succurro, A; Sugaya, Y; Sugimoto, T; Suhr, C; Suita, K; Suk, M; Sulin, V V; Sultansoy, S; Sumida, T; Sun, X; Sundermann, J E; Suruliz, K; Sushkov, S; Susinno, G; Sutton, M R; Suzuki, Y; Suzuki, Y; Svatos, M; Sviridov, Yu M; Swedish, S; Sykora, I; Sykora, T; Szeless, B; Sánchez, J; Ta, D; Tackmann, K; Taffard, A; Tafirout, R; Taiblum, N; Takahashi, Y; Takai, H; Takashima, R; Takeda, H; Takeshita, T; Takubo, Y; Talby, M; Talyshev, A; Tamsett, M C; Tanaka, J; Tanaka, R; Tanaka, S; Tanaka, S; Tanaka, Y; Tanasijczuk, A J; Tani, K; Tannoury, N; Tappern, G P; Tapprogge, S; Tardif, D; Tarem, S; Tarrade, F; Tartarelli, G F; Tas, P; Tasevsky, M; Tassi, E; Tatarkhanov, M; Tayalati, Y; Taylor, C; Taylor, F E; Taylor, G N; Taylor, W; Teinturier, M; Teixeira Dias Castanheira, M; Teixeira-Dias, P; Temming, K K; Ten Kate, H; Teng, P K; Terada, S; Terashi, K; Terron, J; Testa, M; Teuscher, R J; Thadome, J; Therhaag, J; Theveneaux-Pelzer, T; Thioye, M; Thoma, S; Thomas, J P; Thompson, E N; Thompson, P D; Thompson, P D; Thompson, A S; Thomson, E; Thomson, M; Thun, R P; Tian, F; Tibbetts, M J; Tic, T; Tikhomirov, V O; Tikhonov, Y A; Timoshenko, S; Tipton, P; Tique Aires Viegas, F J; Tisserant, S; Toczek, B; Todorov, T; Todorova-Nova, S; Toggerson, B; Tojo, J; Tokár, S; Tokunaga, K; 
Tokushuku, K; Tollefson, K; Tomoto, M; Tompkins, L; Toms, K; Tong, G; Tonoyan, A; Topfel, C; Topilin, N D; Torchiani, I; Torrence, E; Torres, H; Torró Pastor, E; Toth, J; Touchard, F; Tovey, D R; Trefzger, T; Tremblet, L; Tricoli, A; Trigger, I M; Trincaz-Duvoid, S; Trinh, T N; Tripiana, M F; Trischuk, W; Trivedi, A; Trocmé, B; Troncon, C; Trottier-McDonald, M; Trzebinski, M; Trzupek, A; Tsarouchas, C; Tseng, J C-L; Tsiakiris, M; Tsiareshka, P V; Tsionou, D; Tsipolitis, G; Tsiskaridze, V; Tskhadadze, E G; Tsukerman, I I; Tsulaia, V; Tsung, J-W; Tsuno, S; Tsybychev, D; Tua, A; Tudorache, A; Tudorache, V; Tuggle, J M; Turala, M; Turecek, D; Turk Cakir, I; Turlay, E; Turra, R; Tuts, P M; Tykhonov, A; Tylmad, M; Tyndel, M; Tzanakos, G; Uchida, K; Ueda, I; Ueno, R; Ugland, M; Uhlenbrock, M; Uhrmacher, M; Ukegawa, F; Unal, G; Underwood, D G; Undrus, A; Unel, G; Unno, Y; Urbaniec, D; Usai, G; Uslenghi, M; Vacavant, L; Vacek, V; Vachon, B; Vahsen, S; Valenta, J; Valente, P; Valentinetti, S; Valkar, S; Valladolid Gallego, E; Vallecorsa, S; Valls Ferrer, J A; van der Graaf, H; van der Kraaij, E; Van der Leeuw, R; van der Poel, E; van der Ster, D; van Eldik, N; van Gemmeren, P; van Kesteren, Z; van Vulpen, I; Vanadia, M; Vandelli, W; Vandoni, G; Vaniachine, A; Vankov, P; Vannucci, F; Varela Rodriguez, F; Vari, R; Varnes, E W; Varouchas, D; Vartapetian, A; Varvell, K E; Vassilakopoulos, V I; Vazeille, F; Vegni, G; Veillet, J J; Vellidis, C; Veloso, F; Veness, R; Veneziano, S; Ventura, A; Ventura, D; Venturi, M; Venturi, N; Vercesi, V; Verducci, M; Verkerke, W; Vermeulen, J C; Vest, A; Vetterli, M C; Vichou, I; Vickey, T; Vickey Boeriu, O E; Viehhauser, G H A; Viel, S; Villa, M; Villaplana Perez, M; Vilucchi, E; Vincter, M G; Vinek, E; Vinogradov, V B; Virchaux, M; Virzi, J; Vitells, O; Viti, M; Vivarelli, I; Vives Vaque, F; Vlachos, S; Vladoiu, D; Vlasak, M; Vlasov, N; Vogel, A; Vokac, P; Volpi, G; Volpi, M; Volpini, G; von der Schmitt, H; von Loeben, J; von Radziewski, H; von 
Toerne, E; Vorobel, V; Vorobiev, A P; Vorwerk, V; Vos, M; Voss, R; Voss, T T; Vossebeld, J H; Vranjes, N; Vranjes Milosavljevic, M; Vrba, V; Vreeswijk, M; Vu Anh, T; Vuillermet, R; Vukotic, I; Wagner, W; Wagner, P; Wahlen, H; Wakabayashi, J; Walbersloh, J; Walch, S; Walder, J; Walker, R; Walkowiak, W; Wall, R; Waller, P; Wang, C; Wang, H; Wang, H; Wang, J; Wang, J; Wang, J C; Wang, R; Wang, S M; Warburton, A; Ward, C P; Warsinsky, M; Watkins, P M; Watson, A T; Watson, I J; Watson, M F; Watts, G; Watts, S; Waugh, A T; Waugh, B M; Weber, M; Weber, M S; Weber, P; Weidberg, A R; Weigell, P; Weingarten, J; Weiser, C; Wellenstein, H; Wells, P S; Wen, M; Wenaus, T; Wendler, S; Weng, Z; Wengler, T; Wenig, S; Wermes, N; Werner, M; Werner, P; Werth, M; Wessels, M; Weydert, C; Whalen, K; Wheeler-Ellis, S J; Whitaker, S P; White, A; White, M J; Whitehead, S R; Whiteson, D; Whittington, D; Wicek, F; Wicke, D; Wickens, F J; Wiedenmann, W; Wielers, M; Wienemann, P; Wiglesworth, C; Wiik-Fuchs, L A M; Wijeratne, P A; Wildauer, A; Wildt, M A; Wilhelm, I; Wilkens, H G; Will, J Z; Williams, E; Williams, H H; Willis, W; Willocq, S; Wilson, J A; Wilson, M G; Wilson, A; Wingerter-Seez, I; Winkelmann, S; Winklmeier, F; Wittgen, M; Wolter, M W; Wolters, H; Wong, W C; Wooden, G; Wosiek, B K; Wotschack, J; Woudstra, M J; Wozniak, K W; Wraight, K; Wright, C; Wright, M; Wrona, B; Wu, S L; Wu, X; Wu, Y; Wulf, E; Wunstorf, R; Wynne, B M; Xella, S; Xiao, M; Xie, S; Xie, Y; Xu, C; Xu, D; Xu, G; Yabsley, B; Yacoob, S; Yamada, M; Yamaguchi, H; Yamamoto, A; Yamamoto, K; Yamamoto, S; Yamamura, T; Yamanaka, T; Yamaoka, J; Yamazaki, T; Yamazaki, Y; Yan, Z; Yang, H; Yang, U K; Yang, Y; Yang, Y; Yang, Z; Yanush, S; Yao, Y; Yasu, Y; Ybeles Smit, G V; Ye, J; Ye, S; Yilmaz, M; Yoosoofmiya, R; Yorita, K; Yoshida, R; Young, C; Youssef, S; Yu, D; Yu, J; Yu, J; Yuan, L; Yurkewicz, A; Zabinski, B; Zaets, V G; Zaidan, R; Zaitsev, A M; Zajacova, Z; Zanello, L; Zarzhitsky, P; Zaytsev, A; Zeitnitz, C; Zeller, M; 
Zeman, M; Zemla, A; Zendler, C; Zenin, O; Ženiš, T; Zinonos, Z; Zenz, S; Zerwas, D; Zevi della Porta, G; Zhan, Z; Zhang, D; Zhang, H; Zhang, J; Zhang, X; Zhang, Z; Zhao, L; Zhao, T; Zhao, Z; Zhemchugov, A; Zheng, S; Zhong, J; Zhou, B; Zhou, N; Zhou, Y; Zhu, C G; Zhu, H; Zhu, J; Zhu, Y; Zhuang, X; Zhuravlov, V; Zieminska, D; Zimmermann, R; Zimmermann, S; Zimmermann, S; Ziolkowski, M; Zitoun, R; Živković, L; Zmouchko, V V; Zobernig, G; Zoccoli, A; Zolnierowski, Y; Zsenei, A; zur Nedden, M; Zutshi, V; Zwalinski, L

    2012-05-04

    The results of a search for pair production of the scalar partners of bottom quarks in 2.05 fb⁻¹ of pp collisions at √s = 7 TeV using the ATLAS experiment are reported. Scalar bottom quarks are searched for in events with large missing transverse momentum and two jets in the final state, where both jets are identified as originating from a bottom quark. In an R-parity-conserving minimal supersymmetric scenario, assuming that the scalar bottom quark decays exclusively into a bottom quark and a neutralino, 95% confidence-level upper limits are obtained in the b̃₁-χ̃₁⁰ mass plane such that for neutralino masses below 60 GeV scalar bottom masses up to 390 GeV are excluded.

  3. Fire and climate variation in western North America from fire-scar and tree-ring networks

    Treesearch

    Donald A. Falk; E. K. Heyerdahl; P. M. Brown; T. W. Swetnam; E. K. Sutherland; Z. Gedalof; L. Yocom; T. J. Brown

    2010-01-01

    Fire regimes (i.e., the pattern, frequency and intensity of fire in a region) reflect a complex interplay of bottom-up and top-down controls (Lertzman et al., 1998; McKenzie et al., in press). Bottom-up controls include local variations in topographic, fuel and weather factors at the time of a burn (e.g., fuel moisture and continuity, ignition density and local wind...

  4. Great expectations: top-down attention modulates the costs of clutter and eccentricity.

    PubMed

    Steelman, Kelly S; McCarley, Jason S; Wickens, Christopher D

    2013-12-01

    An experiment and modeling effort examined interactions between bottom-up and top-down attentional control in visual alert detection. Participants performed a manual tracking task while monitoring peripheral display channels for alerts of varying salience, eccentricity, and spatial expectancy. Spatial expectancy modulated the influence of salience and eccentricity; alerts in low-probability locations engendered higher miss rates, longer detection times, and larger costs of visual clutter and eccentricity, indicating that top-down attentional control offset the costs of poor bottom-up stimulus quality. Data were compared to the predictions of a computational model of scanning and noticing that incorporates bottom-up and top-down sources of attentional control. The model accounted well for the overall pattern of miss rates and response times, predicting each of the observed main effects and interactions. Empirical results suggest that designers should expect the costs of poor bottom-up visibility to be greater for low expectancy signals, and that the placement of alerts within a display should be determined based on the combination of alert expectancy and response priority. Model fits suggest that the current model can serve as a useful tool for exploring a design space as a precursor to empirical data collection and for generating hypotheses for future experiments.
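The kind of model the abstract describes, in which bottom-up salience and top-down expectancy jointly determine whether an alert is noticed, can be caricatured as follows; the additive score, the weights, and the logistic link are illustrative assumptions, not the study's fitted model:

```python
import math

def miss_probability(salience, eccentricity, expectancy,
                     w_sal=0.4, w_ecc=0.3, w_exp=0.5):
    """Toy noticing model: the probability of missing an alert falls with
    bottom-up salience and top-down expectancy, and rises with
    eccentricity. Weights and functional form are illustrative only."""
    score = w_sal * salience - w_ecc * eccentricity + w_exp * expectancy
    return 1.0 / (1.0 + math.exp(score))  # higher score -> fewer misses
```

Because the link is nonlinear, the simulated eccentricity penalty is largest when expectancy is low, which qualitatively echoes the reported interaction.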

  5. Saccade Generation by the Frontal Eye Fields in Rhesus Monkeys Is Separable from Visual Detection and Bottom-Up Attention Shift

    PubMed Central

    Lee, Kyoung-Min; Ahn, Kyung-Ha; Keller, Edward L.

    2012-01-01

    The frontal eye fields (FEF), originally identified as an oculomotor cortex, have also been implicated in perceptual functions, such as constructing a visual saliency map and shifting visual attention. Further dissecting the area’s role in the transformation from visual input to oculomotor command has been difficult because of spatial confounding between stimuli and responses and consequently between intermediate cognitive processes, such as attention shift and saccade preparation. Here we developed two tasks in which the visual stimulus and the saccade response were dissociated in space (the extended memory-guided saccade task), and bottom-up attention shift and saccade target selection were independent (the four-alternative delayed saccade task). Reversible inactivation of the FEF in rhesus monkeys disrupted, as expected, contralateral memory-guided saccades, but visual detection was demonstrated to be intact at the same field. Moreover, saccade behavior was impaired when a bottom-up shift of attention was not a prerequisite for saccade target selection, indicating that the inactivation effect was independent of the previously reported dysfunctions in bottom-up attention control. These findings underscore the motor aspect of the area’s functions, especially in situations where saccades are generated by internal cognitive processes, including visual short-term memory and long-term associative memory. PMID:22761923

  7. Theory-driven intervention for changing personality: expectancy value theory, behavioral activation, and conscientiousness.

    PubMed

    Magidson, Jessica F; Roberts, Brent W; Collado-Rodriguez, Anahi; Lejuez, C W

    2014-05-01

    Considerable evidence suggests that personality traits may be changeable, raising the possibility that the personality traits most linked to health problems can be modified with intervention. A growing body of research suggests that problematic personality traits may be altered with behavioral intervention using a bottom-up approach; that is, by targeting core behaviors that underlie personality traits with the goal of engendering new, healthier patterns of behavior that, over time, become automatized and manifest as changes in personality traits. Nevertheless, a bottom-up model for changing personality traits is somewhat diffuse and requires clearer integration of theory and relevant interventions to enable real clinical application. As such, this article proposes a set of guiding principles for theory-driven modification of targeted personality traits using a bottom-up approach, focusing specifically on targeting the trait of conscientiousness using a relevant behavioral intervention, Behavioral Activation (BA), considered within the motivational framework of expectancy value theory (EVT). We conclude with a real case example of the application of BA to alter behaviors counter to conscientiousness in a substance-dependent patient, highlighting the EVT principles most relevant to the approach and the importance and viability of a theoretically driven, bottom-up approach to changing personality traits.

  8. Graph-based layout analysis for PDF documents

    NASA Astrophysics Data System (ADS)

    Xu, Canhui; Tang, Zhi; Tao, Xin; Li, Yun; Shi, Cao

    2013-03-01

    To increase flexibility and enrich the reading experience of e-books on small portable screens, a graph-based method is proposed to perform layout analysis on Portable Document Format (PDF) documents. Digitally born documents have inherent advantages, such as representing text and fractional images in explicit form, which can be exploited straightforwardly. To integrate traditional image-based document analysis with the inherent metadata provided by a PDF parser, the page primitives, including text, image and path elements, are processed to produce text and non-text layers for separate analysis. The graph-based method is developed at the superpixel representation level, and page text elements corresponding to vertices are used to construct an undirected graph. The Euclidean distance between adjacent vertices is applied in a top-down manner to cut the graph tree formed by Kruskal's algorithm, and edge orientation is then used in a bottom-up manner to extract text lines from each subtree. Non-textual objects, on the other hand, are segmented by connected-component analysis. For each segmented text and non-text composite, a 13-dimensional feature vector is extracted for labelling purposes. Experimental results on selected pages from PDF books are presented.
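A minimal sketch of the top-down cutting step, under the assumption that cutting the Kruskal tree means removing minimum-spanning-tree edges whose Euclidean length exceeds a threshold (which makes it equivalent to single-linkage clustering), could look like this:

```python
import math
from itertools import combinations

def segment_text_elements(points, max_edge_len):
    """Group text-element centroids into candidate blocks: run Kruskal's
    algorithm over all pairwise edges in increasing length order, but
    stop before any edge longer than max_edge_len is added. The
    resulting union-find components are the cut subtrees."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    edges = sorted(
        (math.dist(points[a], points[b]), a, b)
        for a, b in combinations(range(len(points)), 2)
    )
    for length, a, b in edges:
        if length > max_edge_len:
            break  # top-down cut: these edges would merge distinct blocks
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[ra] = rb

    groups = {}
    for i in range(len(points)):
        groups.setdefault(find(i), set()).add(i)
    return list(groups.values())
```

The bottom-up text-line step in the paper would then inspect edge orientations inside each returned group; that refinement is omitted here.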

  9. A Physics-Inspired Mechanistic Model of Migratory Movement Patterns in Birds.

    PubMed

    Revell, Christopher; Somveille, Marius

    2017-08-29

    In this paper, we introduce a mechanistic model of migratory movement patterns in birds, inspired by ideas and methods from physics. Previous studies have shed light on the factors influencing bird migration but have mainly relied on statistical correlative analysis of tracking data. Our novel method offers a bottom-up explanation of population-level migratory movement patterns. It differs from previous mechanistic models of animal migration and enables predictions of pathways and destinations from a given starting location. We define an environmental potential landscape from environmental data and simulate bird movement within this landscape based on simple decision rules drawn from statistical mechanics. We explore the capacity of the model by qualitatively comparing simulation results to the non-breeding migration patterns of a seabird species, the Black-browed Albatross (Thalassarche melanophris). This minimal, two-parameter model was able to capture remarkably well the previously documented migration patterns of the Black-browed Albatross, with the best combination of parameter values conserved across multiple geographically separate populations. Our physics-inspired mechanistic model could be applied to other bird species and other highly mobile species, improving our understanding of the relative importance of various factors driving migration and making predictions that could be useful for conservation.
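Movement driven by "simple decision rules drawn from statistical mechanics" over a potential landscape can be illustrated with a one-dimensional toy: a walker that steps to a neighbouring cell with Boltzmann-weighted probability. The landscape, the single inverse-temperature parameter, and the 1-D grid are all simplifying assumptions for illustration, not the authors' model.

```python
import math
import random

def migrate(potential, start, steps, beta=2.0, rng=None):
    """Toy statistical-mechanics movement rule: at each step, move to
    (or stay at) a neighbouring cell with probability proportional to
    exp(-beta * potential), so the walker drifts toward low-potential
    (environmentally attractive) regions."""
    rng = rng or random.Random(0)
    pos = start
    for _ in range(steps):
        neighbours = [p for p in (pos - 1, pos, pos + 1)
                      if 0 <= p < len(potential)]
        weights = [math.exp(-beta * potential[p]) for p in neighbours]
        pos = rng.choices(neighbours, weights=weights)[0]
    return pos
```

Run repeatedly, walkers concentrate around the potential minimum, the toy analogue of a predicted destination.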

  10. Accounting for the Material Stock of Nations

    PubMed Central

    Fishman, Tomer; Schandl, Heinz; Tanikawa, Hiroki; Walker, Paul; Krausmann, Fridolin

    2014-01-01

    National material stock (MS) accounts have been a neglected field of analysis in industrial ecology, possibly because of the difficulty in establishing such accounts. In this research, we propose a novel method to model national MS based on historical material flow data. This enables us to avoid the laborious data work involved with bottom-up accounts for stocks and to arrive at plausible levels of stock accumulation for nations. We apply the method for the United States and Japan to establish a proof of concept for two very different cases of industrial development. Looking at a period of 75 years (1930–2005), we find that per capita MS has been much higher in the United States for the entire period, but that Japan has experienced much higher growth rates throughout, in line with Japan's late industrial development. By 2005, however, both Japan and the United States arrive at a very similar level of national MS of 310 to 375 tonnes per capita, respectively. This research provides new insight into the relationship between MS and flows in national economies and enables us to extend the debate about material efficiency from a narrow perspective of throughput to a broader perspective of stocks. PMID:25505368
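The core accounting idea, accumulating a national material stock from historical flow data instead of counting objects bottom-up, can be sketched minimally; the constant retirement rate below is an illustrative assumption, not the paper's calibration.

```python
def stock_from_flows(inflows, retirement_rate=0.02, initial_stock=0.0):
    """Flow-driven stock model: each year the stock grows by that year's
    material inflow and shrinks by a fixed fraction retired from use.
    Returns the stock level after each year."""
    stock = initial_stock
    series = []
    for inflow in inflows:
        stock = stock * (1.0 - retirement_rate) + inflow
        series.append(stock)
    return series
```

With a zero retirement rate the stock is just the cumulative sum of inflows; a positive rate makes the stock saturate when inflows level off.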

  12. Active noise attenuation in ventilation windows.

    PubMed

    Huang, Huahua; Qiu, Xiaojun; Kang, Jian

    2011-07-01

    The feasibility of applying active noise control techniques to attenuate low frequency noise transmission through a natural ventilation window into a room is investigated analytically and experimentally. The window system is constructed by staggering the opening sashes of a spaced double-glazing window to allow ventilation and natural light. An analytical model based on the modal expansion method is developed to calculate the low frequency sound field inside the window and the room and is used in the active noise control simulations. The effectiveness of the proposed analytical model is validated using the finite element method. The performance of the active control system is compared for different source and receiver configurations; the numerical and experimental results are in good agreement, and the best result is achieved when the secondary sources are placed at the center of the bottom of the staggered window. The extra attenuation at the observation points in the optimized window system is almost equivalent to the noise reduction at the error sensor, and the frequency range of effective control extends up to 390 Hz for a single channel active noise control system. © 2011 Acoustical Society of America
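
The single-channel control idea can be illustrated with a toy adaptive-filter sketch. For simplicity the secondary path is assumed to be an identity (so filtered-x LMS reduces to plain LMS); the tone frequency, filter length, and step size below are invented for illustration, not taken from the study.

```python
# Toy single-channel adaptive noise cancellation: an FIR filter driven by a
# reference tone learns to generate anti-noise that cancels the disturbance
# at the error sensor. Identity secondary path assumed (FxLMS -> plain LMS).
import math

fs, f0, n = 8000, 200, 4000
ref = [math.sin(2 * math.pi * f0 * i / fs) for i in range(n)]   # reference
disturb = [0.8 * r for r in ref]                                # noise at error mic
L, mu = 8, 0.01                                                 # taps, step size
w = [0.0] * L
err = []
for i in range(n):
    x = [ref[i - k] if i - k >= 0 else 0.0 for k in range(L)]   # tap-delay line
    y = sum(wk * xk for wk, xk in zip(w, x))                    # anti-noise output
    e = disturb[i] - y                                          # residual error
    err.append(e)
    w = [wk + mu * e * xk for wk, xk in zip(w, x)]              # LMS update
early = sum(v * v for v in err[:500])    # residual energy while adapting
late = sum(v * v for v in err[-500:])    # residual energy after convergence
```

After convergence the residual energy collapses relative to the start, mirroring the attenuation-at-the-error-sensor behavior reported above.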

  13. An integrative cross-design synthesis approach to estimate the cost of illness: an applied case to the cost of depression in Catalonia.

    PubMed

    Bendeck, Murielle; Serrano-Blanco, Antoni; García-Alonso, Carlos; Bonet, Pere; Jordà, Esther; Sabes-Figuera, Ramon; Salvador-Carulla, Luis

    2013-04-01

    Cost of illness (COI) studies are carried out under conditions of uncertainty and with incomplete information. There are concerns regarding their generalisability, accuracy and usability in evidence-informed care. A hybrid methodology is used to estimate the regional costs of depression in Catalonia (Spain) following an integrative approach. The cross-design synthesis included nominal groups and quantitative analysis of both top-down and bottom-up studies, and incorporated primary and secondary data from different sources of information in Catalonia. Sensitivity analysis used probabilistic Monte Carlo simulation modelling. A dissemination strategy was planned, including a standard form adapted from cost-effectiveness studies to summarise methods and results. The method used allows for a comprehensive estimate of the cost of depression in Catalonia. Health officers and decision-makers concluded that this methodology provided useful information and knowledge for evidence-informed planning in mental health. The mix of methods, combined with a simulation model, contributed to a reduction in data gaps and, in conditions of uncertainty, supplied more complete information on the costs of depression in Catalonia. This approach to COI should be differentiated from other COI designs to allow like-with-like comparisons. A consensus on COI typology, procedures and dissemination is needed.
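
The probabilistic Monte Carlo sensitivity analysis mentioned can be sketched as repeated re-sampling of cost components from assumed distributions. The component breakdown, the triangular distributions, and every figure below are hypothetical placeholders, not the Catalan estimates.

```python
# Hedged sketch of a probabilistic sensitivity analysis for a cost-of-illness
# estimate: total cost is re-sampled from assumed per-component distributions,
# yielding a mean and a 95% uncertainty interval. All parameters illustrative.
import random

random.seed(1)

def simulate_total_cost(n_draws=10000):
    totals = []
    for _ in range(n_draws):
        direct = random.triangular(80e6, 160e6, 120e6)     # health-care costs
        indirect = random.triangular(150e6, 350e6, 250e6)  # productivity losses
        totals.append(direct + indirect)
    totals.sort()
    mean = sum(totals) / len(totals)
    ci = (totals[int(0.025 * len(totals))], totals[int(0.975 * len(totals))])
    return mean, ci

mean, (lo, hi) = simulate_total_cost()
# mean is near the sum of the component means, with a 95% interval around it
```

The interval width is what communicates how far the incomplete underlying data could plausibly move the headline cost figure.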

  14. Multiscale modeling of the detonation of aluminized explosives using the SPH-MD-QM method

    NASA Astrophysics Data System (ADS)

    Peng, Qing; Wang, Guangyu; Liu, Gui-Rong; de, Suvranu

    Aluminized explosives have been used in the military industry for decades. Compared with ideal explosives, aluminized explosives feature both fast detonation chemistry and slow metal combustion chemistry, generating a complex multi-phase reactive flow. Here, we introduce a sequential multiscale SPH-MD-QM model to simulate the detonation behavior of aluminized explosives. At the bottom level, first-principles quantum mechanics (QM) calculations are employed to obtain the training sets for fitting the ReaxFF potentials, which are used in turn in reactive molecular dynamics (MD) simulations at the middle level to obtain chemical reaction rates and equations of state. At the top level, a smoothed particle hydrodynamics (SPH) method incorporating an ignition-and-growth model and an afterburning model is used to simulate the detonation and combustion of the aluminized explosive. Simulation results are compared with experiment and good agreement is observed. The proposed multiscale SPH-MD-QM method could be used to optimize the performance of aluminized explosives. The authors would like to acknowledge the generous financial support from the Defense Threat Reduction Agency (DTRA) Grant No. HDTRA1-13-1-0025 and the Office of Naval Research Grants ONR Award No. N00014-08-1-0462 and No. N00014-12-1-0527.

  15. Contact Trees: Network Visualization beyond Nodes and Edges

    PubMed Central

    Sallaberry, Arnaud; Fu, Yang-chih; Ho, Hwai-Chung; Ma, Kwan-Liu

    2016-01-01

    Node-Link diagrams make it possible to take a quick glance at how nodes (or actors) in a network are connected by edges (or ties). A conventional network diagram of a “contact tree” maps out a root and branches that represent the structure of nodes and edges, often without further specifying leaves or fruits that would have grown from small branches. By furnishing such a network structure with leaves and fruits, we reveal details about “contacts” in our ContactTrees upon which ties and relationships are constructed. Our elegant design employs a bottom-up approach that resembles a recent attempt to understand subjective well-being by means of a series of emotions. Such a bottom-up approach to social-network studies decomposes each tie into a series of interactions or contacts, which can help deepen our understanding of the complexity embedded in a network structure. Unlike previous network visualizations, ContactTrees highlight how relationships form and change based upon interactions among actors, as well as how relationships and networks vary by contact attributes. Based on a botanical tree metaphor, the design is easy to construct and the resulting tree-like visualization can display many properties at both tie and contact levels, thus recapturing a key ingredient missing from conventional techniques of network visualization. We demonstrate ContactTrees using data sets consisting of up to three waves of 3-month contact diaries over the 2004-2012 period, and discuss how this design can be applied to other types of datasets. PMID:26784350

  16. Spectroscopic and Mechanical Properties of a New Generation of Bulk Fill Composites

    PubMed Central

    Monterubbianesi, Riccardo; Orsini, Giovanna; Tosi, Giorgio; Conti, Carla; Librando, Vito; Procaccini, Maurizio; Putignano, Angelo

    2016-01-01

    Objectives: The aims of this study were to evaluate in vitro the degree of conversion and the microhardness of five bulk fill resin composites, and to analyze the performance of two curing lamps used for composite polymerization. Materials and Methods: The following five resin-based bulk fill composites were tested: SureFil SDR®, Fill Up!™, Filtek™, SonicFill™, and SonicFill2™. Samples 4 mm in thickness were prepared using Teflon molds filled in one increment and light-polymerized using two LED power units. Ten samples of each composite were cured using Elipar S10 and ten using Demi Ultra. Additional SonicFill2 samples (3 and 5 mm thick) were also tested. The degree of conversion (DC) was determined by Raman spectroscopy, while the Vickers microhardness (VMH) was evaluated using a microhardness tester. The experimental evaluation was carried out on the top and bottom sides immediately after curing (t0) and, on the bottom, after 24 h (t24). Two-way analysis of variance was applied to evaluate DC and VMH values. In all analyses, the level of significance was set at p < 0.05. Results: All bulk fill resin composites recorded satisfactory DCs on the top and bottom sides. At t0, the tops of SDR and SonicFill2 showed the highest DC values (85.56 ± 9.52 and 85.47 ± 1.90, respectively) when cured using Elipar S10; using Demi Ultra, SonicFill2 showed the highest DC value (90.53 ± 2.18). At t0, the highest bottom-side DC values were recorded by SDR (84.64 ± 11.68) when cured using Elipar S10, and by Filtek (81.52 ± 4.14) using Demi Ultra. On the top sides, the Demi Ultra lamp yielded significantly higher DCs than the Elipar S10 (p < 0.05). SonicFill2 also reached suitable DCs on the bottom of 5 mm-thick samples. At t0, VMH values ranged between 24.4 and 69.18 for Elipar S10 and between 26.5 and 67.3 for Demi Ultra. With both lamps, the lowest VMH values were shown by SDR and the highest by SonicFill2. At t24, all DC and VMH values increased significantly. Conclusions: Differences in DC and VMH among materials are suggested to be material and curing-lamp dependent. Even at t0, the three high-viscosity bulk composites showed higher VMH than the flowable or dual-curing composites. PMID:28082918

  17. Spectroscopic and Mechanical Properties of a New Generation of Bulk Fill Composites.

    PubMed

    Monterubbianesi, Riccardo; Orsini, Giovanna; Tosi, Giorgio; Conti, Carla; Librando, Vito; Procaccini, Maurizio; Putignano, Angelo

    2016-01-01

    Objectives: The aims of this study were to evaluate in vitro the degree of conversion and the microhardness of five bulk fill resin composites, and to analyze the performance of two curing lamps used for composite polymerization. Materials and Methods: The following five resin-based bulk fill composites were tested: SureFil SDR®, Fill Up!™, Filtek™, SonicFill™, and SonicFill2™. Samples 4 mm in thickness were prepared using Teflon molds filled in one increment and light-polymerized using two LED power units. Ten samples of each composite were cured using Elipar S10 and ten using Demi Ultra. Additional SonicFill2 samples (3 and 5 mm thick) were also tested. The degree of conversion (DC) was determined by Raman spectroscopy, while the Vickers microhardness (VMH) was evaluated using a microhardness tester. The experimental evaluation was carried out on the top and bottom sides immediately after curing (t0) and, on the bottom, after 24 h (t24). Two-way analysis of variance was applied to evaluate DC and VMH values. In all analyses, the level of significance was set at p < 0.05. Results: All bulk fill resin composites recorded satisfactory DCs on the top and bottom sides. At t0, the tops of SDR and SonicFill2 showed the highest DC values (85.56 ± 9.52 and 85.47 ± 1.90, respectively) when cured using Elipar S10; using Demi Ultra, SonicFill2 showed the highest DC value (90.53 ± 2.18). At t0, the highest bottom-side DC values were recorded by SDR (84.64 ± 11.68) when cured using Elipar S10, and by Filtek (81.52 ± 4.14) using Demi Ultra. On the top sides, the Demi Ultra lamp yielded significantly higher DCs than the Elipar S10 (p < 0.05). SonicFill2 also reached suitable DCs on the bottom of 5 mm-thick samples. At t0, VMH values ranged between 24.4 and 69.18 for Elipar S10 and between 26.5 and 67.3 for Demi Ultra. With both lamps, the lowest VMH values were shown by SDR and the highest by SonicFill2. At t24, all DC and VMH values increased significantly. Conclusions: Differences in DC and VMH among materials are suggested to be material and curing-lamp dependent. Even at t0, the three high-viscosity bulk composites showed higher VMH than the flowable or dual-curing composites.
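
Raman-based degree-of-conversion values like those reported are conventionally computed from the ratio of the aliphatic C=C band to an internal reference band before and after curing. A minimal sketch of that standard formula follows; the peak intensities are made up, and the band positions named in the comment are the commonly used ones, not values taken from this paper.

```python
# Standard Raman degree-of-conversion (DC) formula: the aliphatic C=C band
# (commonly ~1640 cm^-1) is normalized against an internal reference band
# (commonly ~1610 cm^-1) in the cured and uncured states. Intensities below
# are illustrative.

def degree_of_conversion(cured_cc, cured_ref, uncured_cc, uncured_ref):
    """DC (%) from baseline-corrected Raman peak intensities."""
    r_cured = cured_cc / cured_ref
    r_uncured = uncured_cc / uncured_ref
    return (1.0 - r_cured / r_uncured) * 100.0

dc = degree_of_conversion(cured_cc=0.30, cured_ref=1.0,
                          uncured_cc=2.0, uncured_ref=1.0)
# → 85.0 % conversion for these illustrative intensities
```

A DC in the mid-80s, as reported for SDR and SonicFill2 tops, means only ~15% of the polymerizable C=C bonds remain unreacted.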

  18. Heat sinking for printed circuitry

    DOEpatents

    Wilson, S.K.; Richardson, G.; Pinkerton, A.L.

    1984-09-11

    A flat pak or other solid-state device is mounted on a printed circuit board directly over a hole extending therethrough, so that the bottom of the pak or device extends beyond the bottom of the circuit board. A heat sink disposed beneath the circuit board contacts the bottom of the pak or device and provides direct heat sinking thereto. Pressure may be applied to the top of the pak or device to assure good mechanical and thermal contact with the heat sink.

  19. Facile fabrication of uniaxial nanopatterns on shape memory polymer substrates using a complete bottom-up approach

    NASA Astrophysics Data System (ADS)

    Chen, Zhongbi; Krishnaswamy, Sridhar

    2014-03-01

    In earlier work, we have demonstrated an assisted self-assembly fabrication method for unidirectional submicron patterns using pre-programmed shape memory polymers (SMP) as the substrate in an organic/inorganic bilayer structure. In this paper, we propose a complete bottom-up method for fabrication of uniaxial wrinkles whose wavelength is below 300 nm. The method starts with using the aforementioned self-assembled bi-layer wrinkled surface as the template to make a replica of surface wrinkles on a PDMS layer which is spin-coated on a pre-programmed SMP substrate. When the shape recovery of the substrate is triggered by heating it to its transition temperature, the substrate has been programmed in such a way that it shrinks uniaxially to return to its permanent shape. Consequently, the wrinkle wavelength on PDMS reduces accordingly. A subsequent contact molding process is carried out on the PDMS layer spin-coated on another pre-programmed SMP substrate, but using the wrinkled PDMS surface obtained in the previous step as the master. By activating the shape recovery of the substrate, the wrinkle wavelength is further reduced a second time in a similar fashion. Our experiments showed that the starting wavelength of 640 nm decreased to 290 nm after two cycles of recursive molding. We discuss the advantages and limitations of our recursive molding approach compared to the prevalent top-down fabrication methods represented by lithography. The present study is expected to offer a simple and cost-effective fabrication method of nano-scale uniaxial wrinkle patterns with the potential for large-scale mass-production.
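
The reported reduction from 640 nm to 290 nm over two recursive molding cycles implies a per-cycle shrink ratio. A quick back-of-the-envelope sketch, assuming the same uniaxial recovery ratio each cycle (which the abstract does not state explicitly):

```python
# If each replication/recovery cycle shrinks the wrinkle wavelength by the
# substrate's uniaxial recovery ratio r, then after n cycles the wavelength
# is w0 * r**n. Here r is inferred from the reported endpoints.

w0, wn, n = 640.0, 290.0, 2        # start/end wavelengths (nm), cycles
r = (wn / w0) ** (1.0 / n)         # implied per-cycle shrink ratio, ~0.67
after_one = w0 * r                 # estimated wavelength after one cycle
```

Under this assumption a single intermediate cycle would sit near 430 nm, and a hypothetical third cycle would push the wavelength below 200 nm.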

  20. Design of the Bottom-up Innovation project - a participatory, primary preventive, organizational level intervention on work-related stress and well-being for workers in Dutch vocational education

    PubMed Central

    2013-01-01

    Background In the educational sector, job demands have intensified while job resources have remained the same. A prolonged imbalance between demands and resources contributes to lowered vitality and a heightened need for recovery, eventually resulting in burnout, sickness absence and retention problems. Until now, stress management interventions in education have focused mostly on strengthening the individual capacity to cope with stress, instead of altering the sources of stress at work at the organizational level. These interventions have been only partly effective in influencing burnout and well-being. Therefore, the “Bottom-up Innovation” project tests a two-phased participatory, primary preventive organizational level intervention (i.e. a participatory action approach) that targets and engages all workers in the primary process of schools. It is hypothesized that participating in the project results in increased occupational self-efficacy and organizational efficacy. The central research question is: is an organization-focused stress management intervention based on participatory action effective in reducing the need for recovery and enhancing vitality in school employees in comparison to business as usual? Methods/Design The study is designed as a controlled trial with mixed methods and three measurement moments: baseline (quantitative measures), six months and 18 months (quantitative and qualitative measures). At the first follow-up, short-term effects of taking part in the needs assessment (phase 1) will be determined. At the second follow-up, the long-term effects of taking part in the needs assessment will be determined, as well as the effects of implemented tailored workplace solutions (phase 2). A process evaluation based on quantitative and qualitative data will shed light on whether, how and why the intervention (does not) work(s). Discussion “Bottom-up Innovation” is a combined effort of the educational sector, intervention providers and researchers. Results will provide insight into (1) the relation between participating in the intervention and occupational and organizational self-efficacy, (2) how an improved balance between job demands and job resources might affect need for recovery and vitality, in the short and long term, from an organizational perspective, and (3) success and failure factors for implementation of an organizational intervention. Trial registration number Netherlands Trial Register NTR3284 PMID:23947538

  1. The Bottom-up Move within Vygotsky's Zone of Proximal Development: A Pedagogical Application for Teaching Agreement in Spanish as a Foreign Language

    ERIC Educational Resources Information Center

    Escandon, Arturo; Sanz, Montserrat

    2011-01-01

    This paper presents the findings of a longitudinal study in which two instructional methods to teach agreement features to first-year university students specializing in Spanish in Japan are compared. On the one hand, the control group was exposed to the traditional top-down teaching of agreement paradigms and were instructed to practice them…

  2. Criteria for Comparing Domain Analysis Approaches Version 01.00.00

    DTIC Science & Technology

    1991-12-01

    Fragmentary excerpts from the report: Figure 7. Top-Down-Bottom-Up Domain Analysis Process (1990 Version); Figure 8. FODA's Domain Analysis Process; "...FODA, which uses the Design Approach for Real-Time Systems (DARTS) design method (Gomaa 1984)?"; 1. Introduction: "Domain analysis is still immature..."; 2. An Overview of Some Domain Analysis Approaches; 2.4.3 Examples: "The FODA report illustrates the process by using the window management..."

  3. Constraining East Asian CO2 emissions with GOSAT retrievals: methods and policy implications

    NASA Astrophysics Data System (ADS)

    Shim, C.; Henze, D. K.; Deng, F.

    2017-12-01

    The world's largest CO2 emissions are from East Asia. However, there are large uncertainties in CO2 emission inventories, mainly because of imperfections in bottom-up statistics and a lack of observations for validating emission fluxes, particularly over China. Here we constrain East Asian CO2 emissions with GOSAT retrievals, applying the 4D-Var GEOS-Chem model and its adjoint. We applied the inversion only to the cold season (November - February) of 2009 - 2010, since the summer monsoon and greater transboundary impacts in spring and fall greatly reduce the number of usable GOSAT retrievals. In the cold season, the a posteriori CO2 emissions over East Asia are generally higher by 5 - 20%; Northeastern China in particular shows substantially higher a posteriori emissions (~20%), a region where the Chinese government is currently focusing on mitigating air pollutants. On the other hand, a posteriori emissions from Southern China are lower by 10 - 25%. A posteriori emissions in Korea and Japan are mostly higher by 10%, except over the Kyushu region. With our top-down estimates from the 4D-Var CO2 inversion, we will evaluate the current regional CO2 emission inventories and potential uncertainties in the sectoral emissions. This study will help provide quantitative information on anthropogenic CO2 emissions over East Asia and will give policy implications for mitigation targets.
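
A toy one-parameter analogue of such a top-down inversion: choose the emission scaling factor that minimizes a quadratic cost combining a prior (bottom-up inventory) term and an observation term. The linear forward model and every number below are illustrative; a real GEOS-Chem 4D-Var inversion optimizes gridded fluxes through a full transport model and its adjoint.

```python
# Sketch of the variational idea: minimize
#   J(s) = (s - s_prior)^2 / sigma_b^2 + sum((y_i - s*h_i)^2) / sigma_o^2
# for a linear forward model y = s * h. For this scalar case the minimizer
# has a closed form. All values illustrative.

def optimal_scaling(s_prior, sigma_b, obs, model, sigma_o):
    """A posteriori emission scaling factor for a 1-D linear inversion."""
    num = s_prior / sigma_b**2 + sum(y * h for y, h in zip(obs, model)) / sigma_o**2
    den = 1.0 / sigma_b**2 + sum(h * h for h in model) / sigma_o**2
    return num / den

h = [2.0, 1.5, 3.0]            # modeled column enhancements per unit emission
y = [v * 1.15 for v in h]      # "observed" enhancements: 15% above the prior
s_post = optimal_scaling(1.0, 0.5, y, h, 0.1)
# s_post lies between the prior (1.0) and the data-implied value (1.15),
# pulled toward the data because the observation error is small here
```

Tight observation errors pull the posterior toward the data-implied scaling, which is the mechanism behind the 5-20% upward revisions reported above.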

  4. A Citizen Science Approach: A Detailed Ecological Assessment of Subtropical Reefs at Point Lookout, Australia

    PubMed Central

    Thurstan, Ruth; Beger, Maria; Dudgeon, Christine; Loder, Jennifer; Kovacs, Eva; Gallo, Michele; Flower, Jason; Gomez Cabrera, K-le; Ortiz, Juan; Lea, Alexandra; Kleine, Diana

    2016-01-01

    Subtropical reefs provide an important habitat for flora and fauna, and proper monitoring is required for conservation. Monitoring these exposed and submerged reefs is challenging and available resources are limited. Citizen science is gaining momentum, both as an applied research tool and in the variety of monitoring approaches adopted. This paper aims to demonstrate an ecological assessment and mapping approach that incorporates both top-down (volunteer marine scientists) and bottom-up (divers/community) engagement aspects of citizen science, applied at a subtropical reef at Point Lookout, Southeast Queensland, Australia. Marine scientists trained fifty citizen scientists in survey techniques that included mapping of habitat features, recording of substrate, fish and invertebrate composition, and quantifying impacts (e.g., occurrence of substrate damage, presence of litter). In 2014 these volunteers conducted four seasonal surveys along semi-permanent transects, at five sites, across three reefs. The project presented is a model of how citizen science can be conducted in a marine environment through collaboration of volunteer researchers, non-researchers and local marine authorities. Significant differences in coral and algal cover were observed among the three sites, while fluctuations in algal cover were also observed seasonally. Differences in fish assemblages were apparent among sites and seasons, with subtropical fish groups observed more commonly in colder seasons. The least physical damage occurred in the most exposed sites (Flat Rock) within the highly protected marine park zones. The broad range of data collected through this top-down/bottom-up approach to citizen science exemplifies the project's value and application for identifying ecosystem trends or patterns. The results of the project support natural resource and marine park management, providing a valuable contribution to existing scientific knowledge and the conservation of local reefs. 
PMID:27706182

  5. A Citizen Science Approach: A Detailed Ecological Assessment of Subtropical Reefs at Point Lookout, Australia.

    PubMed

    Roelfsema, Chris; Thurstan, Ruth; Beger, Maria; Dudgeon, Christine; Loder, Jennifer; Kovacs, Eva; Gallo, Michele; Flower, Jason; Gomez Cabrera, K-le; Ortiz, Juan; Lea, Alexandra; Kleine, Diana

    2016-01-01

    Subtropical reefs provide an important habitat for flora and fauna, and proper monitoring is required for conservation. Monitoring these exposed and submerged reefs is challenging and available resources are limited. Citizen science is gaining momentum, both as an applied research tool and in the variety of monitoring approaches adopted. This paper aims to demonstrate an ecological assessment and mapping approach that incorporates both top-down (volunteer marine scientists) and bottom-up (divers/community) engagement aspects of citizen science, applied at a subtropical reef at Point Lookout, Southeast Queensland, Australia. Marine scientists trained fifty citizen scientists in survey techniques that included mapping of habitat features, recording of substrate, fish and invertebrate composition, and quantifying impacts (e.g., occurrence of substrate damage, presence of litter). In 2014 these volunteers conducted four seasonal surveys along semi-permanent transects, at five sites, across three reefs. The project presented is a model of how citizen science can be conducted in a marine environment through collaboration of volunteer researchers, non-researchers and local marine authorities. Significant differences in coral and algal cover were observed among the three sites, while fluctuations in algal cover were also observed seasonally. Differences in fish assemblages were apparent among sites and seasons, with subtropical fish groups observed more commonly in colder seasons. The least physical damage occurred in the most exposed sites (Flat Rock) within the highly protected marine park zones. The broad range of data collected through this top-down/bottom-up approach to citizen science exemplifies the project's value and application for identifying ecosystem trends or patterns. The results of the project support natural resource and marine park management, providing a valuable contribution to existing scientific knowledge and the conservation of local reefs.

  6. Framework for Probabilistic Projections of Energy-Relevant Streamflow Indicators under Climate Change Scenarios for the U.S.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wagener, Thorsten; Mann, Michael; Crane, Robert

    2014-04-29

    This project focuses on uncertainty in streamflow forecasting under climate change conditions. The objective is to develop easy-to-use methodologies that can be applied across a range of river basins to estimate changes in water availability for realistic projections of climate change. There are three major components to the project: empirical downscaling of regional climate change projections from a range of Global Climate Models; developing a methodology that uses present-day information on the climate controls on the parameterizations in streamflow models to adjust those parameterizations under future climate conditions (a trading-space-for-time approach); and demonstrating a bottom-up approach to establishing streamflow vulnerabilities to climate change. The results reinforce the need for downscaling of climate data for regional applications, further demonstrate the challenges of using raw GCM data to make local projections, and reinforce the need to make projections across a range of global climate models. The project demonstrates the potential for improving streamflow forecasts by using model parameters that are adjusted for future climate conditions, but suggests that even with improved streamflow models and reduced climate uncertainty through the use of downscaled data, there is still large uncertainty in the streamflow projections. The most useful output from the project is the bottom-up, vulnerability-driven approach to examining possible climate and land use change impacts on streamflow. Here, we demonstrate an inexpensive and easy-to-apply methodology that uses Classification and Regression Trees (CART) to define the climate and environmental parameter space that can produce vulnerabilities in the system, and then feeds in the downscaled projections to determine the probability of transitioning to a vulnerable state. Vulnerabilities, in this case, are defined by the end user.
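
The CART-based vulnerability mapping can be illustrated with a one-split decision stump on synthetic data: find the threshold on a climate driver that best separates model runs flagged as vulnerable from those that are not. This sketch uses the Gini impurity criterion; a real application would grow a full tree over several climate and land-use variables.

```python
# One-level CART (decision stump): pick the threshold on a single driver
# that minimizes the Gini-weighted impurity of the two resulting groups.
# Data are synthetic, purely for illustration.

def gini(labels):
    """Gini impurity of a list of 0/1 labels."""
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def best_split(x, y):
    """Return (threshold, weighted_gini) for the best split x <= t."""
    best = (None, float("inf"))
    for t in sorted(set(x)):
        left = [yi for xi, yi in zip(x, y) if xi <= t]
        right = [yi for xi, yi in zip(x, y) if xi > t]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
        if score < best[1]:
            best = (t, score)
    return best

# synthetic runs: precipitation change (%) vs. whether streamflow fell
# below a user-defined vulnerability threshold (1 = vulnerable)
precip = [-20, -15, -12, -8, -5, 0, 4, 9, 13, 18]
vulnerable = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
threshold, score = best_split(precip, vulnerable)
# the stump recovers the boundary: runs with precip <= -8 are flagged
```

Downscaled projections can then be dropped into the fitted tree to estimate the probability of landing in the vulnerable region of parameter space.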

  7. Exploring petroleum hydrocarbons in groundwater by double solid phase extraction coupled to gas chromatography-flame ionization detector.

    PubMed

    Pindado Jiménez, Oscar; Pérez Pastor, Rosa Ma; Escolano Segovia, Olga; del Reino Querencia, Susana

    2015-01-01

    This work proposes an analytical procedure for measuring the aliphatic and aromatic hydrocarbon fractions present in groundwater. In this method, hydrocarbons are solid-phase extracted (SPE) twice from the groundwater and the resulting fractions are analyzed by gas chromatography with flame ionization detection. The first SPE transfers the hydrocarbons present in groundwater into organic solvents and the second SPE divides them into aliphatic and aromatic hydrocarbons. A validation study is carried out and its uncertainties are discussed. The main sources of uncertainty are identified and evaluated by applying the bottom-up approach. Limits of detection for the hydrocarbon ranges are below 5 µg L(-1), precision is not above 30%, and acceptable recoveries are reached for the aliphatic and aromatic fractions studied. The uncertainties due to sample volume, calibration factor and recovery are the highest contributions. The expanded uncertainty ranges from 13% to 26% for the aliphatic hydrocarbon ranges and from 14% to 23% for the aromatic hydrocarbon ranges. As an application, the proposed method is satisfactorily applied to a set of groundwater samples collected in a polluted area with evidence of a high degree of hydrocarbon contamination. The results show that the range of aliphatic hydrocarbons >C21-C35 is the most abundant, with values ranging from 215 µg L(-1) to 354 µg L(-1), which is associated with contamination by diesel. Copyright © 2014 Elsevier B.V. All rights reserved.
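
The bottom-up uncertainty budget described combines the dominant relative standard uncertainties in quadrature and applies a coverage factor (k = 2 is conventional for roughly 95% confidence). A minimal sketch with illustrative component values, since the abstract does not give the actual component magnitudes:

```python
# Bottom-up uncertainty budget: combine relative standard uncertainties (%)
# of independent contributions in quadrature, then apply coverage factor k.
# Component values below are illustrative, not taken from the paper.
import math

def expanded_uncertainty(components, k=2.0):
    """Relative expanded uncertainty (%) from independent relative
    standard uncertainties (%)."""
    u_c = math.sqrt(sum(u**2 for u in components))
    return k * u_c

U = expanded_uncertainty([5.0, 4.0, 6.0])   # volume, calibration, recovery
# → about 17.5 %, within the 13-26 % range reported for the fractions
```

Because the components add in quadrature, the largest contributor (here recovery) dominates the budget, which is why the paper singles out volume, calibration and recovery.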

  8. Search for Scalar Bottom Quark Pair Production with the ATLAS Detector in pp Collisions at √s = 7 TeV

    DOE PAGES

    Aad, G.; Abbott, B.; Abdallah, J.; ...

    2012-05-02

    The results of a search for pair production of the scalar partners of bottom quarks in 2.05 fb⁻¹ of pp collisions at √s = 7 TeV using the ATLAS experiment are reported. Scalar bottom quarks are searched for in events with large missing transverse momentum and two jets in the final state, where both jets are identified as originating from a bottom quark. In an R-parity conserving minimal supersymmetric scenario, assuming that the scalar bottom quark decays exclusively into a bottom quark and a neutralino, 95% confidence-level upper limits are obtained in the b̃₁–χ̃⁰₁ mass plane such that for neutralino masses below 60 GeV scalar bottom masses up to 390 GeV are excluded.

  9. Search for scalar bottom quarks from gluino decays in collisions at.

    PubMed

    Abulencia, A; Acosta, D; Adelman, J; Affolder, T; Akimoto, T; Albrow, M G; Ambrose, D; Amerio, S; Amidei, D; Anastassov, A; Anikeev, K; Annovi, A; Antos, J; Aoki, M; Apollinari, G; Arguin, J-F; Arisawa, T; Artikov, A; Ashmanskas, W; Attal, A; Azfar, F; Azzi-Bacchetta, P; Azzurri, P; Bacchetta, N; Bachacou, H; Badgett, W; Barbaro-Galtieri, A; Barnes, V E; Barnett, B A; Baroiant, S; Bartsch, V; Bauer, G; Bedeschi, F; Behari, S; Belforte, S; Bellettini, G; Bellinger, J; Belloni, A; Ben-Haim, E; Benjamin, D; Beretvas, A; Beringer, J; Berry, T; Bhatti, A; Binkley, M; Bisello, D; Bishai, M; Blair, R E; Blocker, C; Bloom, K; Blumenfeld, B; Bocci, A; Bodek, A; Boisvert, V; Bolla, G; Bolshov, A; Bortoletto, D; Boudreau, J; Bourov, S; Boveia, A; Brau, B; Bromberg, C; Brubaker, E; Budagov, J; Budd, H S; Budd, S; Burkett, K; Busetto, G; Bussey, P; Byrum, K L; Cabrera, S; Campanelli, M; Campbell, M; Canelli, F; Canepa, A; Carlsmith, D; Carosi, R; Carron, S; Casarsa, M; Castro, A; Catastini, P; Cauz, D; Cavalli-Sforza, M; Cerri, A; Cerrito, L; Chang, S H; Chapman, J; Chen, Y C; Chertok, M; Chiarelli, G; Chlachidze, G; Chlebana, F; Cho, I; Cho, K; Chokheli, D; Chou, J P; Chu, P H; Chuang, S H; Chung, K; Chung, W H; Chung, Y S; Ciljak, M; Ciobanu, C I; Ciocci, M A; Clark, A; Clark, D; Coca, M; Connolly, A; Convery, M E; Conway, J; Cooper, B; Copic, K; Cordelli, M; Cortiana, G; Cruz, A; Cuevas, J; Culbertson, R; Cyr, D; DaRonco, S; D'Auria, S; D'onofrio, M; Dagenhart, D; de Barbaro, P; De Cecco, S; Deisher, A; De Lentdecker, G; Dell'Orso, M; Demers, S; Demortier, L; Deng, J; Deninno, M; De Pedis, D; Derwent, P F; Dionisi, C; Dittmann, J R; Dituro, P; Dörr, C; Dominguez, A; Donati, S; Donega, M; Dong, P; Donini, J; Dorigo, T; Dube, S; Ebina, K; Efron, J; Ehlers, J; Erbacher, R; Errede, D; Errede, S; Eusebi, R; Fang, H C; Farrington, S; Fedorko, I; Fedorko, W T; Feild, R G; Feindt, M; Fernandez, J P; Field, R; Flanagan, G; Flores-Castillo, L R; Foland, A; Forrester, S; Foster, G 
W; Franklin, M; Freeman, J C; Fujii, Y; Furic, I; Gajjar, A; Gallinaro, M; Galyardt, J; Garcia, J E; Garcia Sciverez, M; Garfinkel, A F; Gay, C; Gerberich, H; Gerchtein, E; Gerdes, D; Giagu, S; di Giovanni, G P; Giannetti, P; Gibson, A; Gibson, K; Ginsburg, C; Giokaris, N; Giolo, K; Giordani, M; Giunta, M; Giurgiu, G; Glagolev, V; Glenzinski, D; Gold, M; Goldschmidt, N; Goldstein, J; Gomez, G; Gomez-Ceballos, G; Goncharov, M; González, O; Gorelov, I; Goshaw, A T; Gotra, Y; Goulianos, K; Gresele, A; Griffiths, M; Grinstein, S; Grosso-Pilcher, C; Grundler, U; Guimaraes da Costa, J; Haber, C; Hahn, S R; Hahn, K; Halkiadakis, E; Hamilton, A; Han, B-Y; Handler, R; Happacher, F; Hara, K; Hare, M; Harper, S; Harr, R F; Harris, R M; Hatakeyama, K; Hauser, J; Hays, C; Hayward, H; Heijboer, A; Heinemann, B; Heinrich, J; Hennecke, M; Herndon, M; Heuser, J; Hidas, D; Hill, C S; Hirschbuehl, D; Hocker, A; Holloway, A; Hou, S; Houlden, M; Hsu, S-C; Huffman, B T; Hughes, R E; Huston, J; Ikado, K; Incandela, J; Introzzi, G; Iori, M; Ishizawa, Y; Ivanov, A; Iyutin, B; James, E; Jang, D; Jayatilaka, B; Jeans, D; Jensen, H; Jeon, E J; Jones, M; Joo, K K; Jun, S Y; Junk, T R; Kamon, T; Kang, J; Karagoz-Unel, M; Karchin, P E; Kato, Y; Kemp, Y; Kephart, R; Kerzel, U; Khotilovich, V; Kilminster, B; Kim, D H; Kim, H S; Kim, J E; Kim, M J; Kim, M S; Kim, S B; Kim, S H; Kim, Y K; Kirby, M; Kirsch, L; Klimenko, S; Klute, M; Knuteson, B; Ko, B R; Kobayashi, H; Kondo, K; Kong, D J; Konigsberg, J; Kordas, K; Korytov, A; Kotwal, A V; Kovalev, A; Kraus, J; Kravchenko, I; Kreps, M; Kreymer, A; Kroll, J; Krumnack, N; Kruse, M; Krutelyov, V; Kuhlmann, S E; Kusakabe, Y; Kwang, S; Laasanen, A T; Lai, S; Lami, S; Lammel, S; Lancaster, M; Lander, R L; Lannon, K; Lath, A; Latino, G; Lazzizzera, I; Lecci, C; Lecompte, T; Lee, J; Lee, J; Lee, S W; Lefèvre, R; Leonardo, N; Leone, S; Levy, S; Lewis, J D; Li, K; Lin, C; Lin, C S; Lindgren, M; Lipeles, E; Liss, T M; Lister, A; Litvintsev, D O; Liu, T; Liu, Y; 
Lockyer, N S; Loginov, A; Loreti, M; Loverre, P; Lu, R-S; Lucchesi, D; Lujan, P; Lukens, P; Lungu, G; Lyons, L; Lys, J; Lysak, R; Lytken, E; Mack, P; MacQueen, D; Madrak, R; Maeshima, K; Maksimovic, P; Manca, G; Margaroli, F; Marginean, R; Marino, C; Martin, A; Martin, M; Martin, V; Martínez, M; Maruyama, T; Matsunaga, H; Mattson, M E; Mazini, R; Mazzanti, P; McFarland, K S; McGivern, D; McIntyre, P; McNamara, P; McNulty, R; Mehta, A; Menzemer, S; Menzione, A; Merkel, P; Mesropian, C; Messina, A; von der Mey, M; Miao, T; Miladinovic, N; Miles, J; Miller, R; Miller, J S; Mills, C; Milnik, M; Miquel, R; Miscetti, S; Mitselmakher, G; Miyamoto, A; Moggi, N; Mohr, B; Moore, R; Morello, M; Movilla Fernandez, P; Mülmenstädt, J; Mukherjee, A; Mulhearn, M; Muller, Th; Mumford, R; Munar, A; Murat, P; Nachtman, J; Nahn, S; Nakano, I; Napier, A; Naumov, D; Necula, V; Neu, C; Neubauer, M S; Nielsen, J; Nigmanov, T; Nodulman, L; Norniella, O; Ogawa, T; Oh, S H; Oh, Y D; Okusawa, T; Oldeman, R; Orava, R; Osterberg, K; Pagliarone, C; Palencia, E; Paoletti, R; Papadimitriou, V; Papikonomou, A; Paramonov, A A; Parks, B; Pashapour, S; Patrick, J; Pauletta, G; Paulini, M; Paus, C; Pellett, D E; Penzo, A; Phillips, T J; Piacentino, G; Piedra, J; Pitts, K; Plager, C; Pondrom, L; Pope, G; Portell, X; Poukhov, O; Pounder, N; Prakoshyn, F; Pronko, A; Proudfoot, J; Ptohos, F; Punzi, G; Pursley, J; Rademacker, J; Rahaman, A; Rakitin, A; Rappoccio, S; Ratnikov, F; Reisert, B; Rekovic, V; van Remortel, N; Renton, P; Rescigno, M; Richter, S; Rimondi, F; Rinnert, K; Ristori, L; Robertson, W J; Robson, A; Rodrigo, T; Rogers, E; Rolli, S; Roser, R; Rossi, M; Rossin, R; Rott, C; Ruiz, A; Russ, J; Rusu, V; Ryan, D; Saarikko, H; Sabik, S; Safonov, A; Sakumoto, W K; Salamanna, G; Salto, O; Saltzberg, D; Sanchez, C; Santi, L; Sarkar, S; Sato, K; Savard, P; Savoy-Navarro, A; Scheidle, T; Schlabach, P; Schmidt, E E; Schmidt, M P; Schmitt, M; Schwarz, T; Scodellaro, L; Scott, A L; Scribano, A; Scuri, F; 
Sedov, A; Seidel, S; Seiya, Y; Semenov, A; Semeria, F; Sexton-Kennedy, L; Sfiligoi, I; Shapiro, M D; Shears, T; Shepard, P F; Sherman, D; Shimojima, M; Shochet, M; Shon, Y; Shreyber, I; Sidoti, A; Siegrist, J L; Sill, A; Sinervo, P; Sisakyan, A; Sjolin, J; Skiba, A; Slaughter, A J; Sliwa, K; Smirnov, D; Smith, J R; Snider, F D; Snihur, R; Soderberg, M; Soha, A; Somalwar, S; Sorin, V; Spalding, J; Spinella, F; Squillacioti, P; Stanitzki, M; Staveris-Polykalas, A; St Dennis, R; Stelzer, B; Stelzer-Chilton, O; Stentz, D; Strologas, J; Stuart, D; Suh, J S; Sukhanov, A; Sumorok, K; Sun, H; Suzuki, T; Taffard, A; Tafirout, R; Takashima, R; Takeuchi, Y; Takikawa, K; Tanaka, M; Tanaka, R; Tecchio, M; Teng, P K; Terashi, K; Tether, S; Thom, J; Thompson, A S; Thomson, E; Tipton, P; Tiwari, V; Tkaczyk, S; Toback, D; Tokar, S; Tollefson, K; Tomura, T; Tonelli, D; Tönnesmann, M; Torre, S; Torretta, D; Tourneur, S; Trischuk, W; Tsuchiya, R; Tsuno, S; Turini, N; Ukegawa, F; Unverhau, T; Uozumi, S; Usynin, D; Vacavant, L; Vaiciulis, A; Vallecorsa, S; Varganov, A; Vataga, E; Velev, G; Veramendi, G; Veszpremi, V; Vickey, T; Vidal, R; Vila, I; Vilar, R; Vollrath, I; Volobouev, I; Würthwein, F; Wagner, P; Wagner, R G; Wagner, R L; Wagner, W; Wallny, R; Walter, T; Wan, Z; Wang, M J; Wang, S M; Warburton, A; Ward, B; Waschke, S; Waters, D; Watts, T; Weber, M; Wester, W C; Whitehouse, B; Whiteson, D; Wicklund, A B; Wicklund, E; Williams, H H; Wilson, P; Winer, B L; Wittich, P; Wolbers, S; Wolfe, C; Worm, S; Wright, T; Wu, X; Wynne, S M; Yagil, A; Yamamoto, K; Yamaoka, J; Yamashita, Y; Yang, C; Yang, U K; Yao, W M; Yeh, G P; Yoh, J; Yorita, K; Yoshida, T; Yu, I; Yu, S S; Yun, J C; Zanello, L; Zanetti, A; Zaw, I; Zetti, F; Zhang, X; Zhou, J; Zucchelli, S

    2006-05-05

    We searched for scalar bottom quarks in 156 pb⁻¹ of pp̄ collisions at √s = 1.96 TeV recorded by the Collider Detector at Fermilab (CDF II) experiment at the Tevatron. Scalar bottom quarks can be produced from gluino decays in R-parity conserving models of supersymmetry when the mass of the gluino exceeds that of the scalar bottom quark. A scalar bottom quark can then decay into a bottom quark and a neutralino. To search for this scenario, we investigated events with large missing transverse energy and at least three jets, two or more of which were identified as containing a secondary vertex from the hadronization of quarks. We found four candidate events, where 2.6 ± 0.7 are expected from standard model processes, and placed 95% confidence level lower limits on gluino and scalar bottom quark masses of up to 280 and 240 GeV/c², respectively.

  10. Man induced change in community control in the north-western Black Sea: The top-down bottom-up balance.

    PubMed

    Bănaru, Daniela; Harmelin-Vivien, Mireille; Boudouresque, Charles F

    2010-05-01

    The present study shows how marine commercial fish food webs changed dramatically in the north-western Black Sea, in both pelagic and benthic environments. Fisheries landings, diversity and equitability decreased strongly between 1965-1970 and 2001-2005. Fishes adapted their feeding behaviour to the increasingly low species diversity of the Black Sea communities. Their food web became impoverished and simplified following the loss of many top predator species and their trophic links. Linkage density, connectivity and a Lyapunov stability proxy decreased strongly. The north-western Black Sea system switched from a complex, combined top-down and bottom-up functioning pattern to a dominantly bottom-up one. This study contributes to a better understanding of these transformations within the Danube-Black Sea system over the last decades. An attempt is made to relate these changes to river inputs, fisheries and coastal pollution. Copyright 2009 Elsevier Ltd. All rights reserved.
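    The food-web metrics cited here, linkage density and connectivity, have standard graph definitions: with L trophic links among S species, linkage density is L/S and connectance is L/S². A minimal sketch with a hypothetical toy web (species names and links are invented for illustration, not taken from the study):

```python
# Toy food web as a set of (predator, prey) trophic links.
# Linkage density = L/S; connectance = L/S^2 (standard definitions).
links = {
    ("sprat", "zooplankton"),
    ("anchovy", "zooplankton"),
    ("horse_mackerel", "sprat"),
    ("horse_mackerel", "anchovy"),
    ("turbot", "sprat"),
}

species = {s for link in links for s in link}
L, S = len(links), len(species)

linkage_density = L / S   # mean number of links per species
connectance = L / S**2    # fraction of possible links realized

print(S, L, round(linkage_density, 2), round(connectance, 3))
```

    Removing top predators deletes both nodes and links, which is why both metrics fall together as the abstract describes.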

  11. Height-resolved large-sample INAA of a 1 m long, 13 cm diameter ditch-bottom sample

    NASA Astrophysics Data System (ADS)

    Blaauw, M.; Baas, H. W.; Donze, M.

    2003-06-01

    A facility for instrumental neutron activation analysis (INAA) of large samples (up to 1 m long and 15 cm in diameter) has been built. Correction methods for the simultaneous occurrence of neutron self-shielding and gamma-ray self-attenuation effects have been implemented and tested with a variety of samples. The method has now been extended to allow for the interpretation of scanned, collimated measurements, where results are obtained for individual voxels. As a validation and demonstration, a ditch-bottom sample of the maximum size was taken in a frozen condition. It was cut, still frozen, into 2 cm slices and reassembled with each slice in a 2-cm-high Petri dish divided into three sections. This allowed verification of the results by ordinary INAA. Possible explanations are discussed for the discrepancies we observed between ordinary and large-sample INAA in the region where the concentration gradients are steepest.

  12. Analysis and interpretation of the leaching behaviour of waste thermal treatment bottom ash by batch and column tests.

    PubMed

    Di Gianfilippo, Martina; Costa, Giulia; Verginelli, Iason; Gavasci, Renato; Lombardi, Francesco

    2016-10-01

    This paper investigates the leaching behaviour of specific types of waste thermal treatment bottom ash (BA) as a function of both pH and the liquid-to-solid (L/S) ratio. Specifically, column percolation tests and different types of batch tests (including pH-dependence tests) were applied to BA produced by hospital waste incineration (HW-I), Refuse Derived Fuel (RDF) gasification (RDF-G) and RDF incineration (RDF-I). The results of these tests were interpreted by applying an integrated graphical and modelling approach aimed at identifying the main mechanisms (solubility, availability or time-controlled dissolution and diffusion) governing the release of specific constituents from each type of BA. The final aim of this work was to gain insight into the information that can be provided by the leaching tests applied, and hence into which ones may be more suitable for assessing the leaching concentrations expected in the field. The results showed that the three samples of BA analysed presented differences of orders of magnitude in their leaching behaviour, especially as a function of pH, but also in terms of the L/S ratio. These differences were mainly related to the differences in the mineralogy of the samples. In addition, for the same type of bottom ash, the comparison between the results of batch and percolation column tests, expressed in terms of cumulative release, showed that for some constituents (e.g. Mg for HW-I BA and Cu for RDF-G BA) differences of over one order of magnitude were obtained due to variations in pH and DOC release. Similarly, the eluate concentrations observed in the percolation tests were, for most of the investigated elements, not directly comparable with the results of the pH-dependence tests. In particular, in some cases the percolation test results showed eluate concentrations of some constituents (e.g. 
K and Ca in HW-I BA) of up to one order of magnitude higher than the values obtained from the pH-dependence experiments at the same pH value. This was attributed to a rapid washout from the column of the soluble phases present in the BA. In contrast, for other constituents (e.g. Mg and Ba for the RDF-G BA), especially at high L/S ratios, the concentrations in the column tests were of up to one order of magnitude lower than the solubility value, indicating release under non-equilibrium conditions. In these cases, batch pH-dependence tests should be preferred, since column tests results could underestimate the concentrations expected in the field. Copyright © 2016 Elsevier Ltd. All rights reserved.
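    Cumulative release in a column percolation test is conventionally obtained by summing, over the collected eluate fractions, the concentration times the incremental liquid-to-solid ratio. A minimal sketch of that bookkeeping; the fraction boundaries and concentrations below are illustrative values, not data from this study:

```python
# Cumulative release from a column percolation test:
# U_n = sum over fractions i<=n of c_i [mg/L] * dLS_i [L/kg]
# giving mg released per kg of bottom ash at each cumulative L/S.
ls_bounds = [0.0, 0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0]  # cumulative L/S [L/kg]
conc =      [120., 80., 40., 15., 6.0, 2.5, 1.0]       # eluate conc. per fraction [mg/L]

cumulative = []
total = 0.0
for i, c in enumerate(conc):
    total += c * (ls_bounds[i + 1] - ls_bounds[i])  # mg/kg from this fraction
    cumulative.append(total)

print([round(u, 2) for u in cumulative])
```

    The early fractions dominate for readily soluble phases, which is the "rapid washout" behaviour the abstract describes for K and Ca.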

  13. DAnTE: a statistical tool for quantitative analysis of –omics data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Polpitiya, Ashoka D.; Qian, Weijun; Jaitly, Navdeep

    2008-05-03

    DAnTE (Data Analysis Tool Extension) is a statistical tool designed to address challenges unique to quantitative bottom-up, shotgun proteomics data. This tool has also been demonstrated for microarray data and can easily be extended to other high-throughput data types. DAnTE features selected normalization methods, missing value imputation algorithms, peptide to protein rollup methods, an extensive array of plotting functions, and a comprehensive ANOVA scheme that can handle unbalanced data and random effects. The Graphical User Interface (GUI) is designed to be very intuitive and user friendly.
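    As a rough illustration of what a peptide-to-protein rollup does, here is a simplified median-scaling sketch in Python; it mimics the general idea of rollup methods in tools like DAnTE (align peptide profiles, then aggregate per sample), not DAnTE's exact algorithms:

```python
# Simplified peptide-to-protein "rollup": scale each peptide's
# log-abundance profile onto a common reference, then take the
# per-sample median across the scaled peptides.
from statistics import median

def rollup(peptides):
    """peptides: list of per-sample log-abundance lists (equal length)."""
    ref = peptides[0]
    scaled = []
    for pep in peptides:
        # shift each peptide so its median offset to the reference is zero
        offset = median(p - r for p, r in zip(pep, ref))
        scaled.append([p - offset for p in pep])
    # protein profile = per-sample median across scaled peptides
    return [median(col) for col in zip(*scaled)]

protein = rollup([
    [10.0, 11.0, 12.0],
    [10.5, 11.4, 12.6],  # same trend, roughly +0.5 offset
    [ 9.8, 10.9, 11.9],
])
print([round(x, 2) for x in protein])
```

    Real implementations additionally handle missing values and one-hit proteins, which this sketch omits.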

  14. Utilization of power plant bottom ash as aggregates in fiber-reinforced cellular concrete.

    PubMed

    Lee, H K; Kim, H K; Hwang, E A

    2010-02-01

    Recently, millions of tons of bottom ash waste from thermoelectric power plants have been disposed of in landfills and coastal areas, despite its potential for recycling in the construction field. Fiber-reinforced cellular concrete (FRCC) of low density and high strength may be attainable through the addition of bottom ash, owing to its relatively high strength. This paper evaluates the feasibility of utilizing bottom ash from thermoelectric power plant waste as aggregate in FRCC. The flow characteristics of cement mortar with bottom ash aggregates and the effect of aggregate type and size on concrete density and compressive strength were investigated. In addition, the effects of adding steel and polypropylene fibers to improve the strength of the concrete were investigated. The results suggest that bottom ash can be applied as a construction material that may not only improve the compressive strength of FRCC significantly but also reduce problems related to bottom ash waste.

  15. Climate variability and demand growth as drivers of water scarcity in the Turkwel river basin: a bottom-up risk assessment of a data-sparse basin in Kenya

    NASA Astrophysics Data System (ADS)

    Hirpa, F. A.; Dyer, E.; Hope, R.; Dadson, S. J.

    2017-12-01

    Sustainable water management and allocation are essential for maintaining human well-being, sustaining healthy ecosystems, and supporting steady economic growth. The Turkwel river basin, located in north-western Kenya, experiences a high level of water scarcity due to its arid climate, high rainfall variability, and rapidly growing water demand. However, due to sparse hydro-climatic data and limited literature, the water resources system of the basin has been poorly understood. Here we apply a bottom-up climate risk assessment method to estimate the resilience of the basin's water resources system to growing demand and climate stressors. First, using a water resource system model and historical climate data, we construct a climate risk map that depicts the way in which the system responds to climate change and variability. Then we develop a set of water demand scenarios to identify the conditions that potentially lead to the risk of unmet water demand and groundwater depletion. Finally, we investigate the impact of climate change and variability by stress testing these development scenarios against historically strong El Niño/Southern Oscillation (ENSO) years and future climate projections from multiple Global Circulation Models (GCMs). The results reveal that climate variability and increased water demand are the main drivers of water scarcity in the basin. Our findings show that increases in water demand due to expanded irrigation and population growth exert the strongest influence on the ability of the system to meet water resource supply requirements, and in all cases considered increase the impacts of droughts caused by future climate variability. Our analysis illustrates the importance of combining analysis of future climate risks with other development decisions that affect water resources planning. 
Policy and investment decisions which maximise water use efficiency in the present day are likely to impart resilience to climate change and variability under a wide range of future scenarios and therefore constitute low regret measures for climate adaptation.
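    The scenario stress-testing idea can be caricatured as running candidate demand levels against a supply series and counting years in which demand goes unmet. A deliberately minimal sketch with invented numbers (not Turkwel basin data; a real assessment would use a water resource system model, not a simple annual balance):

```python
# Minimal "bottom-up" stress test: flag years with unmet demand
# under several demand scenarios, given an annual supply series.
supply = [520, 480, 300, 610, 450, 270, 590, 500, 310, 440]  # Mm^3/yr, synthetic

def unmet_years(supply, demand):
    """Count years in which annual supply falls below demand."""
    return sum(1 for s in supply if s < demand)

scenarios = {"current": 300, "expanded_irrigation": 420, "high_growth": 500}
for name, demand in scenarios.items():
    print(name, unmet_years(supply, demand))
```

    The point the abstract makes falls out directly: raising demand amplifies the impact of the same climate-driven supply variability.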

  16. Using the RE-AIM framework to evaluate a school-based municipal programme tripling time spent on PE.

    PubMed

    Nielsen, Jonas Vestergaard; Skovgaard, Thomas; Bredahl, Thomas Viskum Gjelstrup; Bugge, Anna; Wedderkopp, Niels; Klakk, Heidi

    2018-06-01

    Documenting the implementation of effective real-world programmes is considered an important step in supporting the translation of evidence into practice. The aim of this study was therefore to identify factors influencing the adoption, implementation and maintenance of the Svendborgproject (SP), an effective real-world programme in which six primary schools in the municipality of Svendborg, Denmark, implemented triple the usual amount of physical education (PE) from pre-school to sixth grade. SP has been maintained for ten years and scaled up to all municipal schools since it was initiated in 2008. The Reach, Effectiveness, Adoption, Implementation and Maintenance (RE-AIM) framework was applied as an analytic tool through a convergent mixed-methods triangulation design. The results show that SP has been implemented with high fidelity and has become an established part of the municipality and school identity. The successful implementation and dissemination of the programme were enabled by a predominantly bottom-up approach combined with simple, non-negotiable requirements. This combination led to a better fit of the programme to each individual school context while still achieving high implementation fidelity. Finally, the early integration of research legitimated and benefitted the programme. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Wrinkling of solidifying polymeric coatings

    NASA Astrophysics Data System (ADS)

    Basu, Soumendra Kumar

    2005-07-01

    In coatings, wrinkles are viewed either as defects or as desired features providing low gloss and texture. In either case, discovering the origin of wrinkles and the conditions that lead to their formation is important. This research examines what wrinkling requires and proposes a mechanism to explain the observations. All curing coatings that wrinkle contain multi-functional reactants. Upon curing, all develop a depth-wise gradient in solidification that results in a cross-linked elastic skin atop a viscous bottom layer. It is hypothesized that compressive stress develops in the skin when liquid below diffuses up into it; high enough compressive stress buckles the skin to produce wrinkles. The hypothesis is substantiated by experimental and theoretical evidence. The effects of various application and compositional parameters on wrinkle size in a liquid-applied acrylic coating and a powder-applied epoxy coating were examined. All three components, namely resin, cross-linker and catalyst blocked with at least an equimolar amount of volatile blocker, proved to be required for wrinkling. The wrinkling phenomenon was modeled with a theory that accounts for gradient generation, cross-linking reaction and skinning; predictions compared well with observations. Two-layer non-curing coatings that have a stiff elastic layer atop a compliant elastic bottom layer wrinkled when the top layer was compressed, either by moisture absorption or by differential thermal expansion. Experimental observations compared well with predictions from a theory based on a force balance in multilayer systems subjected to differential contraction or expansion. A model based on the Flory-Rehner free energy of a constrained cross-linked gel was constructed that predicts the compressive stress generated in a coating when it absorbs solvent. 
Linear stability analysis predicts that when a compressed elastic layer is attached atop a viscous layer, it is always unstable to buckles whose wavelength exceeds a critical value; more cross-linking and poor solvent produce higher wavelength, lower amplitude wrinkles. When a compressed elastic layer is attached atop an elastic layer and subjected to more than a critical compressive stress, it is unstable to intermediate wavelengths of buckling; better solvent, higher ratio of bottom-to-top layer thickness, and lower bottom layer modulus produce higher wavelength, higher amplitude wrinkles.

  18. A new method of Curie depth evaluation from magnetic data: Theory

    NASA Technical Reports Server (NTRS)

    Won, I. J. (Principal Investigator)

    1981-01-01

    An approach to estimating the Curie point isotherm uses the classical Gauss method inverting a system of nonlinear equations. The method, slightly modified by a differential correction technique, directly inverts filtered Magsat data to calculate the crustal structure above the Curie depth, which is modeled as a magnetized layer of varying thickness and susceptibility. Since the depth below the layer is assumed to be nonmagnetic, the bottom of the layer is interpreted as the Curie depth. The method, once fully developed, tested, and compared with previous work by others, is to be applied to a portion of the eastern U.S. when sufficient Magsat data are accumulated for the region.

  19. Large Scale Brownian Dynamics of Confined Suspensions of Rigid Particles

    NASA Astrophysics Data System (ADS)

    Donev, Aleksandar; Sprinkle, Brennan; Balboa, Florencio; Patankar, Neelesh

    2017-11-01

    We introduce new numerical methods for simulating the dynamics of passive and active Brownian colloidal suspensions of particles of arbitrary shape sedimented near a bottom wall. The methods also apply to periodic (bulk) suspensions. Our methods scale linearly in the number of particles and enable previously infeasible simulations of tens to hundreds of thousands of particles. We demonstrate the accuracy and efficiency of our methods on a suspension of boomerang-shaped colloids. We also model recent experiments on the active dynamics of uniform suspensions of spherical microrollers. This work was supported in part by the National Science Foundation under award DMS-1418706, and by the U.S. Department of Energy under award DE-SC0008271.

  20. Tsunami Simulation Method Assimilating Ocean Bottom Pressure Data Near a Tsunami Source Region

    NASA Astrophysics Data System (ADS)

    Tanioka, Yuichiro

    2018-02-01

    A new method was developed to reproduce the tsunami height distribution in and around the source area at a given time, from a large number of ocean bottom pressure sensors, without information on the earthquake source. A dense cabled observation network called S-NET, consisting of 150 ocean bottom pressure sensors, was recently installed along a wide portion of the seafloor off Kanto, Tohoku, and Hokkaido in Japan. In the source area, however, ocean bottom pressure sensors cannot directly observe the initial ocean surface displacement, which motivated the development of the new method. The method was tested and functioned well for a synthetic tsunami from a simple rectangular fault with an ocean bottom pressure sensor network at 10 arc-min (about 20 km) intervals. For a more realistic test case, sensors at 15 arc-min intervals along the north-south direction and 30 arc-min intervals along the east-west direction were used. In this test case, the method also functioned well enough to reproduce the tsunami height field in general. These results indicate that the method could be used for tsunami early warning by estimating the tsunami height field just after a great earthquake, without the need for earthquake source information.

  1. Theory-Driven Intervention for Changing Personality: Expectancy Value Theory, Behavioral Activation, and Conscientiousness

    PubMed Central

    Magidson, Jessica F.; Roberts, Brent; Collado-Rodriguez, Anahi; Lejuez, C.W.

    2013-01-01

    Considerable evidence suggests that personality traits may be changeable, raising the possibility that the personality traits most linked to health problems can be modified with intervention. A growing body of research suggests that problematic personality traits may be altered with behavioral intervention using a bottom-up approach: that is, by targeting the core behaviors that underlie personality traits, with the goal of engendering new, healthier patterns of behavior that over time become automatized and manifest as changes in personality traits. Nevertheless, a bottom-up model for changing personality traits is somewhat diffuse and requires clearer integration of theory and relevant interventions to enable real clinical application. As such, this manuscript proposes a set of guiding principles for theory-driven modification of targeted personality traits using a bottom-up approach, focusing specifically on targeting the trait of conscientiousness using a relevant behavioral intervention, Behavioral Activation (BA), considered within the motivational framework of Expectancy Value Theory (EVT). We conclude with a real case example of the application of BA to alter behaviors counter to conscientiousness in a substance-dependent patient, highlighting the EVT principles most relevant to the approach and the importance and viability of a theory-driven, bottom-up approach to changing personality traits. PMID:23106844

  2. Estimating Bottom Water Dissolved Oxygen in the Mississippi River Gulf Outlet and Gulf Intracoastal Waterway Resulting from Proposed Structures

    DTIC Science & Technology

    2008-09-01

    used in the analysis. The analytical approach assumes steady-state, summer conditions applied to a continuously stirred tank reactor (CSTR). CSTR ...constituent due to the fully mixed CSTR assumption. Thus, there is no spatial dimensionality. The DO CSTR model is solved using a spreadsheet... For this study, the CSTR represents the bottom meter of water along the reach of the channel being assessed. A unit bottom layer thickness of 1 m
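    A steady-state, fully mixed CSTR dissolved-oxygen balance of the kind described can be written as 0 = (Q/V)(C_in − C) + k_a(C_s − C) − SOD/H and solved directly for C. A sketch with illustrative parameter values; only the 1 m bottom-layer thickness comes from the record above, everything else is assumed for the example:

```python
# Steady-state DO balance for a fully mixed CSTR bottom layer:
#   0 = (Q/V)*(C_in - C) + k_a*(C_s - C) - SOD/H
Q = 50_000.0     # inflow, m^3/day (illustrative)
V = 1_000_000.0  # bottom-layer volume, m^3 (illustrative)
C_in = 6.0       # inflow DO, mg/L
k_a = 0.1        # vertical exchange/reaeration rate, 1/day
C_s = 8.0        # saturation DO, mg/L
SOD = 0.5        # sediment oxygen demand, g/m^2/day
H = 1.0          # bottom-layer thickness, m (as in the study)

theta = Q / V    # flushing rate, 1/day
# SOD/H has units g/m^3/day = mg/L/day, consistent with the other terms
C = (theta * C_in + k_a * C_s - SOD / H) / (theta + k_a)
print(round(C, 2))
```

    Because the reactor is fully mixed, C has no spatial structure, matching the "no spatial dimensionality" remark in the record.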

  3. High-performance mc-Si ingot grown by modified DS system: Numerical investigation

    NASA Astrophysics Data System (ADS)

    Thiyagaragjan, M.; Aravindan, G.; Srinivasan, M.; Ramasamy, P.

    2018-04-01

    A numerical investigation is carried out on a multi-crystalline silicon ingot grown using side-top and side-bottom heaters, and the temperature distribution, von Mises stress and maximum shear stress are analyzed. To assess the changes, the results for the side-top and side-bottom heater configurations are compared. The stress values are reduced when the side-bottom heaters are in place. A 2D numerical approach is successfully applied to study the stress parameters in directionally solidified silicon.

  4. The reconstruction of f(ϕ)R and mimetic gravity from viable slow-roll inflation

    NASA Astrophysics Data System (ADS)

    Odintsov, S. D.; Oikonomou, V. K.

    2018-04-01

    In this work, we extend the bottom-up reconstruction framework of F(R) gravity to other modified gravities, in particular f(ϕ)R and mimetic F(R) gravity. We investigate which conditions are important for the method to work, and we study several viable cosmological evolutions, focusing on the inflationary era. Particularly, for the f(ϕ)R case, we specify the functional forms of the Hubble rate and of the scalar-to-tensor ratio as functions of the e-foldings number; accordingly, the rest of the physical quantities, the slow-roll parameters and the corresponding observational indices can be calculated. The same method is applied in the mimetic F(R) gravity case, and in both cases we thoroughly analyze the resulting free parameter space, in order to show firstly that the viability of the models presented is guaranteed, and secondly that there is a wide range of values of the free parameters for which viability occurs. In addition, the reconstruction method is also studied in the context of mimetic F(R) = R gravity. As we demonstrate, the resulting theory is viable, and also in this case only the scalar-to-tensor ratio needs to be specified, since the rest follows from this condition. Finally, we discuss in brief how the reconstruction method could function for other modified gravities.
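    The bottom-up reconstruction hinges on specifying quantities as functions of the e-foldings number N. The generic first-order slow-roll relations involved, in the simplest single-field setting (standard textbook formulas, not the paper's model-specific expressions), are:

```latex
\epsilon_1(N) = -\frac{d\ln H}{dN}, \qquad
\epsilon_2(N) = \frac{d\ln \epsilon_1}{dN}, \qquad
n_s \simeq 1 - 2\epsilon_1 - \epsilon_2, \qquad
r \simeq 16\,\epsilon_1 .
```

    Once H(N) (or, equivalently, one of the observational indices) is specified, the remaining indices follow, which is why only one input function needs to be fixed in the cases discussed above.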

  5. Analysis of the potential of thermal infrared radiometry for the characterization of ice clouds in the Arctic

    NASA Astrophysics Data System (ADS)

    Blanchard, Yann

    An important goal, within the context of improving climate change modelling, is to enhance our understanding of aerosols and their radiative effects (notably their indirect impact as cloud condensation nuclei). The cloud optical depth (COD) and average ice particle size of thin ice clouds (TICs) are two key parameters whose variations could strongly influence radiative effects and climate in the Arctic environment. Our objective was to assess the potential of multi-band measurements of zenith sky thermal radiance for retrieving the COD and effective particle diameter (Deff) of TICs in the Arctic. We analyzed and quantified the sensitivity of the thermal radiance to many parameters, such as COD, Deff, water vapor content, cloud bottom altitude and thickness, size distribution and shape. Exploiting this sensitivity to COD and Deff, the developed retrieval technique was applied to ground-based thermal infrared data acquired for 100 TICs at the high-Arctic PEARL observatory in Eureka, Nunavut, Canada, and was validated against AHSRL LIDAR and MMCR RADAR retrievals. The results of the retrieval method were used to successfully extract COD up to values of 3 and to separate TICs into two types: TIC1, characterized by small crystals (Deff < 30 μm), and TIC2, by large ice crystals (Deff > 30 μm, up to 300 μm). Inversions were performed across two polar winters. At the end of this research, we propose different alternatives for applying our methodology in the Arctic. Keywords: remote sensing; ice clouds; thermal infrared multi-band radiometry; Arctic.

  6. Assessing Top-Down and Bottom-Up Contributions to Auditory Stream Segregation and Integration With Polyphonic Music

    PubMed Central

    Disbergen, Niels R.; Valente, Giancarlo; Formisano, Elia; Zatorre, Robert J.

    2018-01-01

    Polyphonic music listening well exemplifies processes typically involved in daily auditory scene analysis situations, relying on an interactive interplay between bottom-up and top-down processes. Most studies investigating scene analysis have used elementary auditory scenes, however real-world scene analysis is far more complex. In particular, music, contrary to most other natural auditory scenes, can be perceived by either integrating or, under attentive control, segregating sound streams, often carried by different instruments. One of the prominent bottom-up cues contributing to multi-instrument music perception is their timbre difference. In this work, we introduce and validate a novel paradigm designed to investigate, within naturalistic musical auditory scenes, attentive modulation as well as its interaction with bottom-up processes. Two psychophysical experiments are described, employing custom-composed two-voice polyphonic music pieces within a framework implementing a behavioral performance metric to validate listener instructions requiring either integration or segregation of scene elements. In Experiment 1, the listeners' locus of attention was switched between individual instruments or the aggregate (i.e., both instruments together), via a task requiring the detection of temporal modulations (i.e., triplets) incorporated within or across instruments. Subjects responded post-stimulus whether triplets were present in the to-be-attended instrument(s). Experiment 2 introduced the bottom-up manipulation by adding a three-level morphing of instrument timbre distance to the attentional framework. The task was designed to be used within neuroimaging paradigms; Experiment 2 was additionally validated behaviorally in the functional Magnetic Resonance Imaging (fMRI) environment. Experiment 1 subjects (N = 29, non-musicians) completed the task at high levels of accuracy, showing no group differences between any experimental conditions. 
    Nineteen listeners also participated in Experiment 2, which showed a main effect of instrument timbre distance, even though within-attention-condition timbre-distance contrasts did not demonstrate any timbre effect. Correlation of overall scores with morph-distance effects, computed by subtracting the largest from the smallest timbre-distance scores, showed an influence of general task difficulty on the timbre-distance effect. Comparison of laboratory and fMRI data showed that scanner noise had no adverse effect on task performance. These experimental paradigms enable the study of both bottom-up and top-down contributions to auditory stream segregation and integration within psychophysical and neuroimaging experiments. PMID:29563861

  7. The Influence of Atmosphere Parameters on the Signal for Remote Sensing Polarimetric Electro-Optical Systems

    NASA Astrophysics Data System (ADS)

    Budak, Vladimir P.; Korkin, Sergey V.

    2009-03-01

    Singularity subtraction in the vectorial modification of the spherical harmonics method (VMSH), used to solve the boundary problem of the vectorial radiative transfer equation, is applied to the problem of the influence of atmosphere parameters on the polarimetric system signal. The model allows for different phase matrices (Mie, Rayleigh, and Henyey-Greenstein), a reflecting bottom, and particle size distributions. The authors describe the main features of the model and some results of its implementation.

  8. A bottom-up approach in estimating the measurement uncertainty and other important considerations for quantitative analyses in drug testing for horses.

    PubMed

    Leung, Gary N W; Ho, Emmie N M; Kwok, W Him; Leung, David K K; Tang, Francis P W; Wan, Terence S M; Wong, April S Y; Wong, Colton H F; Wong, Jenny K Y; Yu, Nola H

    2007-09-07

Quantitative determination, particularly for threshold substances in biological samples, is much more demanding than qualitative identification. Any proper quantitative determination must include an assessment of the measurement uncertainty (MU) associated with the determined value. The International Standard ISO/IEC 17025, "General requirements for the competence of testing and calibration laboratories", has more prescriptive requirements on the MU than its predecessor, ISO/IEC Guide 25. Under the 1999 or 2005 versions of the new standard, an estimation of the MU is mandatory for all quantitative determinations. To comply with the new requirement, a protocol was established in the authors' laboratory in 2001. The protocol has since evolved based on our practical experience, and a refined version was adopted in 2004. This paper describes our approach to establishing the MU, as well as some other important considerations, for the quantification of threshold substances in biological samples as applied in the area of doping control for horses. The testing of threshold substances can be viewed as a compliance test (testing to a specified limit); as such, it is only necessary to establish the MU at the threshold level. The steps in the "Bottom-Up" approach we adopted are similar to those described in the EURACHEM/CITAC guide, "Quantifying Uncertainty in Analytical Measurement". They involve first specifying the measurand, including the relationship between the measurand and the input quantities upon which it depends. All applicable uncertainty contributions are then identified using a "cause and effect" diagram. The magnitude of each uncertainty component is calculated and converted to a standard uncertainty. A recovery study is also conducted to determine whether the method bias is significant and whether a recovery (or correction) factor needs to be applied.
All standard uncertainties with values greater than 30% of the largest one are then used to derive the combined standard uncertainty. Finally, an expanded uncertainty is calculated at the 99% one-tailed confidence level by multiplying the combined standard uncertainty by an appropriate coverage factor (k). A sample is considered positive if the determined concentration of the threshold substance exceeds its threshold by the expanded uncertainty. In addition, other important considerations, which can have a significant impact on quantitative analyses, will be presented.
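    The combination, expansion and compliance steps described above can be illustrated numerically. The sketch below is not the laboratory's actual protocol: it simply applies the stated 30 % screening rule, quadrature combination, and an assumed coverage factor k ≈ 2.33 for a 99 % one-tailed normal interval; the uncertainty-budget values are hypothetical.

```python
import math

def combined_standard_uncertainty(components, cutoff=0.30):
    """Combine standard uncertainties in quadrature, keeping only those
    greater than `cutoff` times the largest one (the 30 % screening rule
    mentioned in the abstract)."""
    u_max = max(components)
    kept = [u for u in components if u > cutoff * u_max]
    return math.sqrt(sum(u * u for u in kept))

def is_positive(concentration, threshold, u_combined, k=2.33):
    """Compliance decision: a sample is positive only if the determined
    concentration exceeds the threshold by the expanded uncertainty
    U = k * u_c; k = 2.33 approximates 99 % one-tailed normal coverage."""
    return concentration > threshold + k * u_combined

# Hypothetical uncertainty budget (same units as the threshold);
# the 0.05 component falls below 30 % of the largest and is screened out.
u_c = combined_standard_uncertainty([0.8, 0.5, 0.05])
print(round(u_c, 3))
print(is_positive(concentration=12.5, threshold=10.0, u_combined=u_c))
```

    With these numbers the expanded uncertainty is about 2.2, so a result of 12.5 exceeds a threshold of 10.0 by more than U and would be called positive, while 12.0 would not.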

  9. Vertical Cable Seismic Survey for SMS exploration

    NASA Astrophysics Data System (ADS)

    Asakawa, Eiichi; Murakami, Fumitoshi; Tsukahara, Hitoshi; Mizohata, Shigeharu

    2014-05-01

    The Vertical Cable Seismic (VCS) survey is a reflection seismic method that uses hydrophone arrays vertically moored from the seafloor to record acoustic waves generated by sea-surface, deep-towed or ocean-bottom sources. By analyzing the reflections from the sub-seabed, we can image the subsurface structure. Because VCS is an efficient high-resolution 3D seismic survey method for a spatially bounded area, we proposed it for the SMS (seafloor massive sulfide) survey tool development program that the Ministry of Education, Culture, Sports, Science and Technology (MEXT) started in 2009. We have been developing the VCS survey system, including not only the data acquisition hardware but also the data processing and analysis techniques. We carried out several VCS surveys combining surface-towed, deep-towed and ocean-bottom sources, in water depths ranging from 100 m to 2100 m. Through these experiments our VCS data acquisition system has been completed, but the data processing techniques are still under development. One of the most critical issues is positioning in the water: uncertainty in the positions of the source and of the hydrophones degrades the quality of the subsurface image. GPS navigation is available at the sea surface, but for a deep-towed or ocean-bottom source the accuracy of the shot position obtained with SSBL/USBL is not sufficient for very high-resolution imaging. We have therefore developed a new approach that determines positions in the water using the travel-time data from the source to the VCS hydrophones. In 2013 we carried out the second VCS survey, using a surface-towed high-voltage sparker and an ocean-bottom source, in the Izena Cauldron, one of the most promising SMS areas around Japan. The positions of the ocean-bottom source estimated by this method are consistent with the VCS field records. The VCS data acquired with the sparker have been processed with 3D PSTM. 
It yields a very high-resolution 3D volume imaging deeper than two hundred meters below the seafloor. Our VCS system has thus been demonstrated as a promising survey tool for SMS exploration.
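    The travel-time positioning step described above can be cast as a small nonlinear least-squares problem. The following is a minimal illustration, not the authors' algorithm: it assumes a constant sound speed and known hydrophone positions, and refines a source position by Gauss-Newton iteration on the range residuals. All geometry values are hypothetical.

```python
import numpy as np

def locate_source(hydrophones, travel_times, c=1500.0, x0=None, iters=50):
    """Gauss-Newton estimate of a source position from travel times to
    hydrophones at known positions, assuming a constant sound speed c (m/s)."""
    p = np.asarray(hydrophones, dtype=float)
    ranges = c * np.asarray(travel_times, dtype=float)
    x = np.mean(p, axis=0) if x0 is None else np.asarray(x0, dtype=float)
    for _ in range(iters):
        diff = x - p                      # vectors hydrophone -> source
        dist = np.linalg.norm(diff, axis=1)
        J = diff / dist[:, None]          # Jacobian of |x - p_i| w.r.t. x
        resid = ranges - dist             # range residuals
        step, *_ = np.linalg.lstsq(J, resid, rcond=None)
        x = x + step
        if np.linalg.norm(step) < 1e-9:   # converged
            break
    return x

# Hypothetical geometry (meters): four moored hydrophones, source near the surface
phones = [[0.0, 0.0, 1000.0], [500.0, 0.0, 1200.0],
          [0.0, 500.0, 1400.0], [500.0, 500.0, 1100.0]]
true_src = np.array([200.0, 300.0, 0.0])
times = [np.linalg.norm(true_src - np.array(ph)) / 1500.0 for ph in phones]
est = locate_source(phones, times, x0=[250.0, 250.0, 0.0])
print(np.round(est, 2))
```

    A real implementation would also need to handle sound-speed profiles, clock offsets and noisy picks; this sketch only shows the geometric core of travel-time positioning.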

  10. Electrical analysis of c-Si/CGSe monolithic tandem solar cells by using a cell-selective light absorption scheme.

    PubMed

    Jeong, Ah Reum; Choi, Sung Bin; Kim, Won Mok; Park, Jong-Keuk; Choi, Jihye; Kim, Inho; Jeong, Jeung-Hyun

    2017-11-16

    A monolithic tandem solar cell consisting of crystalline Si (c-Si)/indium tin oxide (ITO)/CuGaSe2 (CGSe) was demonstrated by stacking a CGSe solar cell on a c-Si/ITO solar cell, achieving a photovoltaic conversion efficiency of about 10%. Electrical analyses based on cell-selective light absorption were applied to individually characterize the photovoltaic performance of the top and bottom subcells. Illumination at a wavelength absorbed only by the targeted top or bottom subcell permitted measurement of the open-circuit voltage of the target subcell and the shunt resistance of the non-target subcell. The cell parameters measured for each subcell were very similar to those of the corresponding single cells, confirming the validity of the suggested method. In addition, separating the light absorption between the top and bottom subcells allowed us to measure the bias-dependent photocurrent of each subcell. The series resistance of a c-Si/ITO/CGSe cell under bottom-cell-limiting conditions was slightly elevated, implying that the tunnel junction was somewhat resistive, i.e., slightly non-ohmic. This analysis demonstrated that, aside from producing a slightly resistive tunnel junction, our fabrication processes succeeded in monolithically integrating a CGSe cell onto a c-Si/ITO cell without degrading the performance of either cell.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ge, Zhen-Hua; Wei, Kaya; Lewis, Hutton

    A hydrothermal approach was employed to efficiently synthesize SnSe nanorods. The nanorods were consolidated into polycrystalline SnSe by spark plasma sintering for low temperature electrical and thermal properties characterization. The low temperature transport properties indicate semiconducting behavior with a typical dielectric temperature dependence of the thermal conductivity. The transport properties are discussed in light of the recent interest in this material for thermoelectric applications. The nanorod growth mechanism is also discussed in detail. - Graphical abstract: SnSe nanorods were synthesized by a simple hydrothermal method through a bottom-up approach. Micron sized flower-like crystals changed to nanorods with increasing hydrothermal temperature. Low temperature transport properties of polycrystalline SnSe, after SPS densification, were reported for the first time. This bottom-up synthetic approach can be used to produce phase-pure dense polycrystalline materials for thermoelectrics applications. - Highlights: • SnSe nanorods were synthesized by a simple and efficient hydrothermal approach. • The role of temperature, time and NaOH content was investigated. • SPS densification allowed for low temperature transport properties measurements. • Transport measurements indicate semiconducting behavior.

  12. Macro Scale Independently Homogenized Subcells for Modeling Braided Composites

    NASA Technical Reports Server (NTRS)

    Blinzler, Brina J.; Goldberg, Robert K.; Binienda, Wieslaw K.

    2012-01-01

    An analytical method has been developed to analyze the impact response of triaxially braided carbon fiber composites, including the penetration velocity and impact damage patterns. In the analytical model, the triaxial braid architecture is simulated by using four parallel shell elements, each of which is modeled as a laminated composite. Currently, each shell element is considered to be a smeared homogeneous material. The commercial transient dynamic finite element code LS-DYNA is used to conduct the simulations, and a continuum damage mechanics model internal to LS-DYNA is used as the material constitutive model. To determine the stiffness and strength properties required for the constitutive model, a top-down approach for determining the strength properties is merged with a bottom-up approach for determining the stiffness properties. The top-down portion uses global strengths obtained from macro-scale coupon level testing to characterize the material strengths for each subcell. The bottom-up portion uses micro-scale fiber and matrix stiffness properties to characterize the material stiffness for each subcell. Simulations of quasi-static coupon level tests for several representative composites are conducted along with impact simulations.
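    The bottom-up stiffness portion described above builds each subcell's stiffness from micro-scale fiber and matrix properties. A minimal illustration of that idea is the classical rule of mixtures; this is a simplification, not necessarily the authors' micromechanics model, and the property values below are hypothetical carbon-fiber/epoxy numbers.

```python
def rule_of_mixtures(Ef, Em, Vf):
    """Bottom-up stiffness estimates for a unidirectional subcell from
    fiber modulus Ef, matrix modulus Em and fiber volume fraction Vf."""
    E1 = Vf * Ef + (1.0 - Vf) * Em            # longitudinal (Voigt bound)
    E2 = 1.0 / (Vf / Ef + (1.0 - Vf) / Em)    # transverse (Reuss bound)
    return E1, E2

# Hypothetical carbon-fiber/epoxy moduli in GPa, 60 % fiber volume fraction
E1, E2 = rule_of_mixtures(Ef=230.0, Em=3.5, Vf=0.6)
print(round(E1, 1), round(E2, 2))
```

    The longitudinal modulus is fiber-dominated while the transverse modulus is matrix-dominated, which is why the subcell stiffness can be characterized bottom-up while the strengths, governed by damage and failure, are calibrated top-down from coupon tests.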

  13. Relative Importance of Biotic and Abiotic Forces on the Composition and Dynamics of a Soft-Sediment Intertidal Community

    PubMed Central

    Barbeau, Myriam A.

    2016-01-01

    Top-down, bottom-up, middle-out and abiotic factors are usually viewed as the main forces structuring biological communities, although assessment of their relative importance in a single study is rarely done. We quantified, using multivariate methods, associations between abiotic and biotic (top-down, bottom-up and middle-out) variables and infaunal population/community variation on intertidal mudflats in the Bay of Fundy, Canada, over two years. Our analysis indicated that spatial structural factors like site and plot accounted for most of the community and population variation. Although we observed a significant relationship between the community/populations and the biotic and abiotic variables, most were of minor importance relative to the structural factors. We suggest that community and population structure were relatively uncoupled from the structuring influences of biotic and abiotic factors in this system because of high concentrations of resources that sustain high densities of infauna and limit exploitative competition. Furthermore, we hypothesize that the infaunal community primarily reflects stochastic spatial events, namely a “first come, first served” process. PMID:26790098

  14. Characteristics Air Flow in Room Chamber Test Refrigerator Household Energy Consumption with Inlet Flow Variation

    NASA Astrophysics Data System (ADS)

    Susanto, Edy; Idrus Alhamid, M.; Nasruddin; Budihardjo

    2018-03-01

    A room chamber is the most important element in building a good testing laboratory. In this study, 2-D modeling was conducted to assess the effect of inlet placement when designing a room chamber for testing the energy consumption of household refrigerators. The chamber geometry is rectangular and approximates enclosure conditions. An inlet on the side wall, parallel to the outlet, was compared with an inlet placed at the bottom. The purpose of this study was to determine and characterize the airflow in the room chamber using CFD simulation. The CFD method provides detailed flow characteristics in the form of flow velocity vectors and the temperature distribution inside the chamber. The results show that positioning the inlet parallel to the outlet prevents the air from moving freely along the floor; instead, the flow moves straight up toward the outlet. With the inlet at the bottom, the air can move freely from the bottom up along the chamber walls, which helps to make the flow uniform.

  15. Programmable DNA tile self-assembly using a hierarchical sub-tile strategy.

    PubMed

    Shi, Xiaolong; Lu, Wei; Wang, Zhiyu; Pan, Linqiang; Cui, Guangzhao; Xu, Jin; LaBean, Thomas H

    2014-02-21

    DNA tile based self-assembly provides a bottom-up approach to construct desired nanostructures. DNA tiles have been directly constructed from ssDNA and readily self-assembled into 2D lattices and 3D superstructures. However, for more complex lattice designs including algorithmic assemblies requiring larger tile sets, a more modular approach could prove useful. This paper reports a new DNA 'sub-tile' strategy to easily create whole families of programmable tiles. Here, we demonstrate the stability and flexibility of our sub-tile structures by constructing 3-, 4- and 6-arm DNA tiles that are subsequently assembled into 2D lattices and 3D nanotubes according to a hierarchical design. Assembly of sub-tiles, tiles, and superstructures was analyzed using polyacrylamide gel electrophoresis and atomic force microscopy. DNA tile self-assembly methods provide a bottom-up approach to create desired nanostructures; the sub-tile strategy adds a useful new layer to this technique. Complex units can be made from simple parts. The sub-tile approach enables the rapid redesign and prototyping of complex DNA tile sets and tiles with asymmetric designs.

  16. Implementation and testing of a Deep Water Correlation Velocity Sonar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickey, F.R.; Bookheimer, W.C.; Rhoades, K.W.

    1983-05-01

    The paper describes a new sonar designated the Magnavox MX 810 Deep Water Correlation Sonar, which is under development by the General Electric Company and the Magnavox Advanced Products and Systems Company. The sonar measures the ship's velocity relative to the bottom, but instead of using the conventional doppler effect it uses the correlation method described by Dickey and Edward in 1978. In this method, the narrow beams required for doppler are not needed, and a low frequency that penetrates to the bottom in deep water is used. The sonar was designed under the constraint that it use a transducer that mounts through a single 12 inch gate valve. Most offshore geophysical surveys at present make use of an integrated navigation system with bottom-referenced velocity input from a doppler sonar which, because of limitations on the sonar's bottom-tracking range, has difficulty in areas where the water depth is greater than about 500 meters. The MX 810 provides bottom tracking in regions of much greater water depth. It may also be applied as an aid in the continuous positioning of a vessel over a fixed location, and it should prove useful as a more general navigation aid. The sonar is undergoing a series of tests using Magnavox's facilities for the purpose of verifying its performance and obtaining data to support and quantify planned improvements in both software and hardware. A prototype transducer of only 5 watts power output was used, but in spite of this low power, successful operation to depths of 1900 meters was obtained. Extrapolation to the system parameters to be implemented in production models predicts operation to depths of 5000 meters.

  17. Crossbar nanoarchitectonics of the crosslinked self-assembled monolayer

    PubMed Central

    2014-01-01

    A bottom-up approach was devised to build a crossbar device using the crosslinked self-assembled monolayer (SAM) of 5,5′-bis(mercaptomethyl)-2,2′-bipyridine-Ni2+ (BPD-Ni2+) on a gold surface. To avoid metal diffusion through the organic film, the author used (i) nanoscale bottom electrodes, to reduce the probability of defects on the bottom electrodes, and (ii) molecular crosslinking technology, to prevent metal diffusion through the SAMs. The properties of the crosslinked self-assembled monolayer were determined by XPS. I-V characteristics of the device show thermally activated hopping transport. The implementation of this type of architecture will open up new vistas for a new class of devices for transport, storage, and computing. PMID:24994952

  18. Evaluation of modeling NO2 concentrations driven by satellite-derived and bottom-up emission inventories using in situ measurements over China

    NASA Astrophysics Data System (ADS)

    Liu, Fei; van der A, Ronald J.; Eskes, Henk; Ding, Jieying; Mijling, Bas

    2018-03-01

    Chemical transport models together with emission inventories are widely used to simulate NO2 concentrations over China, but validation of the simulations with in situ measurements has been extremely limited. Here we use ground measurements obtained from the air quality monitoring network recently developed by the Ministry of Environmental Protection of China to validate modeling surface NO2 concentrations from the CHIMERE regional chemical transport model driven by the satellite-derived DECSO and the bottom-up MIX emission inventories. We applied a correction factor to the observations to account for the interferences of other oxidized nitrogen compounds (NOz), based on the modeled ratio of NO2 to NOz. The model accurately reproduces the spatial variability in NO2 from in situ measurements, with a spatial correlation coefficient of over 0.7 for simulations based on both inventories. A negative and positive bias is found for the simulation with the DECSO (slope = 0.74 and 0.64 for the daily mean and daytime only) and the MIX (slope = 1.3 and 1.1) inventories, respectively, suggesting an underestimation and overestimation of NOx emissions from corresponding inventories. The bias between observed and modeled concentrations is reduced, with the slope dropping from 1.3 to 1.0 when the spatial distribution of NOx emissions in the DECSO inventory is applied as the spatial proxy for the MIX inventory, which suggests an improvement of the distribution of emissions between urban and suburban or rural areas in the DECSO inventory compared to that used in the bottom-up inventory. A rough estimate indicates that the observed concentrations, from sites predominantly placed in the populated urban areas, may be 10-40 % higher than the corresponding model grid cell mean. This reduces the estimate of the negative bias of the DECSO-based simulation to the range of -30 to 0 % on average and more firmly establishes that the MIX inventory is biased high over major cities. 
The performance of the model is comparable across seasons, with a slightly worse spatial correlation in summer due to the model's difficulty in resolving the more active NOx photochemistry and the larger concentration gradients in summer. In addition, the model captures the daytime diurnal cycle well but shows more significant disagreement between simulations and measurements during nighttime, which likely produces a positive model bias of about 15 % in the daily mean concentrations. This is most likely related to the uncertainty in vertical mixing in the model at night.
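    The NOz correction applied to the observations can be sketched as follows. The exact form used by the authors is not given in the abstract; this sketch assumes the monitor responds to NO2 plus interfering NOz, so the reading is scaled by the modeled fraction of the signal that is genuinely NO2. All concentration values are hypothetical.

```python
def correct_no2(measured, no2_model, noz_model):
    """Scale a monitor reading by the modeled fraction of the signal that
    is genuinely NO2 (assumed interference model: the chemiluminescence
    instrument responds to NO2 + NOz)."""
    return measured * no2_model / (no2_model + noz_model)

# Hypothetical concentrations in ug/m3: a 30.0 reading where the model
# says 20.0 is NO2 and 5.0 is interfering NOz.
corrected = correct_no2(measured=30.0, no2_model=20.0, noz_model=5.0)
print(round(corrected, 1))
```

    In practice the modeled NO2/NOz ratio varies by site, season and time of day, so the factor would be applied per grid cell and time step rather than as a single constant.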

  19. Evaluation of Modeling NO2 Concentrations Driven by Satellite-Derived and Bottom-Up Emission Inventories Using In-Situ Measurements Over China

    NASA Technical Reports Server (NTRS)

    Liu, Fei; van der A, Ronald J.; Eskes, Henk; Ding, Jieying; Mijling, Bas

    2018-01-01

    Chemical transport models together with emission inventories are widely used to simulate NO2 concentrations over China, but validation of the simulations with in situ measurements has been extremely limited. Here we use ground measurements obtained from the air quality monitoring network recently developed by the Ministry of Environmental Protection of China to validate modeling surface NO2 concentrations from the CHIMERE regional chemical transport model driven by the satellite-derived DECSO and the bottom-up MIX emission inventories. We applied a correction factor to the observations to account for the interferences of other oxidized nitrogen compounds (NOz), based on the modeled ratio of NO2 to NOz. The model accurately reproduces the spatial variability in NO2 from in situ measurements, with a spatial correlation coefficient of over 0.7 for simulations based on both inventories. A negative and positive bias is found for the simulation with the DECSO (slope = 0.74 and 0.64 for the daily mean and daytime only) and the MIX (slope = 1.3 and 1.1) inventories, respectively, suggesting an underestimation and overestimation of NOx emissions from corresponding inventories. The bias between observed and modeled concentrations is reduced, with the slope dropping from 1.3 to 1.0 when the spatial distribution of NOx emissions in the DECSO inventory is applied as the spatial proxy for the MIX inventory, which suggests an improvement of the distribution of emissions between urban and suburban or rural areas in the DECSO inventory compared to that used in the bottom-up inventory. A rough estimate indicates that the observed concentrations, from sites predominantly placed in the populated urban areas, may be 10-40% higher than the corresponding model grid cell mean. This reduces the estimate of the negative bias of the DECSO-based simulation to the range of -30 to 0% on average and more firmly establishes that the MIX inventory is biased high over major cities. 
The performance of the model is comparable across seasons, with a slightly worse spatial correlation in summer due to the model's difficulty in resolving the more active NOx photochemistry and the larger concentration gradients in summer. In addition, the model captures the daytime diurnal cycle well but shows more significant disagreement between simulations and measurements during nighttime, which likely produces a positive model bias of about 15% in the daily mean concentrations. This is most likely related to the uncertainty in vertical mixing in the model at night.

  20. Distributed Memory Breadth-First Search Revisited: Enabling Bottom-Up Search

    DTIC Science & Technology

    2013-01-03

    [2] W. McLendon III, B. Hendrickson, S. J. Plimpton, and L. Rauchwerger, “Finding strongly connected components in distributed graphs,” J… Performing organization: University of California at Berkeley, Electrical…
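    The technique named in the record's title can be sketched as follows: in a bottom-up BFS step, each unvisited vertex scans its own neighbors looking for a parent in the current frontier, rather than the frontier pushing outward; the scan stops at the first hit, which is where the savings come from on large frontiers. This is an illustrative single-strategy, single-machine version; the actual direction-optimizing algorithm switches between top-down and bottom-up steps per level and runs on distributed memory.

```python
def bfs_bottom_up(adj, source):
    """Level-synchronous BFS where every step is 'bottom-up': each
    unvisited vertex scans its neighbors for a parent in the current
    frontier. Returns a parent map (source is its own parent)."""
    parent = {source: source}
    frontier = {source}
    while frontier:
        next_frontier = set()
        for v in adj:
            if v in parent:
                continue                     # already discovered
            for u in adj[v]:
                if u in frontier:            # found a parent in the frontier
                    parent[v] = u
                    next_frontier.add(v)
                    break                    # stop scanning: the key saving
        frontier = next_frontier
    return parent

# Small undirected example graph as an adjacency dict
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
parents = bfs_bottom_up(adj, 0)
print(parents)
```

    Scanning every unvisited vertex each level is wasteful when the frontier is small, which is exactly why the full algorithm uses the top-down strategy for small frontiers and the bottom-up strategy for large ones.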

  1. Density- and trait-mediated top-down effects modify bottom-up control of a highly endemic tropical aquatic food web

    Treesearch

    C. M. Dalton; A. Mokiao-Lee; T. S. Sakihara; M. G. Weber; C. A. Roco; Z. Han; B. Dudley; R. A. MacKenzie; N. G. Hairston Jr.

    2013-01-01

    Benthic invertebrates mediate bottom-up and top-down influences in aquatic food webs, and changes in the abundance or traits of invertebrates can alter the strength of top-down effects. Studies assessing the role of invertebrate abundance and behavior as controls on food web structure are rare at the whole-ecosystem scale. Here we use a comparative approach to...

  2. The Civil-Military Gap in the United States. Does It Exist, Why, and Does It Matter?

    DTIC Science & Technology

    2007-01-01

    and/or the general public. Based on this framework, our analysis then compares the characteristics of military and civilian respondents using a...armed forces, three major force structure reviews (1990 Base Force, 1993 Bottom-Up Review, 1997 Quadrennial Defense Review) took place in the space...Defense Planning in a Decade of Change: Lessons from the Base Force, Bottom-Up Review, and Quadrennial Defense Review, Santa Monica, CA: RAND

  3. A New Global Anthropogenic SO2 Emission Inventory for the Last Decade: A Mosaic of Satellite-derived and Bottom-up Emissions

    NASA Astrophysics Data System (ADS)

    Liu, F.; Joiner, J.; Choi, S.; Krotkov, N. A.; Li, C.; Fioletov, V. E.; McLinden, C. A.

    2017-12-01

    Sulfur dioxide (SO2) measurements from the Ozone Monitoring Instrument (OMI) satellite sensor have been used to detect emissions from large point sources using an innovative estimation technique. Emissions from about 500 sources have been quantified individually based on OMI observations, accounting for about half of the total reported anthropogenic SO2 emissions. We developed a new emission inventory, OMI-HTAP, by combining these OMI-based emission estimates with the conventional bottom-up inventory. OMI-HTAP includes OMI-based estimates for over 400 point sources and is gap-filled with the emission grid map of the latest available global bottom-up emission inventory (HTAP v2.2) for the remaining sources. We have evaluated the OMI-HTAP inventory by performing simulations with the Goddard Earth Observing System version 5 (GEOS-5) model. The GEOS-5 simulated SO2 concentrations driven by both the HTAP and the OMI-HTAP inventories were compared against in-situ and satellite measurements. Results show that the OMI-HTAP inventory improves the model agreement with observations, in particular over the US, India and the Middle East. Additionally, simulations with the OMI-HTAP inventory capture the major trends of anthropogenic SO2 emissions over the world and highlight the influence of missing sources in the bottom-up inventory.

  4. Age-related decline in bottom-up processing and selective attention in the very old.

    PubMed

    Zhuravleva, Tatyana Y; Alperin, Brittany R; Haring, Anna E; Rentz, Dorene M; Holcomb, Philip J; Daffner, Kirk R

    2014-06-01

    Previous research demonstrating age-related deficits in selective attention has not included old-old adults, an increasingly important group to study. The current investigation compared event-related potentials in 15 young-old (65-79 years old) and 23 old-old (80-99 years old) subjects during a color-selective attention task. Subjects responded to target letters in a specified color (Attend) while ignoring letters in a different color (Ignore) under both low and high loads. There were no group differences in visual acuity, accuracy, reaction time, or latency of early event-related potential components. The old-old group showed a disruption in bottom-up processing, indexed by a substantially diminished posterior N1 (smaller amplitude). They also demonstrated markedly decreased modulation of bottom-up processing based on selected visual features, indexed by the posterior selection negativity (SN), with similar attenuation under both loads. In contrast, there were no group differences in frontally mediated attentional selection, measured by the anterior selection positivity (SP). There was a robust inverse relationship between the size of the SN and SP (the smaller the SN, the larger the SP), which may represent an anteriorly supported compensatory mechanism. In the absence of a decline in top-down modulation indexed by the SP, the diminished SN may reflect age-related degradation of early bottom-up visual processing in old-old adults.

  5. Habitat selection models for Pacific sand lance (Ammodytes hexapterus) in Prince William Sound, Alaska

    USGS Publications Warehouse

    Ostrand, William D.; Gotthardt, Tracey A.; Howlin, Shay; Robards, Martin D.

    2005-01-01

    We modeled habitat selection by Pacific sand lance (Ammodytes hexapterus) by examining their distribution in relation to water depth, distance to shore, bottom slope, bottom type, distance from sand bottom, and shoreline type. Through both logistic regression and classification tree models, we compared the characteristics of 29 known sand lance locations to 58 randomly selected sites. The best models indicated a strong selection of shallow water by sand lance, with weaker association between sand lance distribution and beach shorelines, sand bottoms, distance to shore, bottom slope, and distance to the nearest sand bottom. We applied an information-theoretic approach to the interpretation of the logistic regression analysis and determined importance values of 0.99, 0.54, 0.52, 0.44, 0.39, and 0.25 for depth, beach shorelines, sand bottom, distance to shore, gradual bottom slope, and distance to the nearest sand bottom, respectively. The classification tree model indicated that sand lance selected shallow-water habitats and remained near sand bottoms when located in habitats with depths between 40 and 60 m. All sand lance locations were at depths <60 m and 93% occurred at depths <40 m. Probable reasons for the modeled relationships between the distribution of sand lance and the independent variables are discussed.
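    The information-theoretic importance values reported above are typically computed by summing Akaike weights over all candidate models that contain a given variable. A minimal sketch of that calculation follows, with a hypothetical candidate-model set and AIC values, not the paper's actual model table.

```python
import math

def akaike_weights(aics):
    """Akaike weights from a list of AIC values: exp(-delta_i / 2),
    normalized to sum to one."""
    deltas = [a - min(aics) for a in aics]
    w = [math.exp(-0.5 * d) for d in deltas]
    total = sum(w)
    return [wi / total for wi in w]

def importance(models, aics):
    """Variable importance: sum of the Akaike weights of every candidate
    model that includes the variable."""
    w = akaike_weights(aics)
    variables = {v for m in models for v in m}
    return {v: sum(wi for m, wi in zip(models, w) if v in m)
            for v in variables}

# Hypothetical candidate models (tuples of predictors) and AIC values
models = [("depth",), ("depth", "beach"), ("beach",), ("depth", "sand")]
aics = [100.0, 99.0, 106.0, 101.0]
imp = importance(models, aics)
print({v: round(x, 2) for v, x in sorted(imp.items())})
```

    A variable that appears in all well-supported models, as depth does here and in the paper (importance 0.99), accumulates nearly all of the weight.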

  6. Gravitropism in Higher Plant Shoots 1

    PubMed Central

    Wheeler, Raymond M.; White, Rosemary G.; Salisbury, Frank B.

    1986-01-01

    Ethylene at 1.0 and 10.0 cubic centimeters per cubic meter decreased the rate of gravitropic bending in stems of cocklebur (Xanthium strumarium L.) and tomato (Lycopersicon esculentum Mill), but 0.1 cubic centimeter per cubic meter ethylene had little effect. Treating cocklebur plants with 1.0 millimolar aminoethoxyvinylglycine (AVG) (ethylene synthesis inhibitor) delayed stem bending compared with controls, but adding 0.1 cubic centimeter per cubic meter ethylene in the surrounding atmosphere (or applying 0.1% ethephon solution) partially restored the rate of bending of AVG-treated plants. Ethylene increases in bending stems, and AVG inhibits this. Virtually all newly synthesized ethylene appeared in bottom halves of horizontal stems, where ethylene concentrations were as much as 100 times those in upright stems or in top halves of horizontal stems. This was especially true when horizontal stems were physically restrained from bending. Ethylene might promote cell elongation in bottom tissues of a horizontal stem or indicate other factors there (e.g. a large amount of 'functioning' auxin). Or top and bottom tissues may become differentially sensitive to ethylene. Auxin applied to one side of a vertical stem caused extreme bending away from that side; gibberellic acid, kinetin, and abscisic acid were without effect. Acidic ethephon solutions applied to one side of young seedlings of cocklebur, tomato, sunflower (Helianthus annuus L.), and soybean (Glycine max [L.] Merr.) caused bending away from that side, but neutral ethephon solutions did not cause bending. Buffered or unbuffered acid (HCl) caused similar bending. Neutral ethephon solutions produced typical ethylene symptoms (i.e. epinasty, inhibition of stem elongation). HCl or acidic ethephon applied to the top of horizontal stems caused downward bending, but these substances applied to the bottom of such stems inhibited growth and upward bending—an unexpected result. PMID:11539089

  7. Association of choline and betaine levels with cancer incidence and survival: A meta-analysis.

    PubMed

    Youn, Jiyoung; Cho, Eunyoung; Lee, Jung Eun

    2018-03-22

    Evidence suggests a possible link between betaine and choline, both methyl group donors, and cancer progression. We examined the association between choline and betaine levels and cancer incidence and survival in a meta-analysis of observational studies. We identified observational studies examining the association between choline and/or betaine levels, from diet or blood, and cancer incidence and survival by searching the PubMed and Web of Science databases for studies published up to January 2018. After applying the selection criteria, 28 observational studies (9 case-control, 1 cross-sectional, and 18 cohort studies) were included. Relative risks (RRs) and 95% confidence intervals (CIs) were extracted, and combined RRs were calculated using random-effects models. Choline levels were not associated with cancer incidence in a meta-analysis of cohort studies. Betaine levels reduced the risk of cancer incidence in a meta-analysis of cohort studies; the combined RR (95% CI) comparing the top with the bottom categories was 0.93 (0.87-0.99). When we analyzed separately according to exposure assessment method, the combined RRs comparing the top with the bottom categories of betaine levels were 0.87 (95% CI: 0.78-0.95) for dietary betaine and 0.88 (95% CI: 0.77-0.99) for blood betaine. Neither choline nor betaine levels were significantly associated with cancer survival. We conclude that high betaine levels are associated with a lower risk of cancer incidence, especially for colorectal cancer. Copyright © 2018 Elsevier Ltd and European Society for Clinical Nutrition and Metabolism. All rights reserved.
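    Combined RRs of the kind reported above are commonly obtained with an inverse-variance random-effects model. The sketch below uses the DerSimonian-Laird estimator on the log scale; the per-study RRs and standard errors are hypothetical and are not the meta-analysis data.

```python
import math

def random_effects_rr(rrs, ses):
    """DerSimonian-Laird random-effects pooling of relative risks.
    `rrs` are per-study RRs; `ses` are standard errors of the log-RRs.
    Returns the pooled RR and its 95 % confidence interval."""
    y = [math.log(r) for r in rrs]
    w = [1.0 / (s * s) for s in ses]                 # fixed-effect weights
    ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - ybar) ** 2 for wi, yi in zip(w, y))
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)          # between-study variance
    w_re = [1.0 / (s * s + tau2) for s in ses]       # random-effects weights
    mu = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return math.exp(mu), (math.exp(mu - 1.96 * se), math.exp(mu + 1.96 * se))

# Hypothetical per-study RRs and log-RR standard errors
rr, ci = random_effects_rr([0.90, 0.95, 0.88, 1.02], [0.05, 0.08, 0.06, 0.10])
print(round(rr, 2), tuple(round(x, 2) for x in ci))
```

    When the heterogeneity statistic Q is below its degrees of freedom, the between-study variance estimate is truncated to zero and the result coincides with the fixed-effect pooled RR, as happens with these example numbers.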

  8. Milk Bottom-Up Proteomics: Method Optimization

    PubMed Central

    Vincent, Delphine; Ezernieks, Vilnis; Elkins, Aaron; Nguyen, Nga; Moate, Peter J.; Cocks, Benjamin G.; Rochfort, Simone

    2016-01-01

    Milk is a complex fluid whose proteome displays a diverse set of proteins of high abundance, such as caseins, and of medium to low abundance, such as the whey proteins β-lactoglobulin, lactoferrin, immunoglobulins, glycoproteins, peptide hormones, and enzymes. A sample preparation method that enables high reproducibility and throughput is key to reliably identifying proteins present or proteins responding to conditions such as diet, health, or genetics. Using skim milk samples from Jersey and Holstein-Friesian cows, we compared three extraction procedures that had not previously been applied to samples of cows' milk. Method A (urea) involved a simple dilution of the milk in a urea-based buffer, method B (TCA/acetone) involved a trichloroacetic acid (TCA)/acetone precipitation, and method C (methanol/chloroform) involved a tri-phasic partition method in chloroform/methanol solution. Protein assays, SDS-PAGE profiling, and trypsin digestion followed by nanoHPLC-electrospray ionization-tandem mass spectrometry (nLC-ESI-MS/MS) analyses were performed to assess their efficiency. Replicates were used at each analytical step (extraction, digestion, injection) to assess reproducibility. Mass spectrometry (MS) data are available via ProteomeXchange with identifier PXD002529. Overall, 186 unique accessions, major and minor proteins, were identified with a combination of methods. Method C (methanol/chloroform) yielded the best-resolved SDS patterns and highest protein recovery rates, method A (urea) yielded the greatest number of accessions, and, of the three procedures, method B (TCA/acetone) was the least compatible with a wide range of downstream analytical procedures. Our results also highlighted breed differences between the proteins in milk of Jersey and Holstein-Friesian cows. PMID:26793233

  9. Developing the evidence base for mainstreaming adaptation of stormwater systems to climate change.

    PubMed

    Gersonius, B; Nasruddin, F; Ashley, R; Jeuken, A; Pathirana, A; Zevenbergen, C

    2012-12-15

    In a context of high uncertainty about hydro-climatic variables, the development of updated methods for climate impact and adaptation assessment is as important as, if not more important than, the provision of improved climate change data. In this paper, we introduce a hybrid method to facilitate mainstreaming adaptation of stormwater systems to climate change: the Mainstreaming method. The Mainstreaming method starts with an effect-based analysis of adaptation tipping points (ATPs). These are points of reference where the magnitude of climate change is such that acceptable technical, environmental, societal or economic standards may be compromised. It extends the ATP analysis to include aspects from a bottom-up approach. The extension concerns the analysis of adaptation opportunities in the stormwater system. The results from both analyses are then used in combination to identify and exploit Adaptation Mainstreaming Moments (AMMs). Use of this method will enhance the understanding of the adaptive potential of stormwater systems. We have applied the proposed hybrid method to the management of flood risk for an urban stormwater system in Dordrecht (the Netherlands). The main finding of this case study is that the application of the Mainstreaming method helps to increase the no-/low-regret character of adaptation for several reasons: it focuses the attention on the most urgent effects of climate change; it is expected to lead to potential cost reductions, since adaptation options can be integrated into infrastructure and building design at an early stage instead of being applied separately; it will lead to the development of area-specific responses, which could not have been developed on a higher scale level; it makes it possible to take account of local values and sensibilities, which contributes to increased public and political support for the adaptive strategies. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. The REFINEMENT Glossary of Terms: An International Terminology for Mental Health Systems Assessment.

    PubMed

    Montagni, Ilaria; Salvador-Carulla, Luis; Mcdaid, David; Straßmayr, Christa; Endel, Florian; Näätänen, Petri; Kalseth, Jorid; Kalseth, Birgitte; Matosevic, Tihana; Donisi, Valeria; Chevreul, Karine; Prigent, Amélie; Sfectu, Raluca; Pauna, Carmen; Gutiérrez-Colosia, Mencia R; Amaddeo, Francesco; Katschnig, Heinz

    2018-03-01

    Comparing mental health systems across countries is difficult because of the lack of an agreed upon terminology covering services and related financing issues. Within the European Union project REFINEMENT, international mental health care experts applied an innovative mixed "top-down" and "bottom-up" approach following a multistep design thinking strategy to compile a glossary on mental health systems, using local services as pilots. The final REFINEMENT glossary consisted of 432 terms related to service provision, service utilisation, quality of care and financing. The aim of this study was to describe the iterative process and methodology of developing this glossary.

  11. 24 CFR 3285.804 - Bottom board repair.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Bottom board repair. 3285.804 Section 3285.804 Housing and Urban Development Regulations Relating to Housing and Urban Development... URBAN DEVELOPMENT MODEL MANUFACTURED HOME INSTALLATION STANDARDS Exterior and Interior Close-Up § 3285...

  12. 14 CFR 29.501 - Ground loading conditions: landing gear with skids.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) The ground reactions rationally distributed along the bottom of the skid tube. (b) Vertical reactions... along the bottom of both skids, the vertical reactions must be applied as prescribed in paragraph (a) of this section. (c) Drag reactions in the level landing attitude. In the level attitude, and with the...

  13. 14 CFR 29.501 - Ground loading conditions: landing gear with skids.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) The ground reactions rationally distributed along the bottom of the skid tube. (b) Vertical reactions... along the bottom of both skids, the vertical reactions must be applied as prescribed in paragraph (a) of this section. (c) Drag reactions in the level landing attitude. In the level attitude, and with the...

  14. 14 CFR 27.501 - Ground loading conditions: landing gear with skids.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...) The ground reactions rationally distributed along the bottom of the skid tube. (b) Vertical reactions... along the bottom of both skids, the vertical reactions must be applied as prescribed in paragraph (a) of this section. (c) Drag reactions in the level landing attitude. In the level attitude, and with the...

  15. 14 CFR 27.501 - Ground loading conditions: landing gear with skids.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...) The ground reactions rationally distributed along the bottom of the skid tube. (b) Vertical reactions... along the bottom of both skids, the vertical reactions must be applied as prescribed in paragraph (a) of this section. (c) Drag reactions in the level landing attitude. In the level attitude, and with the...

  16. Depth investigation of rapid sand filters for drinking water production reveals strong stratification in nitrification biokinetic behavior.

    PubMed

    Tatari, K; Smets, B F; Albrechtsen, H-J

    2016-09-15

    The biokinetic behavior of NH4(+) removal was investigated at different depths of a rapid sand filter treating groundwater for drinking water preparation. Filter materials from the top, middle and bottom layers of a full-scale filter were exposed to various controlled NH4(+) loadings in a continuous-flow lab-scale assay. NH4(+) removal capacity, estimated from short-term loading up-shifts, was at least 10 times higher in the top than in the middle and bottom filter layers, consistent with the stratification of Ammonium Oxidizing Bacteria (AOB). AOB density increased consistently with the NH4(+) removal rate, indicating their primary role in nitrification under the imposed experimental conditions. The maximum AOB cell-specific NH4(+) removal rate observed at the bottom was at least 3 times lower than in the top and middle layers. Additionally, under increased loading conditions, the top and middle layers displayed a significant up-shift capacity (4.6- and 3.5-fold, respectively), but the bottom layer did not. Hence, AOB with different physiological responses were active at the different depths. The biokinetic analysis predicted that despite the low NH4(+) removal capacity at the bottom layer, the entire filter is able to cope with a 4-fold instantaneous loading increase without compromising the effluent NH4(+). Ultimately, this filter up-shift capacity was limited by the density of AOB and their biokinetic behavior, both of which were strongly stratified. Copyright © 2016 Elsevier Ltd. All rights reserved.
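
    The prediction that the whole filter copes with a 4-fold loading increase amounts to comparing summed layer-wise removal capacity against the up-shifted influent loading. The sketch below uses hypothetical placeholder rates chosen only to reflect the roughly 10-fold top-versus-bottom stratification reported; they are not the study's measured values.

```python
# Hypothetical layer-wise maximum NH4+ removal rates (g NH4-N per m3 filter per h),
# ~10x higher in the top layer, as in the reported stratification
layer_capacity = {"top": 10.0, "middle": 1.0, "bottom": 1.0}

def can_handle(loading, capacities):
    """True if the summed filter removal capacity covers the influent NH4+ loading."""
    return sum(capacities.values()) >= loading

baseline_loading = 2.0   # g NH4-N / m3 / h, hypothetical baseline
upshift = 4              # instantaneous 4-fold loading increase
print(can_handle(baseline_loading * upshift, layer_capacity))  # → True
```
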

  17. Bottom-up laboratory testing of the DKIST Visible Broadband Imager (VBI)

    NASA Astrophysics Data System (ADS)

    Ferayorni, Andrew; Beard, Andrew; Cole, Wes; Gregory, Scott; Wöeger, Friedrich

    2016-08-01

    The Daniel K. Inouye Solar Telescope (DKIST) is a 4-meter solar observatory under construction at Haleakala, Hawaii [1]. The Visible Broadband Imager (VBI) is a first-light instrument that will record images at the highest possible spatial and temporal resolution of the DKIST at a number of scientifically important wavelengths [2]. The VBI is a pathfinder for DKIST instrumentation and a test bed for developing processes and procedures in the areas of unit, systems integration, and user acceptance testing. These test procedures have been developed and repeatedly executed during VBI construction in the lab as part of a "test early and test often" philosophy aimed at identifying and resolving issues early, thus saving cost during integration, test, and commissioning on the summit. The VBI team recently completed a bottom-up end-to-end system test of the instrument in the lab that allowed the instrument's functionality, performance, and usability to be validated against documented system requirements. The bottom-up testing approach includes four levels of testing, each introducing another layer in the control hierarchy that is tested before moving to the next level. First, the instrument mechanisms are tested for positioning accuracy and repeatability using a laboratory position-sensing detector (PSD). Second, the real-time motion controls are used to drive the mechanisms to verify that speed and timing synchronization requirements are being met. Next, the high-level software is introduced and the instrument is driven through a series of end-to-end tests that exercise the mechanisms, cameras, and simulated data processing. Finally, user acceptance testing is performed on operational and engineering use cases through the instrument engineering graphical user interface (GUI). In this paper we present the VBI bottom-up test plan, procedures, example test cases and tools used, as well as results from test execution in the laboratory. We will also discuss the benefits realized through completion of this testing, and share lessons learned from the bottom-up testing process.

  18. Atmospheric Carbon Tetrachloride: Mysterious Emissions Gap Almost Closed

    NASA Astrophysics Data System (ADS)

    Liang, Q.; Newman, P. A.; Reimann, S.

    2016-12-01

    Carbon tetrachloride (CCl4) is a major ozone-depleting substance and its production and consumption is controlled under the Montreal Protocol for emissive uses. The most recent WMO/UNEP Scientific Assessment of Ozone Depletion [WMO, 2014] estimated a 2007-2012 CCl4 bottom-up emission of 1-4 Gg yr-1, based on country-by-country reports to UNEP, vs. a global top-down emissions estimate of 57 Gg yr-1, based on atmospheric measurements. To understand the gap between the top-down and bottom-up emissions estimates, a CCl4 activity was formed under the auspices of the Stratosphere-Troposphere Processes And their Role in Climate (SPARC) project. Several new findings were brought forward by the SPARC CCl4 activity. CCl4 is destroyed in the stratosphere, oceans, and soils. The total lifetime estimate has been increased from 26 to 33 years. The new 33-year total lifetime lowers the top-down emissions estimate to 40 (25-55) Gg yr-1. In addition, a persistent hemispheric difference implies substantial ongoing Northern Hemisphere emissions, yielding an independent emissions estimate of 30 Gg yr-1. The combination of these two yields an emissions estimate of 35 Gg yr-1. Regional estimates have been made for Australia, North America, East Asia, and Western Europe. The sum of these estimates results in emissions of 21 Gg yr-1, although this does not include all regions of the world. Four bottom-up CCl4 emissions pathways have been identified, i.e., fugitive, unreported non-feedstock, unreported inadvertent, and legacy emissions. The new industrial bottom-up emissions estimate includes emissions from chloromethanes plants (13 Gg yr-1) and feedstock fugitive emissions (2 Gg yr-1). When combined with legacy emissions and unreported inadvertent emissions (~10 Gg yr-1), the total global emissions are 20±5 Gg yr-1. While the new bottom-up value is still less than the aggregated top-down values, these estimates reconcile the CCl4 budget discrepancy when considered at the edges of their uncertainties.
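
    The reconciliation claim in the last sentence amounts to checking whether the two uncertainty intervals meet. A minimal sketch using the abstract's numbers:

```python
# Estimates from the abstract, in Gg/yr
top_down_range = (25, 55)        # global top-down: 40 (25-55) Gg/yr
bottom_up_total, unc = 20, 5     # industrial + legacy + inadvertent: 20 ± 5 Gg/yr
bottom_up_range = (bottom_up_total - unc, bottom_up_total + unc)

def intervals_touch(a, b):
    """True if two (lo, hi) intervals overlap or meet at an endpoint."""
    return a[0] <= b[1] and b[0] <= a[1]

print(bottom_up_range, intervals_touch(bottom_up_range, top_down_range))
# → (15, 25) True
```

    The intervals meet exactly at 25 Gg/yr, which is the precise sense in which the budget "reconciles at the edges of the uncertainties".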

  19. Nested Global Inversion for the Carbon Flux Distribution in Canada and USA from 1994 to 2003

    NASA Astrophysics Data System (ADS)

    Chen, J. M.; Deng, F.; Ishizawa, M.; Ju, W.; Mo, G.; Chan, D.; Higuchi, K.; Maksyutov, S.

    2007-12-01

    Based on TransCom inverse modeling for 22 global regions, we developed a nested global inversion system for estimating carbon fluxes of 30 regions in North America (2 of the 22 regions are divided into 30). Irregular boundaries of these 30 regions are delineated based on ecosystem types and provincial/state borders. Synthesis Bayesian inversion is conducted in monthly steps using CO2 concentration measurements at 88 coastal and continental stations of the globe for the 1994-2003 period (NOAA GlobalView database). Responses of these stations to carbon fluxes from the 50 regions are simulated using the transport model of the National Institute for Environmental Studies of Japan and reanalysis wind fields of the National Centers for Environmental Prediction (NCEP). Terrestrial carbon flux fields modeled using BEPS and Biome-BGC driven by NCEP reanalysis meteorological data are used as two alternative a priori estimates to constrain the inversion. The inversion (top-down) results are compared with remote sensing-based ecosystem modeling (bottom-up) results in Canada's forests and wetlands. There is a broad consistency in the spatial pattern of the carbon source and sink distributions obtained using these two independent methods. Both sets of results also indicate that Canada's forests and wetlands are carbon sinks in 1994-2003, but the top-down method produces consistently larger sinks than the bottom-up results. Reasons for this discrepancy may lie in both methods, and several issues are identified for further investigation.
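
    In its simplest linear-Gaussian form, a synthesis Bayesian inversion of the kind described above reduces to a matrix update of the prior fluxes by the observation-minus-simulation mismatch. The toy dimensions, covariances, and "true" fluxes below are invented for illustration; the study itself uses 88 stations and 50 regions with transport-model response functions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: 5 stations, 3 flux regions (the study uses 88 and 50)
n_obs, n_reg = 5, 3
K = rng.normal(size=(n_obs, n_reg))     # station responses to regional fluxes
s_prior = np.array([1.0, -0.5, 0.2])    # a priori regional fluxes (arbitrary units)
S_prior = np.eye(n_reg) * 0.5           # prior flux error covariance
R = np.eye(n_obs) * 0.1                 # observation error covariance

s_true = np.array([0.8, -0.2, 0.4])     # "truth" used to generate observations
c_obs = K @ s_true                       # synthetic station concentrations

# Bayesian synthesis inversion: posterior fluxes and posterior covariance
G = S_prior @ K.T @ np.linalg.inv(K @ S_prior @ K.T + R)  # gain matrix
s_post = s_prior + G @ (c_obs - K @ s_prior)
S_post = (np.eye(n_reg) - G @ K) @ S_prior
print(s_post.round(2))
```

    The posterior covariance diagonal is never larger than the prior's, which is how the observations "constrain" the fluxes regardless of whether BEPS or Biome-BGC supplies the prior.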

  20. Measurements of traffic emissions over a medium-sized city using long-path measurements and comparison against bottom-up city estimates

    NASA Astrophysics Data System (ADS)

    Waxman, E.; Cossel, K.; Truong, G. W.; Giorgetta, F.; Swann, W.; Coddington, I.; Newbury, N.

    2017-12-01

    Understanding emissions from cities is increasingly important as a growing fraction of the world's population moves to cities. Here we use a novel technology, dual frequency comb spectroscopy, to measure city emissions using a long outdoor open path. We simultaneously measured CO2, CH4, and H2O over the city of Boulder, Colorado and over a clean-air reference path for two months in the fall of 2016. Because of the spatial coverage of our measurements, the layout of the city and power plant locations, and the predominant wind direction, our measurements primarily pick up vehicle emissions. We choose two days with consistent CO2 enhancements over the city relative to the reference path and use a simple 0-D box model to calculate city emissions for these days. We scale these up to annual emissions and compare our measurements with the City of Boulder bottom-up vehicle emissions inventory based on total vehicle miles traveled, fuel efficiency, and vehicle type distribution. We find good agreement (within about a factor of two) between our top-down measurements and the city's bottom-up inventory value.
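
    A 0-D box model like the one mentioned above balances the city's emissions against the advective outflow of the measured enhancement: emissions ≈ ΔC × wind speed × mixing height × crosswind width. Every number below is an illustrative placeholder, not a value from the paper.

```python
# Minimal 0-D box-model sketch for city CO2 emissions (all inputs hypothetical)
M_CO2 = 44.01e-3                    # kg/mol
R, T, P = 8.314, 288.0, 101325.0    # gas constant, temperature (K), pressure (Pa)

delta_co2_ppm = 2.0     # CO2 enhancement over the city vs the clean-air path
wind_speed = 2.0        # m/s, mean advection through the box
mix_height = 300.0      # m, boundary-layer (box) height
box_width = 3000.0      # m, crosswind extent of the city

# Convert the ppm enhancement to a mass concentration (kg/m^3) via ideal gas law
mol_air_per_m3 = P / (R * T)
delta_c = delta_co2_ppm * 1e-6 * mol_air_per_m3 * M_CO2

# Steady state: emissions into the box equal outflow of the enhancement
emission_kg_per_s = delta_c * wind_speed * mix_height * box_width
emission_kt_per_yr = emission_kg_per_s * 3600 * 24 * 365 / 1e6
print(round(emission_kt_per_yr))
```

    Scaling a few well-characterized days to an annual total, as the paper does, then allows comparison with a bottom-up inventory built from vehicle miles traveled and fleet fuel efficiency.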

Top