Integral Design Methodology of Photocatalytic Reactors for Air Pollution Remediation.
Passalía, Claudio; Alfano, Orlando M; Brandi, Rodolfo J
2017-06-07
An integral reactor design methodology was developed to address the optimal design of photocatalytic wall reactors to be used in air pollution control. For a target pollutant to be eliminated from an air stream, the proposed methodology starts from a mechanistically derived reaction rate. The intrinsic kinetic parameters are determined in a simple-geometry, laboratory-scale reactor operated under kinetic control and uniform incident radiation flux, which allows the local superficial rate of photon absorption to be computed. Thus, a simple model can describe the mass balance and a solution may be obtained. The kinetic parameters may be estimated by combining the mathematical model and the experimental results. The validated intrinsic kinetics obtained may be used directly in the scale-up of any reactor configuration and size. The bench-scale reactor may require the use of complex computational software to obtain the fields of velocity, radiation absorption and species concentration. The complete methodology was successfully applied to the elimination of airborne formaldehyde. The kinetic parameters were determined in a flat plate reactor, whilst a bench-scale corrugated-wall reactor was used to illustrate the scale-up methodology. In addition, an optimal folding angle of the corrugated reactor was found using computational fluid dynamics tools.
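The parameter-estimation step described above amounts to a nonlinear least-squares fit of a mechanistic rate expression to the flat-plate reactor data. A minimal sketch follows; the Langmuir-Hinshelwood-type rate form, the parameter names and all numerical values are illustrative assumptions, not the authors' actual model or data:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical flat-plate reactor data taken under kinetic control and
# uniform irradiation (all values are made up for illustration).
Q = 2.0e-4       # volumetric flow rate, m^3/s
A_cat = 0.01     # irradiated catalytic area, m^2
e_a = 1.0e-6     # uniform local superficial rate of photon absorption (LSRPA)
C_out = np.array([0.5, 1.0, 2.0, 4.0, 8.0]) * 1e-3    # outlet conc., mol/m^3
r_obs = np.array([0.8, 1.4, 2.2, 3.0, 3.6]) * 1e-7    # observed rate, Q*(C_in - C_out)/A_cat

# Illustrative Langmuir-Hinshelwood rate, first order in photon absorption.
def rate(C, k, K):
    return k * e_a * K * C / (1.0 + K * C)

(k_fit, K_fit), _ = curve_fit(rate, C_out, r_obs, p0=[0.5, 500.0])
print(f"k = {k_fit:.3g}, K = {K_fit:.3g}")
```

The fitted intrinsic parameters would then be carried unchanged into the model of the bench-scale corrugated-wall reactor.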
ERIC Educational Resources Information Center
Moore, Keith M.; Lamb, Jennifer N.; Sikuku, Dominic Ngosia; Ashilenje, Dennis S.; Laker-Ojok, Rita; Norton, Jay
2014-01-01
Purpose: This article investigates the extent of multiple knowledges among smallholders and connected non-farm agents around Mount Elgon in Kenya and Uganda in order to build the communicative competence needed to scale up conservation agriculture production systems (CAPS). Design/methodology/approach: Our methodological approach examines local…
Gas-solid fluidized bed reactors: Scale-up, flow regimes identification and hydrodynamics
NASA Astrophysics Data System (ADS)
Zaid, Faraj Muftah
This research studied the scale-up, flow regime identification and hydrodynamics of fluidized beds using 6-inch and 18-inch diameter columns and different particles. One of the objectives was to advance the scale-up of gas-solid fluidized bed reactors by developing a new mechanistic methodology for hydrodynamic similarity based on matching the radial (diameter) profile of gas phase holdup, since gas dynamics dictate the hydrodynamics of these reactors. This has been successfully achieved. However, the scale-up methodology reported in the literature, based on matching selected dimensionless groups, was also examined; it was found that matching the dimensionless groups was not easy and, hence, there was some deviation in the hydrodynamics of the two fluidized beds studied. A new technique based on gamma ray densitometry (GRD) was successfully developed and utilized to monitor on-line the implementation of scale-up, to identify the flow regime, and to measure the radial (diameter) profiles of gas and solids holdups. CFD was demonstrated to be a valuable tool for implementing the newly developed scale-up methodology, based on finding the conditions that provide a similar or closer radial profile or cross-sectional distribution of the gas holdup. As gas velocity increases, the solids holdup in the center region of the column decreases in the fully developed region of both the 6-inch and 18-inch diameter columns. Solids holdup increased with increasing particle size and density. The upflowing particle velocity increased with the gas velocity, and its profile became steeper at high superficial gas velocity at all axial heights, where the centerline velocity became higher than that in the wall region. Smaller particle size and lower density gave larger upflowing particle velocities. The minimum fluidization velocity and the transition velocity from the bubbly to the churn-turbulent flow regime were found to be lower in the 18-inch diameter column than in the 6-inch diameter column. Also, the absolute fluctuation of the upflowing particle velocity multiplied by the solids holdup, one of the terms in the solids mass flux estimation, was found to be larger in the 18-inch diameter column than in the 6-inch diameter column for the same particle size and density.
A quality by design approach to scale-up of high-shear wet granulation process.
Pandey, Preetanshu; Badawy, Sherif
2016-01-01
High-shear wet granulation is a complex process, which in turn makes scale-up a challenging task. Scale-up of the high-shear wet granulation process has been studied extensively in the past, with various methodologies proposed in the literature. This review article discusses existing scale-up principles and categorizes the various approaches into two main scale-up strategies: parameter-based and attribute-based. With the advent of the quality by design (QbD) principle in the drug product development process, an increased emphasis toward the latter approach may be needed to ensure product robustness. In practice, a combination of both scale-up strategies is often utilized. In a QbD paradigm, there is also a need for an increased fundamental and mechanistic understanding of the process. This can be achieved either by increased experimentation, which comes at higher cost, or by using modeling techniques, which are also discussed as part of this review.
Understanding pathways for scaling up health services through the lens of complex adaptive systems.
Paina, Ligia; Peters, David H
2012-08-01
Despite increased prominence and funding of global health initiatives, efforts to scale up health services in developing countries are falling short of the expectations of the Millennium Development Goals. Arguing that the dominant assumptions for scaling up are inadequate, we propose that interpreting change in health systems through the lens of complex adaptive systems (CAS) provides better models of pathways for scaling up. Based on an understanding of CAS behaviours, we describe how phenomena such as path dependence, feedback loops, scale-free networks, emergent behaviour and phase transitions can uncover relevant lessons for the design and implementation of health policy and programmes in the context of scaling up health services. The implications include paying more attention to local context, incentives and institutions, as well as anticipating certain types of unintended consequences that can undermine scaling up efforts, and developing and implementing programmes that engage key actors through transparent use of data for ongoing problem-solving and adaptation. We propose that future efforts to scale up should adapt and apply the models and methodologies which have been used in other fields that study CAS, yet are underused in public health. This can help policy makers, planners, implementers and researchers to explore different and innovative approaches for reaching populations in need with effective, equitable and efficient health services. The old assumptions have led to disappointed expectations about how to scale up health services, and offer little insight on how to scale up effective interventions in the future. The alternative perspectives offered by CAS may better reflect the complex and changing nature of health systems, and create new opportunities for understanding and scaling up health services.
Parallelization of fine-scale computation in Agile Multiscale Modelling Methodology
NASA Astrophysics Data System (ADS)
Macioł, Piotr; Michalik, Kazimierz
2016-10-01
Nowadays, multiscale modelling of material behavior is an extensively developed area. An important obstacle to its wide application is its high computational demands. Among other solutions, the parallelization of multiscale computations is a promising one. Heterogeneous multiscale models are good candidates for parallelization, since communication between sub-models is limited. In this paper, the possibility of parallelizing multiscale models based on the Agile Multiscale Methodology framework is discussed. A sequential, FEM-based macroscopic model has been combined with concurrently computed fine-scale models employing the MatCalc thermodynamic simulator. The main issues investigated in this work are (i) the speed-up of multiscale models, with special focus on fine-scale computations, and (ii) the decrease in computation quality enforced by parallel execution. Speed-up has been evaluated on the basis of Amdahl's law. The problem of the `delay error', arising from the parallel execution of fine-scale sub-models controlled by the sequential macroscopic sub-model, is discussed. Some technical aspects of combining third-party commercial modelling software with an in-house multiscale framework and an MPI library are also discussed.
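For reference, the Amdahl's-law speed-up estimate mentioned above takes the standard textbook form (not a formula specific to this paper):

\[ S(N) = \frac{1}{(1 - p) + p/N} \]

where p is the parallelizable fraction of the workload (here, the fine-scale sub-model computations) and N the number of processors; the attainable speed-up saturates at 1/(1 - p) as N grows.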
Residual Strength Analysis Methodology: Laboratory Coupons to Structural Components
NASA Technical Reports Server (NTRS)
Dawicke, D. S.; Newman, J. C., Jr.; Starnes, J. H., Jr.; Rose, C. A.; Young, R. D.; Seshadri, B. R.
2000-01-01
The NASA Aircraft Structural Integrity Program (NASIP) and Airframe Airworthiness Assurance/Aging Aircraft (AAA/AA) Programs have developed a residual strength prediction methodology for aircraft fuselage structures. This methodology has been experimentally verified for structures ranging from laboratory coupons up to full-scale structural components. The methodology uses the critical crack tip opening angle (CTOA) fracture criterion to characterize the fracture behavior and a material and geometric nonlinear finite element shell analysis code to perform the structural analyses. The present paper presents the results of a study to evaluate the fracture behavior of 2024-T3 aluminum alloys with thicknesses of 0.04 inches to 0.09 inches. The critical CTOA and the corresponding plane strain core height necessary to simulate through-the-thickness effects at the crack tip in an otherwise plane stress analysis were determined from small laboratory specimens. Using these parameters, the CTOA fracture criterion was used to predict the behavior of middle crack tension specimens that were up to 40 inches wide, flat panels with riveted stiffeners and multiple-site damage cracks, 18-inch diameter pressurized cylinders, and full-scale curved stiffened panels subjected to internal pressure and mechanical loads.
NASA Astrophysics Data System (ADS)
Fraser, R.; Coulaud, M.; Aeschlimann, V.; Lemay, J.; Deschenes, C.
2016-11-01
With the growing proportion of intermittent energy sources such as wind and solar, hydroelectricity is becoming a first-class source of peaking energy for regulating the grid. The resulting increase in start-stop cycles may cause premature ageing of runners, both through a higher number of stress-fluctuation cycles and through higher absolute stress levels. Aiming to sustain good-quality development on fully homologous scale-model turbines, the Hydraulic Machines Laboratory (LAMH) of Laval University has developed a methodology to operate model-size turbines in transient regimes such as start-up, stop or load rejection on its test stand. This methodology allows a constant head to be maintained while the wicket gates are opening or closing at a speed that is representative, at model scale, of what is done on the prototype. This paper first presents the model opening speed based on dimensionless numbers, then the methodology itself and its application. Both its limitations and the first results obtained with a bulb turbine are then detailed.
Selected methods for quantification of community exposure to aircraft noise
NASA Technical Reports Server (NTRS)
Edge, P. M., Jr.; Cawthorn, J. M.
1976-01-01
A review of the state of the art in the quantification of community exposure to aircraft noise is presented. Physical aspects, people-response considerations, and practicalities of the useful application of scales of measure are included. Historical background up through the current technology is briefly presented. The developments of both single-event and multiple-event scales are covered. A selective choice is made of scales currently at the forefront of interest, and a recommended methodology is presented for use in computer programming to translate aircraft noise data into predictions of community noise exposure. Brief consideration is given to future programming developments and to supportive research needs.
Pitkänen, Janne; Nieminen, Marko
2017-01-01
Participation of healthcare professionals in information technology development has emerged as an important challenge. As end-users, the professionals are willing to participate in development activities, but their experiences with the current methods of participation remain mostly negative. There is a lack of applicable methods that meet the needs of the agile development approach and scale up to the largest implementation projects, while maintaining the interest of the professional users in participating in development activities and preserving their ability to continue working productively. In this paper, we describe Agile Instrumented Monitoring (AIM), a methodology based on the methods of instrumented usability evaluation, for improving user experience in HealthIT development. The contribution of the proposed methodology is analyzed in relation to the activities of the whole iteration cycle and to chosen usability evaluation methods, while the user experience of participation is addressed with regard to healthcare professionals. Prospective weak and strong market tests for AIM are discussed in the conclusions as future work.
The U.S. Environmental Protection Agency (EPA), in collaboration with the National Marine Fisheries Service and the U.S. Fish and Wildlife Service is currently developing a methodology to assess the risks of pesticides to federally-listed threatened and endangered species. In thi...
Large Composite Structures Processing Technologies for Reusable Launch Vehicles
NASA Technical Reports Server (NTRS)
Clinton, R. G., Jr.; Vickers, J. H.; McMahon, W. M.; Hulcher, A. B.; Johnston, N. J.; Cano, R. J.; Belvin, H. L.; McIver, K.; Franklin, W.; Sidwell, D.
2001-01-01
Significant efforts have been devoted to establishing the technology foundation needed to enable the progression to large-scale composite structures fabrication. We are not capable today of fabricating many of the composite structures envisioned for the second-generation reusable launch vehicle (RLV). Conventional 'aerospace' manufacturing and processing methodologies (fiber placement, autoclave, tooling) will require substantial investment and lead time to scale up. Out-of-autoclave process techniques will require aggressive efforts to mature the selected technologies and to scale up. Focused composite processing technology development and demonstration programs utilizing the building-block approach are required to enable the envisioned second-generation RLV large composite structures applications. Government/industry partnerships have demonstrated success in this area and represent the best combination of skills and capabilities to achieve this goal.
Edgren, Gustaf; Hjalgrim, Henrik
2010-11-01
At current safety levels, with adverse events from transfusions being relatively rare, further progress in risk reductions will require large-scale investigations. Thus, truly prospective studies may prove unfeasible and other alternatives deserve consideration. In this review, we will try to give an overview of recent and historical developments in the use of blood donation and transfusion databases in research. In addition, we will go over important methodological issues. There are at least three nationwide or near-nationwide donation/transfusion databases with the possibility for long-term follow-up of donors and recipients. During the past few years, a large number of reports have been published utilizing such data sources to investigate transfusion-associated risks. In addition, numerous clinics systematically collect and use such data on a smaller scale. Combining systematically recorded donation and transfusion data with long-term health follow-up opens up exciting opportunities for transfusion medicine research. However, the correct analysis of such data requires close attention to methodological issues, especially including the indication for transfusion and reverse causality.
Experimental cocrystal screening and solution based scale-up cocrystallization methods.
Malamatari, Maria; Ross, Steven A; Douroumis, Dennis; Velaga, Sitaram P
2017-08-01
Cocrystals are crystalline single-phase materials composed of two or more different molecular and/or ionic compounds, generally in a stoichiometric ratio, which are neither solvates nor simple salts. If one of the components is an active pharmaceutical ingredient (API), the term pharmaceutical cocrystal is often used. There is growing interest among drug development scientists in exploring cocrystals as a means to address physicochemical, biopharmaceutical and mechanical properties and to expand the solid form diversity of the API. Conventionally, coformers are selected based on crystal engineering principles, and equimolar mixtures of API and coformer are subjected to the solution-based crystallization methods commonly employed in polymorph and salt screening. However, the availability of new knowledge on cocrystal phase behaviour in the solid state and in solution has spurred the development and implementation of more rational experimental cocrystal screening and scale-up methods. This review aims to provide an overview of commonly employed solid form screening techniques in drug development, with an emphasis on cocrystal screening methodologies. The latest developments in understanding and using cocrystal phase diagrams in both screening and solution-based scale-up methods are also presented. The final section is devoted to reviewing the state-of-the-art research covering solution-based scale-up cocrystallization processes for different cocrystals, as well as more recent continuous crystallization methods. Copyright © 2017 Elsevier B.V. All rights reserved.
Projection-Based Reduced Order Modeling for Spacecraft Thermal Analysis
NASA Technical Reports Server (NTRS)
Qian, Jing; Wang, Yi; Song, Hongjun; Pant, Kapil; Peabody, Hume; Ku, Jentung; Butler, Charles D.
2015-01-01
This paper presents a mathematically rigorous, subspace projection-based reduced order modeling (ROM) methodology and an integrated framework to automatically generate reduced order models for spacecraft thermal analysis. Two key steps in the reduced order modeling procedure are described: (1) the acquisition of a full-scale spacecraft model in the ordinary differential equation (ODE) and differential algebraic equation (DAE) form to resolve its dynamic thermal behavior; and (2) the ROM to markedly reduce the dimension of the full-scale model. Specifically, proper orthogonal decomposition (POD) in conjunction with discrete empirical interpolation method (DEIM) and trajectory piece-wise linear (TPWL) methods are developed to address the strong nonlinear thermal effects due to coupled conductive and radiative heat transfer in the spacecraft environment. Case studies using NASA-relevant satellite models are undertaken to verify the capability and to assess the computational performance of the ROM technique in terms of speed-up and error relative to the full-scale model. ROM exhibits excellent agreement in spatiotemporal thermal profiles (<0.5% relative error in pertinent time scales) along with salient computational acceleration (up to two orders of magnitude speed-up) over the full-scale analysis. These findings establish the feasibility of ROM to perform rational and computationally affordable thermal analysis, develop reliable thermal control strategies for spacecraft, and greatly reduce the development cycle times and costs.
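A minimal sketch of the POD step described above: build a reduced basis from the SVD of a snapshot matrix and Galerkin-project a linear system onto it. The matrices below are random placeholders, not the spacecraft thermal model:

```python
import numpy as np

n, m, r = 2000, 200, 10            # full dimension, number of snapshots, reduced dimension
X = np.random.rand(n, m)           # snapshot matrix (placeholder for stored ODE states)
U, s, _ = np.linalg.svd(X, full_matrices=False)
Phi = U[:, :r]                     # POD basis: leading r left singular vectors

A = -np.eye(n)                     # placeholder linear operator in dT/dt = A T + b
b = np.random.rand(n)
A_r = Phi.T @ A @ Phi              # reduced (r x r) operator via Galerkin projection
b_r = Phi.T @ b
# The reduced state q evolves as dq/dt = A_r q + b_r, and T is recovered as T ~ Phi @ q.
# DEIM and TPWL extend this idea to the nonlinear (radiative) terms.
```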
Estimating unbiased economies of scale of HIV prevention projects: a case study of Avahan.
Lépine, Aurélia; Vassall, Anna; Chandrashekar, Sudha; Blanc, Elodie; Le Nestour, Alexis
2015-04-01
Governments and donors are investing considerable resources in HIV prevention in order to scale up these services rapidly. Given the current economic climate, providers of HIV prevention services increasingly need to demonstrate that these investments offer good 'value for money'. One of the primary routes to efficiency is to take advantage of economies of scale (a reduction in the average cost of a health service as provision scales up), yet empirical evidence on economies of scale is scarce. Methodologically, the estimation of economies of scale is hampered by several statistical issues that prevent causal inference and thus make the estimation complex. In order to estimate unbiased economies of scale when scaling up HIV prevention services, we apply our analysis to one of the few HIV prevention programmes delivered globally at a large scale: the Indian Avahan initiative. We costed the project by collecting data from the 138 Avahan NGOs and the supporting partners in the first four years of its scale-up, between 2004 and 2007. We develop a parsimonious empirical model and apply a system Generalized Method of Moments (GMM) estimator and fixed-effects Instrumental Variable (IV) estimators to estimate unbiased economies of scale. At the programme level, we find that, after controlling for the endogeneity of scale, the scale-up of Avahan has generated high economies of scale. Our findings suggest that reductions in the average cost per person reached are achievable when scaling up HIV prevention in low- and middle-income countries. Copyright © 2015 Elsevier Ltd. All rights reserved.
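In such analyses, economies of scale are usually read off a log-linear average-cost regression of the general form below; this is an illustrative specification, not necessarily the exact Avahan model:

\[ \ln AC_{it} = \beta_0 + \beta_1 \ln N_{it} + \boldsymbol{\gamma}'\mathbf{x}_{it} + u_i + \varepsilon_{it} \]

where AC_{it} is the average cost per person reached by NGO i in year t, N_{it} the scale (number of persons reached), x_{it} a vector of covariates and u_i an NGO effect; beta_1 < 0 indicates economies of scale, and the system GMM and fixed-effects IV estimators address the endogeneity of N_{it}.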
Design-Based School Improvement: A Practical Guide for Education Leaders
ERIC Educational Resources Information Center
Mintrop, Rick
2016-01-01
At the heart of the effort to enact and scale up successful school reforms is the need for more robust links between research and practice. One promising approach is design development, a methodology widely used in other fields and only recently adapted to education, which offers a disciplined process for identifying practical problems, assessing…
Stereo particle image velocimetry set up for measurements in the wake of scaled wind turbines
NASA Astrophysics Data System (ADS)
Campanardi, Gabriele; Grassi, Donato; Zanotti, Alex; Nanos, Emmanouil M.; Campagnolo, Filippo; Croce, Alessandro; Bottasso, Carlo L.
2017-08-01
Stereo particle image velocimetry (PIV) measurements were carried out in the boundary layer test section of the Politecnico di Milano large wind tunnel to survey the wake of a scaled wind turbine model designed and developed by Technische Universität München. The stereo PIV instrumentation was set up to survey the three velocity components on cross-flow planes at different longitudinal locations. The area of investigation covered the entire extent of the wind turbine's wake, which was scanned using two separate traversing systems for the laser and the cameras. Such an instrumentation set-up made it possible to rapidly obtain high-quality results suitable for characterising the behaviour of the flow field in the wake of the scaled wind turbine. This would be very useful for evaluating the performance of wind farm control methodologies based on wake redirection and for the validation of CFD tools.
Piovesana, Adina M; Harrison, Jessica L; Ducat, Jacob J
2017-12-01
This study aimed to develop a motor-free short form of the Wechsler Intelligence Scale for Children-Fifth Edition (WISC-V) that allows clinicians to estimate the Full Scale Intelligence Quotients of youths with motor impairments. Using the reliabilities and intercorrelations of six WISC-V motor-free subtests, psychometric methodologies were applied to develop look-up tables for four Motor-free Short-form indices: Verbal Comprehension Short-form, Perceptual Reasoning Short-form, Working Memory Short-form, and a Motor-free Intelligence Quotient. Index-level discrepancy tables were developed using the same methods to allow clinicians to statistically compare visual, verbal, and working memory abilities. The short-form indices had excellent reliabilities (r = .92-.97), comparable to the original WISC-V. This motor-free short form of the WISC-V is a reliable alternative for the assessment of intellectual functioning in youths with motor impairments. Clinicians are provided with user-friendly look-up tables, index-level discrepancy tables, and base rates, displayed similarly to those in the WISC-V manuals, to enable interpretation of assessment results.
NASA Astrophysics Data System (ADS)
Wayson, Michael B.; Bolch, Wesley E.
2018-04-01
Various computational tools are currently available that facilitate patient organ dosimetry in diagnostic nuclear medicine, yet they are typically restricted to reporting organ doses to ICRP-defined reference phantoms. The present study, while remaining computational phantom based, provides straightforward tools to adjust reference phantom organ dose for both internal photon and electron sources. A wide variety of monoenergetic specific absorbed fractions were computed using radiation transport simulations for tissue spheres of varying size and separation distance. Scaling methods were then constructed for both photon and electron self-dose and cross-dose, with data validation provided from patient-specific voxel phantom simulations, as well as via comparison to the scaling methodology given in MIRD Pamphlet No. 11. Photon and electron self-dose was found to be dependent on both radiation energy and sphere size. Photon cross-dose was found to be mostly independent of sphere size. Electron cross-dose was found to be dependent on sphere size when the spheres were in close proximity, owing to differences in electron range. The validation studies showed that this dataset was more effective than the MIRD 11 method at predicting patient-specific photon doses at both high and low energies, but gave similar results at photon energies between 100 keV and 1 MeV. The MIRD 11 method for electron self-dose scaling was accurate for lower energies but began to break down at higher energies. The photon cross-dose scaling methodology developed in this study showed gains in accuracy of up to 9% for actual patient studies, and the electron cross-dose scaling methodology showed gains in accuracy up to 9% as well when only the bremsstrahlung component of the cross-dose was scaled. These dose scaling methods are readily available for incorporation into internal dosimetry software for diagnostic phantom-based organ dosimetry.
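For orientation, the first-order mass scalings that such sphere-based methods refine are the classical MIRD-style rules (approximate relations, not the paper's fitted results):

\[ S_{\mathrm{self}}^{\mathrm{photon}}(m) \approx S_{\mathrm{ref}} \left(\frac{m_{\mathrm{ref}}}{m}\right)^{2/3}, \qquad S_{\mathrm{self}}^{\mathrm{electron}}(m) \approx S_{\mathrm{ref}} \left(\frac{m_{\mathrm{ref}}}{m}\right) \]

where S is the self-dose per unit administered activity and m the target mass; as noted above, the simple electron rule is what begins to break down at higher energies, which the energy-dependent sphere data are intended to correct.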
Modeling and Characterization of Near-Crack-Tip Plasticity from Micro- to Nano-Scales
NASA Technical Reports Server (NTRS)
Glaessgen, Edward H.; Saether, Erik; Hochhalter, Jacob; Smith, Stephen W.; Ransom, Jonathan B.; Yamakov, Vesselin; Gupta, Vipul
2010-01-01
Methodologies for understanding the plastic deformation mechanisms related to crack propagation at the nano-, meso- and micro-length scales are being developed. These efforts include the development and application of several computational methods including atomistic simulation, discrete dislocation plasticity, strain gradient plasticity and crystal plasticity; and experimental methods including electron backscattered diffraction and video image correlation. Additionally, methodologies for multi-scale modeling and characterization that can be used to bridge the relevant length scales from nanometers to millimeters are being developed. The paper focuses on the discussion of newly developed methodologies in these areas and their application to understanding damage processes in aluminum and its alloys.
Modeling and Characterization of Near-Crack-Tip Plasticity from Micro- to Nano-Scales
NASA Technical Reports Server (NTRS)
Glaessgen, Edward H.; Saether, Erik; Hochhalter, Jacob; Smith, Stephen W.; Ransom, Jonathan B.; Yamakov, Vesselin; Gupta, Vipul
2011-01-01
Methodologies for understanding the plastic deformation mechanisms related to crack propagation at the nano-, meso- and micro-length scales are being developed. These efforts include the development and application of several computational methods including atomistic simulation, discrete dislocation plasticity, strain gradient plasticity and crystal plasticity; and experimental methods including electron backscattered diffraction and video image correlation. Additionally, methodologies for multi-scale modeling and characterization that can be used to bridge the relevant length scales from nanometers to millimeters are being developed. The paper focuses on the discussion of newly developed methodologies in these areas and their application to understanding damage processes in aluminum and its alloys.
Developing scale for colleague solidarity among nurses in Turkey.
Uslusoy, Esin Cetinkaya; Alpar, Sule Ecevit
2013-02-01
There is a need for an appropriate instrument to measure colleague solidarity among nurses. This study was carried out to develop a Colleague Solidarity of Nurses' Scale (CSNS). The study was descriptive and methodological in design. The CSNS was examined for content validity, construct validity, test-retest reliability and internal consistency reliability. The trial form of the CSNS, which was composed of 44 items, was given to 200 nurses, followed by validity and reliability analyses. Following the analyses, 21 items were excluded from the scale, leaving an attitude scale made up of 23 items. Factor analysis of the data showed that the scale has a three-subfactor structure: emotional solidarity, academic solidarity and negative opinions about solidarity. The Cronbach's alpha reliability of the whole scale was 0.80. This study provides evidence that the CSNS is a reliable and valid instrument for measuring colleague solidarity among nurses. © 2013 Wiley Publishing Asia Pty Ltd.
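The internal-consistency figure quoted above is the standard Cronbach's alpha,

\[ \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_{Y_i}^{2}}{\sigma_{X}^{2}}\right) \]

where k is the number of items (23 for the final CSNS), sigma_{Y_i}^2 the variance of item i and sigma_X^2 the variance of the total score; the reported value for the whole scale was 0.80.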
A multidisciplinary approach to the development of low-cost high-performance lightwave networks
NASA Technical Reports Server (NTRS)
Maitan, Jacek; Harwit, Alex
1991-01-01
Our research focuses on high-speed distributed systems. We anticipate that our results will allow the fabrication of low-cost networks employing multi-gigabit-per-second data links for space and military applications. The recent development of high-speed, low-cost photonic components and new generations of microprocessors creates an opportunity to develop advanced large-scale distributed information systems. These systems currently involve hundreds of thousands of nodes and are made up of components and communication links that may fail during operation. In order to realize these systems, research is needed into technologies that foster adaptability and scalability. Self-organizing mechanisms are needed to integrate a working fabric of large-scale distributed systems. The challenge is to fuse theory, technology, and development methodologies to construct a cost-effective, efficient, large-scale system.
High-frequency measurements of aeolian saltation flux: Field-based methodology and applications
NASA Astrophysics Data System (ADS)
Martin, Raleigh L.; Kok, Jasper F.; Hugenholtz, Chris H.; Barchyn, Thomas E.; Chamecki, Marcelo; Ellis, Jean T.
2018-02-01
Aeolian transport of sand and dust is driven by turbulent winds that fluctuate over a broad range of temporal and spatial scales. However, commonly used aeolian transport models do not explicitly account for such fluctuations, likely contributing to substantial discrepancies between models and measurements. Underlying this problem is the absence of accurate sand flux measurements at the short time scales at which wind speed fluctuates. Here, we draw on extensive field measurements of aeolian saltation to develop a methodology for generating high-frequency (up to 25 Hz) time series of total (vertically-integrated) saltation flux, namely by calibrating high-frequency (HF) particle counts to low-frequency (LF) flux measurements. The methodology follows four steps: (1) fit exponential curves to vertical profiles of saltation flux from LF saltation traps, (2) determine empirical calibration factors through comparison of LF exponential fits to HF number counts over concurrent time intervals, (3) apply these calibration factors to subsamples of the saltation count time series to obtain HF height-specific saltation fluxes, and (4) aggregate the calibrated HF height-specific saltation fluxes into estimates of total saltation fluxes. When coupled to high-frequency measurements of wind velocity, this methodology offers new opportunities for understanding how aeolian saltation dynamics respond to variability in driving winds over time scales from tens of milliseconds to days.
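A minimal sketch of calibration steps (1)-(3) above, with illustrative variable names and made-up numbers (the field workflow includes additional quality control not shown here):

```python
import numpy as np
from scipy.optimize import curve_fit

# (1) Fit an exponential profile q(z) = q0 * exp(-z / zq) to low-frequency (LF) trap fluxes.
def profile(z, q0, zq):
    return q0 * np.exp(-z / zq)

z_traps = np.array([0.05, 0.10, 0.20, 0.35])    # trap heights, m (illustrative)
q_traps = np.array([12.0, 7.5, 3.0, 0.9])       # LF height-specific fluxes, g m^-2 s^-1
(q0, zq), _ = curve_fit(profile, z_traps, q_traps, p0=[15.0, 0.1])

# (2) Calibration factor: LF flux at the counter height per HF count rate over the same interval.
z_sensor = 0.08                                 # HF particle counter height, m (illustrative)
counts_per_s = 240.0                            # mean HF count rate during the LF interval
cal = profile(z_sensor, q0, zq) / counts_per_s  # flux per count

# (3) Apply the factor to the HF count series to get HF height-specific fluxes ...
hf_counts = np.array([5, 9, 0, 14, 7])          # counts per 0.04 s window (25 Hz)
q_hf = cal * hf_counts / 0.04

# (4) ... and aggregate to total (vertically integrated) flux via the fitted profile shape.
Q_hf = q_hf * zq * np.exp(z_sensor / zq)
```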
Future in biomolecular computation
NASA Astrophysics Data System (ADS)
Wimmer, E.
1988-01-01
Large-scale computations for biomolecules are dominated by three levels of theory: rigorous quantum mechanical calculations for molecules with up to about 30 atoms, semi-empirical quantum mechanical calculations for systems with up to several hundred atoms, and force-field molecular dynamics studies of biomacromolecules with 10,000 atoms and more, including surrounding solvent molecules. It can be anticipated that increased computational power will allow the treatment of larger systems of ever-growing complexity. Due to the scaling of the computational requirements with increasing number of atoms, the force-field approaches will benefit the most from increased computational power. On the other hand, progress in methodologies such as density functional theory will enable us to treat larger systems on a fully quantum mechanical level, and a combination of molecular dynamics and quantum mechanics can be envisioned. One of the greatest challenges in biomolecular computation is the protein folding problem. It is unclear at this point whether an approach with current methodologies will lead to a satisfactory answer or whether unconventional, new approaches will be necessary. In any event, due to the complexity of biomolecular systems, a hierarchy of approaches will have to be established and used in order to capture the wide ranges of length scales and time scales involved in biological processes. In terms of hardware development, the speed and power of computers will increase while the price/performance ratio becomes more and more favorable. Parallelism can be anticipated to become an integral architectural feature in a range of computers. It is unclear at this point how quickly massively parallel systems will become easy enough to use that new methodological developments can be pursued on such computers. Current trends show that distributed processing, such as the combination of convenient graphics workstations and powerful general-purpose supercomputers, will lead to a new style of computing in which calculations are monitored and manipulated as they proceed. The combination of a numeric approach with artificial-intelligence approaches can be expected to open up entirely new possibilities. Ultimately, the most exciting aspect of the future in biomolecular computing will be the unexpected discoveries.
Sustainability impact assessment to improve food security of smallholders in Tanzania
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schindler, Jana, E-mail: jana.schindler@zalf.de; Humboldt Universität zu Berlin, Faculty of Agriculture and Horticulture, Invalidenstr. 42, 10099 Berlin; Graef, Frieder, E-mail: graef@zalf.de
The objective of this paper was to assess the sustainability impacts of planned agricultural development interventions, so-called upgrading strategies (UPS), to enhance food security and to identify what advantages and risks are perceived from the farmers' point of view with regard to social life, the economy and the environment. We developed a participatory methodological procedure that links food security and sustainable development. Farmers in four different case study villages in rural Tanzania chose their priority UPS. For these UPS, they assessed the impacts on locally relevant food security criteria. The positive impacts identified were mainly attributed to increased agricultural production and its related positive effects, such as increased income and improved access to the means necessary to diversify the diet. However, farmers also indicated several risks of certain UPS, such as increased workload, high maintenance costs, higher competition among farmers, loss of traditional knowledge and social conflicts. We discussed the strong interdependence of socio-economic and environmental criteria in improving food security for small-scale farmers and analysed several trade-offs with regard to UPS choices and food security criteria. We also identified and discussed the advantages and challenges of our methodological approach. In conclusion, the participatory impact assessment at the farmer level allowed a locally specific analysis of the various positive and negative impacts of UPS on social life, the economy and the environment. We emphasize that only a development approach that considers social, economic and environmental challenges simultaneously can enhance food security.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biswas, Pratim; Al-Dahhan, Muthanna
2012-11-01
Tri-isotropic (TRISO) fuel particle coating is critical for the future use of nuclear energy produced by advanced gas reactors (AGRs). The fuel kernels are coated using chemical vapor deposition in a spouted fluidized bed. The challenges encountered in operating TRISO fuel coaters are due to the fact that in modern AGRs, such as High Temperature Gas Reactors (HTGRs), the acceptable level of defective/failed coated particles is essentially zero. This specification requires processes that produce coated spherical particles with even coatings having extremely low defect fractions. Unfortunately, the scale-up and design of the current processes and coaters have been based on empirical approaches and are operated as black boxes. Hence, a voluminous amount of experimental development and trial-and-error work has been conducted. It has been clearly demonstrated that the quality of the coating applied to the fuel kernels is impacted by the hydrodynamics, solids flow field, and flow regime characteristics of the spouted bed coaters, which themselves are influenced by design parameters and operating variables. Further complicating the outlook for future fuel-coating technology and nuclear energy production is the fact that a variety of new concepts will involve fuel kernels of different sizes and with compositions of different densities. Therefore, without a fundamental understanding of the underlying phenomena of the spouted bed TRISO coater, a significant amount of effort is required for the production of each type of particle, with a significant risk of not meeting the specifications. This difficulty will significantly and negatively impact the applications of AGRs for power generation and cause further challenges to them as an alternative source of commercial energy production. Accordingly, the proposed work seeks to overcome such hurdles and advance the scale-up, design, and performance of TRISO fuel particle spouted bed coaters. The overall objectives of the proposed work are to advance the fundamental understanding of the hydrodynamics by systematically investigating the effect of design and operating variables, to evaluate the reported dimensionless groups as scaling factors, and to establish a reliable scale-up methodology for TRISO fuel particle spouted bed coaters based on hydrodynamic similarity via advanced measurement and computational techniques. An additional objective is to develop an on-line, non-invasive measurement technique based on gamma ray densitometry (i.e. Nuclear Gauge Densitometry) that can be installed and used for coater process monitoring to ensure proper performance and operation and to facilitate the developed scale-up methodology. To achieve the objectives set for the project, the work will use optical probes and gamma ray computed tomography (CT) (for the measurements of solids/voidage holdup cross-sectional distribution and radial profiles along the bed height, spout diameter, and fountain height) and radioactive particle tracking (RPT) (for the measurements of the 3D solids flow field, velocity, turbulent parameters, circulation time, solids Lagrangian trajectories, and many other spouted-bed-related hydrodynamic parameters). In addition, gas dynamic measurement techniques and pressure transducers will be utilized to complement the obtained information.
The measurements obtained by these techniques will be used as benchmark data to evaluate and validate the computational fluid dynamic (CFD) models (two fluid model or discrete particle model) and their closures. The validated CFD models and closures will be used to facilitate the developed methodology for scale-up, design and hydrodynamic similarity. Successful execution of this work and the proposed tasks will advance the fundamental understanding of the coater flow field and quantify it for proper and safe design, scale-up, and performance. Such achievements will overcome the barriers to AGR applications and will help assure that the US maintains nuclear energy as a feasible option to meet the nation's needs for energy and environmental safety. In addition, the outcome of the proposed study will have a broader impact on other processes that utilize spouted beds, such as coal gasification, granulation, drying, catalytic reactions, etc.
Criticality Characteristics of Current Oil Price Dynamics
NASA Astrophysics Data System (ADS)
Drożdż, S.; Kwapień, J.; Oświęcimka, P.
2008-10-01
The methodology that recently led us to predict, with amazing accuracy, the date (July 11, 2008) of the reversal of the oil price uptrend is briefly summarized, and some further aspects of the related oil price dynamics are elaborated. This methodology is based on the concept of discrete scale invariance, whose finance-prediction-oriented variant involves such elements as log-periodic self-similarity and the universal preferred scaling factor λ≈2, and allows for the phenomenon of the "super-bubble". From this perspective, the present (as of August 22, 2008) violent - but still log-periodically decelerating - decrease of oil prices is associated with the decay of such a "super-bubble", which started developing about one year earlier on top of the longer-term oil price increasing phase (normal bubble), whose ultimate termination is estimated to occur around mid-2010.
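The log-periodic pattern invoked here is conventionally parameterized by a log-periodic power law of the generic form (a standard expression of discrete scale invariance, not the authors' exact fit):

\[ p(t) \approx A + B\,(t_c - t)^{m}\left[1 + C\cos\left(\omega \ln(t_c - t) + \phi\right)\right], \qquad \omega = \frac{2\pi}{\ln \lambda} \]

where t_c is the critical (reversal) time and λ ≈ 2 is the preferred scaling factor mentioned in the abstract.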
A novel method to scale up fungal endophyte isolations
USDA-ARS?s Scientific Manuscript database
Estimations of species diversity are influenced by sampling intensity which in turn is influenced by methodology. For fungal endophyte diversity studies, the methodology includes surface-sterilization prior to isolation of endophytes. Surface-sterilization is an essential component of fungal endophy...
Evaluating Mission Drift in Microfinance: Lessons for Programs with Social Mission
ERIC Educational Resources Information Center
Hishigsuren, Gaamaa
2007-01-01
The article contributes to a better understanding of implications of scaling up on the social mission of microfinance programs. It proposes a methodology to measure the extent, if any, to which a microfinance program with a poverty alleviation mission drifts away from its mission during rapid scaling up and presents findings from a field research…
NASA Technical Reports Server (NTRS)
Vangenderen, J. L. (Principal Investigator); Lock, B. F.
1976-01-01
The author has identified the following significant results. This research program has developed a viable methodology for producing small scale rural land use maps in semi-arid developing countries using imagery obtained from orbital multispectral scanners.
ERIC Educational Resources Information Center
Yeager, David S.; Romero, Carissa; Paunesku, Dave; Hulleman, Christopher S.; Schneider, Barbara; Hinojosa, Cintia; Lee, Hae Yeon; O'Brien, Joseph; Flint, Kate; Roberts, Alice; Trott, Jill; Greene, Daniel; Walton, Gregory M.; Dweck, Carol S.
2016-01-01
There are many promising psychological interventions on the horizon, but there is no clear methodology for preparing them to be scaled up. Drawing on design thinking, the present research formalizes a methodology for redesigning and tailoring initial interventions. We test the methodology using the case of fixed versus growth mindsets during the…
Costa, Fernanda das Neves; Vieira, Mariana Neves; Garrard, Ian; Hewitson, Peter; Jerz, Gerold; Leitão, Gilda Guimarães; Ignatova, Svetlana
2016-09-30
Countercurrent chromatography (CCC) is widely used across the world for the purification of various materials, especially in natural product research. The predictability of CCC scale-up has been successfully demonstrated using specially designed instruments from the same manufacturer. The reality is that most CCC users do not have access to such instruments and do not have enough experience to transfer methods from one CCC column to another. This unique study by three international teams is based on an innovative approach to simplify scale-up between different CCC machines, using fractionation of a Schinus terebinthifolius berries dichloromethane extract as a case study. The optimized separation methodology, recently developed by the authors (Part I), was repeatedly performed on CCC columns of different designs available at most research laboratories across the world. Hexane - ethyl acetate - methanol - water (6:1:6:1, v/v/v/v) was used as the solvent system, with masticadienonic and 3β-masticadienolic acids as target compounds to monitor stationary phase retention and calculate peak resolution. It has been demonstrated that volumetric, linear and length scale-up transfer factors based on column characteristics can be directly applied to columns of different i.d., volume and length, independently of instrument make, in intra-apparatus scale-up and inter-apparatus method transfer. Copyright © 2016 Elsevier B.V. All rights reserved.
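A sketch of how such column-based transfer factors are typically applied when moving a CCC method between instruments; the column volumes, flow rate and sample load are made-up values, and the simple proportional rules are an assumption rather than the authors' exact procedure:

```python
# Illustrative CCC method transfer between two coil columns.
V_source, V_target = 134.0, 912.0       # column volumes, mL (made-up values)
F_vol = V_target / V_source             # volumetric transfer factor

flow_source = 2.0                       # mobile phase flow rate on the source column, mL/min
load_source = 0.5                       # extract loaded on the source column, g

# Simple volumetric scale-up of flow rate and sample load.
flow_target = flow_source * F_vol
load_target = load_source * F_vol
print(f"target flow ~ {flow_target:.1f} mL/min, target load ~ {load_target:.1f} g")
```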
Direct measurements of local bed shear stress in the presence of pressure gradients
NASA Astrophysics Data System (ADS)
Pujara, Nimish; Liu, Philip L.-F.
2014-07-01
This paper describes the development of a shear plate sensor capable of directly measuring the local mean bed shear stress in small-scale and large-scale laboratory flumes. The sensor is capable of measuring bed shear stress in the range of 200 Pa with an accuracy of up to 1%. Its size, 43 mm in the flow direction, is designed to be small enough to give spatially local measurements, and its bandwidth, 75 Hz, is high enough to resolve time-varying forcing. Typically, shear plate sensors are restricted to use in zero-pressure-gradient flows because secondary forces on the edge of the shear plate caused by pressure gradients can introduce large errors. However, by analysing the pressure distribution at the edges of the shear plate in mild pressure gradients, we introduce a new methodology for correcting for the pressure gradient force. The developed sensor includes pressure tappings to measure the pressure gradient in the flow, and the methodology for correction is applied to obtain accurate measurements of bed shear stress under solitary waves in a small-scale wave flume. The sensor is also validated by measurements in a turbulent flat-plate boundary layer in open channel flow.
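Conceptually, the correction removes the net force that the streamwise pressure gradient exerts on the exposed plate edges. In simplified form (an assumed expression; the paper derives the correction from the measured edge pressure distribution):

\[ \tau_b \approx \frac{1}{l\,w}\left(F_{\mathrm{meas}} - \frac{\partial p}{\partial x}\, l\, h\, w\right) \]

where l and w are the streamwise length and width of the plate, h the exposed edge height, F_meas the total measured streamwise force, and (∂p/∂x)·l the pressure difference acting on the edge area h·w.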
Object-oriented analysis and design: a methodology for modeling the computer-based patient record.
Egyhazy, C J; Eyestone, S M; Martino, J; Hodgson, C L
1998-08-01
The article highlights the importance of an object-oriented analysis and design (OOAD) methodology for the computer-based patient record (CPR) in the military environment. Many OOAD methodologies do not adequately scale up, allow for efficient reuse of their products, or accommodate legacy systems. A methodology that addresses these issues is formulated and used to demonstrate its applicability in a large-scale health care service system. During a period of 6 months, a team of object modelers and domain experts formulated an OOAD methodology tailored to the Department of Defense Military Health System and used it to produce components of an object model for simple order processing. This methodology and the lessons learned during its implementation are described. This approach is necessary to achieve broad interoperability among heterogeneous automated information systems.
Estimating Agricultural Nitrous Oxide Emissions
USDA-ARS?s Scientific Manuscript database
Nitrous oxide emissions are highly variable in space and time and different methodologies have not agreed closely, especially at small scales. However, as scale increases, so does the agreement between estimates based on soil surface measurements (bottom up approach) and estimates derived from chang...
Analysis of World Economic Variables Using Multidimensional Scaling
Machado, J.A. Tenreiro; Mata, Maria Eugénia
2015-01-01
Waves of globalization reflect historical technical progress and modern economic growth. The dynamics of this process are approached here using the multidimensional scaling (MDS) methodology to analyze the evolution of GDP per capita, international trade openness, life expectancy, and tertiary education enrollment in 14 countries. MDS provides the appropriate theoretical concepts and the exact mathematical tools to describe the joint evolution of these indicators of economic growth, globalization, welfare and human development of the world economy from 1977 up to 2012. The polarization dance of countries illuminates the convergence paths, potential warfare and present-day rivalries in the global geopolitical scene. PMID:25811177
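A minimal sketch of the MDS step on a country-by-indicator panel; the data matrix is a random placeholder for the normalized GDP, trade-openness, life-expectancy and enrollment series, and scikit-learn's MDS is used purely for illustration, not necessarily the authors' implementation:

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

# 14 countries x 4 normalized indicators for a given year (placeholder values).
X = np.random.rand(14, 4)
D = squareform(pdist(X, metric="euclidean"))   # country-to-country dissimilarities

# Embed the 14 countries in 2-D so that map distances approximate D.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(D)                  # one 2-D point per country (one frame of the "dance")
```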
Tate, Robyn L; McDonald, Skye; Perdices, Michael; Togher, Leanne; Schultz, Regina; Savage, Sharon
2008-08-01
Rating scales that assess methodological quality of clinical trials provide a means to critically appraise the literature. Scales are currently available to rate randomised and non-randomised controlled trials, but there are none that assess single-subject designs. The Single-Case Experimental Design (SCED) Scale was developed for this purpose and evaluated for reliability. Six clinical researchers who were trained and experienced in rating methodological quality of clinical trials developed the scale and participated in reliability studies. The SCED Scale is an 11-item rating scale for single-subject designs, of which 10 items are used to assess methodological quality and use of statistical analysis. The scale was developed and refined over a 3-year period. Content validity was addressed by identifying items to reduce the main sources of bias in single-case methodology as stipulated by authorities in the field, which were empirically tested against 85 published reports. Inter-rater reliability was assessed using a random sample of 20/312 single-subject reports archived in the Psychological Database of Brain Impairment Treatment Efficacy (PsycBITE). Inter-rater reliability for the total score was excellent, both for individual raters (overall ICC = 0.84; 95% confidence interval 0.73-0.92) and for consensus ratings between pairs of raters (overall ICC = 0.88; 95% confidence interval 0.78-0.95). Item reliability was fair to excellent for consensus ratings between pairs of raters (range k = 0.48 to 1.00). The results were replicated with two independent novice raters who were trained in the use of the scale (ICC = 0.88, 95% confidence interval 0.73-0.95). The SCED Scale thus provides a brief and valid evaluation of methodological quality of single-subject designs, with the total score demonstrating excellent inter-rater reliability using both individual and consensus ratings. Items from the scale can also be used as a checklist in the design, reporting and critical appraisal of single-subject designs, thereby assisting to improve standards of single-case methodology.
ERIC Educational Resources Information Center
Burstein, Leigh
Two specific methods of analysis in large-scale evaluations are considered: structural equation modeling and selection modeling/analysis of non-equivalent control group designs. Their utility in large-scale educational program evaluation is discussed. The examination of these methodological developments indicates how people (evaluators,…
Costing the scaling-up of human resources for health: lessons from Mozambique and Guinea Bissau.
Tyrrell, Amanda K; Russo, Giuliano; Dussault, Gilles; Ferrinho, Paulo
2010-06-25
In the context of the current human resources for health (HRH) crisis, the need for comprehensive Human Resources Development Plans (HRDP) is acute, especially in resource-scarce sub-Saharan African countries. However, the financial implications of such plans rarely receive due consideration, despite the availability of much advice and examples in the literature on how to conduct HRDP costing. Global initiatives have also been launched recently to standardise costing methodologies and respective tools. This paper reports on two separate experiences of HRDP costing in Mozambique and Guinea Bissau, with the objective of providing insight into the practice of costing exercises in information-poor settings, as well as contributing to the existing debate on HRH costing methodologies. The study adopts a case-study approach to analyse the methodologies developed in the two countries, their contexts, policy processes and actors involved. From the analysis of the two cases, it emerged that the costing exercises represented an important driver of the HRDP elaboration, which lent credibility to the process, and provided a financial framework within which HRH policies could be discussed. In both cases, bottom-up and country-specific methods were designed to overcome the countries' lack of cost and financing data, as well as to interpret their financial systems. Such an approach also allowed the costing exercises to feed directly into the national planning and budgeting process. The authors conclude that bottom-up and country-specific costing methodologies have the potential to serve adequately the multi-faceted purpose of the exercise. It is recognised that standardised tools and methodologies may help reduce local governments' dependency on foreign expertise to conduct the HRDP costing and facilitate regional and international comparisons. However, adopting pre-defined and insufficiently flexible tools may undermine the credibility of the costing exercise, and reduce the space for policy negotiation opportunities within the HRDP elaboration process.
Design Evolution and Methodology for Pumpkin Super-Pressure Balloons
NASA Astrophysics Data System (ADS)
Farley, Rodger
The NASA Ultra Long Duration Balloon (ULDB) program has uncovered and solved many technical development issues on its road to success as a new vehicle. It holds the promise of being a sub-satellite, a means to launch up to 2700 kg to 33.5 km altitude for 100 days from a comfortable mid-latitude launch point. Current high-lift long duration ballooning is accomplished out of Antarctica with zero-pressure balloons, which cannot cope with the rigors of diurnal cycles. The ULDB design is still evolving, the product of intense analytical effort, scaled testing, improved manufacturing, and engineering intuition. The past technical problems, in particular the s-cleft deformation, and their solutions are described, together with future challenges and the general methodology of pumpkin balloon design.
Scaling Effects on Materials Tribology: From Macro to Micro Scale.
Stoyanov, Pantcho; Chromik, Richard R
2017-05-18
The tribological study of materials inherently involves the interaction of surface asperities at the micro to nanoscopic length scales. This is the case for large scale engineering applications with sliding contacts, where the real area of contact is made up of small contacting asperities that make up only a fraction of the apparent area of contact. This is why researchers have sought to create idealized experiments of single asperity contacts in the field of nanotribology. At the same time, small scale engineering structures known as micro- and nano-electromechanical systems (MEMS and NEMS) have been developed, where the apparent area of contact approaches the length scale of the asperities, meaning the real area of contact for these devices may be only a few asperities. This is essentially the field of microtribology, where the contact size and/or forces involved have pushed the nature of the interaction between two surfaces towards the regime where the scale of the interaction approaches that of the natural length scale of the features on the surface. This paper provides a review of microtribology with the purpose to understand how tribological processes are different at the smaller length scales compared to macrotribology. Studies of the interfacial phenomena at the macroscopic length scales (e.g., using in situ tribometry) will be discussed and correlated with new findings and methodologies at the micro-length scale.
Scaling Effects on Materials Tribology: From Macro to Micro Scale
Stoyanov, Pantcho; Chromik, Richard R.
2017-01-01
The tribological study of materials inherently involves the interaction of surface asperities at the micro to nanoscopic length scales. This is the case for large scale engineering applications with sliding contacts, where the real area of contact is made up of small contacting asperities that make up only a fraction of the apparent area of contact. This is why researchers have sought to create idealized experiments of single asperity contacts in the field of nanotribology. At the same time, small scale engineering structures known as micro- and nano-electromechanical systems (MEMS and NEMS) have been developed, where the apparent area of contact approaches the length scale of the asperities, meaning the real area of contact for these devices may be only a few asperities. This is essentially the field of microtribology, where the contact size and/or forces involved have pushed the nature of the interaction between two surfaces towards the regime where the scale of the interaction approaches that of the natural length scale of the features on the surface. This paper provides a review of microtribology with the purpose to understand how tribological processes are different at the smaller length scales compared to macrotribology. Studies of the interfacial phenomena at the macroscopic length scales (e.g., using in situ tribometry) will be discussed and correlated with new findings and methodologies at the micro-length scale. PMID:28772909
NASA Astrophysics Data System (ADS)
Kossieris, Panagiotis; Makropoulos, Christos; Onof, Christian; Koutsoyiannis, Demetris
2018-01-01
Many hydrological applications, such as flood studies, require the use of long rainfall data at fine time scales varying from daily down to 1 min time step. However, in the real world there is limited availability of data at sub-hourly scales. To cope with this issue, stochastic disaggregation techniques are typically employed to produce possible, statistically consistent, rainfall events that aggregate up to the field data collected at coarser scales. A methodology for the stochastic disaggregation of rainfall at fine time scales was recently introduced, combining the Bartlett-Lewis process to generate rainfall events along with adjusting procedures to modify the lower-level variables (i.e., hourly) so as to be consistent with the higher-level one (i.e., daily). In the present paper, we extend the aforementioned scheme, initially designed and tested for the disaggregation of daily rainfall into hourly depths, to any sub-hourly time scale. In addition, we take advantage of the recent developments in Poisson-cluster processes, incorporating into the methodology a Bartlett-Lewis model variant that introduces dependence between cell intensity and duration in order to capture the variability of rainfall at sub-hourly time scales. The disaggregation scheme is implemented in an R package, named HyetosMinute, to support disaggregation from daily down to 1-min time scale. The applicability of the methodology was assessed on 5-min rainfall records collected in Bochum, Germany, comparing the performance of the above-mentioned model variant against the original Bartlett-Lewis process (non-random with 5 parameters). The analysis shows that the disaggregation process adequately reproduces the most important statistical characteristics of rainfall at a wide range of time scales, while the introduction of the model with dependent intensity-duration results in a better performance in terms of skewness, rainfall extremes and dry proportions.
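A core step in such disaggregation schemes is forcing the synthetic fine-scale depths to aggregate exactly to the observed coarse-scale total. Below is a minimal sketch of the simplest (proportional) adjusting procedure; the Bartlett-Lewis generation itself is omitted and the `synthetic` values are placeholders, so this only illustrates the consistency step, not the full HyetosMinute algorithm.

```python
import numpy as np

def proportional_adjust(synthetic: np.ndarray, coarse_total: float) -> np.ndarray:
    """Rescale synthetic fine-scale depths so that they sum exactly to the
    observed coarse-scale depth (simplest adjusting procedure)."""
    s = synthetic.sum()
    if s == 0:
        return synthetic  # dry period: nothing to adjust
    return synthetic * (coarse_total / s)

# Hypothetical 5-min depths (mm) generated for one hour by a Poisson-cluster
# model, adjusted to match an observed hourly depth of 4.2 mm.
synthetic = np.array([0.0, 0.3, 0.9, 1.1, 0.6, 0.2, 0.0, 0.0, 0.1, 0.4, 0.2, 0.0])
adjusted = proportional_adjust(synthetic, coarse_total=4.2)
print(adjusted.sum())  # 4.2
```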
2012-01-01
Background: Nanoparticle-based delivery of anticancer drugs has been widely investigated. However, a very important process for Research & Development in any pharmaceutical industry is scaling nanoparticle formulation techniques so as to produce large batches for preclinical and clinical trials. This process is not only critical but also difficult, as it involves various formulation parameters to be modulated all in the same process. Methods: In our present study, we formulated curcumin-loaded poly (lactic acid-co-glycolic acid) nanoparticles (PLGA-CURC). This improved the bioavailability of curcumin, a potent natural anticancer drug, making it suitable for cancer therapy. Post formulation, we optimized our process by Response Surface Methodology (RSM) using Central Composite Design (CCD) and scaled up the formulation process in four stages, with the final scale-up process yielding 5 g of curcumin-loaded nanoparticles within the laboratory setup. The nanoparticles formed after the scale-up process were characterized for particle size, drug loading and encapsulation efficiency, surface morphology, in vitro release kinetics and pharmacokinetics. Stability analysis and gamma sterilization were also carried out. Results: Results revealed that the process scale-up was successfully mastered up to the 5 g level. The mean nanoparticle size of the scaled-up batch was found to be 158.5 ± 9.8 nm and the drug loading was determined to be 10.32 ± 1.4%. The in vitro release study illustrated a slow sustained release corresponding to 75% drug over a period of 10 days. The pharmacokinetic profile of PLGA-CURC in rats following i.v. administration followed a two-compartment model, with the area under the curve (AUC0-∞) being 6.139 mg/L h. Gamma sterilization showed no significant change in the particle size or drug loading of the nanoparticles. Stability analysis revealed long-term physicochemical stability of the PLGA-CURC formulation. Conclusions: A successful effort towards formulating, optimizing and scaling up PLGA-CURC by using the Solid-Oil/Water emulsion technique was demonstrated. The process used CCD-RSM for optimization and was further scaled up to produce 5 g of PLGA-CURC with almost similar physicochemical characteristics as the primary formulated batch. PMID:22937885
Ranjan, Amalendu P; Mukerjee, Anindita; Helson, Lawrence; Vishwanatha, Jamboor K
2012-08-31
Nanoparticle-based delivery of anticancer drugs has been widely investigated. However, a very important process for Research & Development in any pharmaceutical industry is scaling nanoparticle formulation techniques so as to produce large batches for preclinical and clinical trials. This process is not only critical but also difficult, as it involves various formulation parameters to be modulated all in the same process. In our present study, we formulated curcumin-loaded poly (lactic acid-co-glycolic acid) nanoparticles (PLGA-CURC). This improved the bioavailability of curcumin, a potent natural anticancer drug, making it suitable for cancer therapy. Post formulation, we optimized our process by Response Surface Methodology (RSM) using Central Composite Design (CCD) and scaled up the formulation process in four stages, with the final scale-up process yielding 5 g of curcumin-loaded nanoparticles within the laboratory setup. The nanoparticles formed after the scale-up process were characterized for particle size, drug loading and encapsulation efficiency, surface morphology, in vitro release kinetics and pharmacokinetics. Stability analysis and gamma sterilization were also carried out. Results revealed that the process scale-up was successfully mastered up to the 5 g level. The mean nanoparticle size of the scaled-up batch was found to be 158.5±9.8 nm and the drug loading was determined to be 10.32±1.4%. The in vitro release study illustrated a slow sustained release corresponding to 75% drug over a period of 10 days. The pharmacokinetic profile of PLGA-CURC in rats following i.v. administration followed a two-compartment model, with the area under the curve (AUC0-∞) being 6.139 mg/L h. Gamma sterilization showed no significant change in the particle size or drug loading of the nanoparticles. Stability analysis revealed long-term physicochemical stability of the PLGA-CURC formulation. A successful effort towards formulating, optimizing and scaling up PLGA-CURC by using the Solid-Oil/Water emulsion technique was demonstrated. The process used CCD-RSM for optimization and was further scaled up to produce 5 g of PLGA-CURC with almost similar physicochemical characteristics as the primary formulated batch.
Alfadl, Abubakr A; Ibrahim, Mohamed Izham b Mohamed; Hassali, Mohamed Azmi Ahmad
2013-09-11
Although desperate need and drug counterfeiting are linked in developing countries, little research has been carried out to address this link, and there is a lack of proper tools and methodology. This study addresses the need for a new methodological approach by developing a scale to aid in understanding the demand side of drug counterfeiting in a developing country. The study presents a quantitative, non-representative survey conducted in Sudan. A face-to-face structured interview survey methodology was employed to collect the data from the general population (people in the street) in two phases: pilot (n = 100) and final survey (n = 1003). Data were analyzed by examining means, variances, squared multiple correlations, item-to-total correlations, and the results of an exploratory factor analysis and a confirmatory factor analysis. As an approach to scale purification, internal consistency was examined and improved. The scale was reduced from 44 to 41 items and Cronbach's alpha improved from 0.818 to 0.862. Finally, scale items were assessed. The result was an eleven-factor solution. Convergent and discriminant validity were demonstrated. The results of this study indicate that the "Consumer Behavior Toward Counterfeit Drugs Scale" is a valid, reliable measure with a solid theoretical base. Ultimately, the study offers public health policymakers a valid measurement tool and, consequently, a new methodological approach with which to build a better understanding of the demand side of counterfeit drugs and to develop more effective strategies to combat the problem.
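The internal-consistency step described above (dropping weak items until Cronbach's alpha stops improving) can be sketched as follows. The item matrix is synthetic and the drop rule (remove the item whose deletion most improves alpha) is an illustrative assumption about the purification procedure, not the authors' exact protocol.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of Likert scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

def purify(items: np.ndarray, min_gain: float = 0.001) -> np.ndarray:
    """Iteratively drop the item whose removal most improves alpha."""
    cols = list(range(items.shape[1]))
    while len(cols) > 2:
        base = cronbach_alpha(items[:, cols])
        trials = [(cronbach_alpha(items[:, [c for c in cols if c != j]]), j) for j in cols]
        best_alpha, worst_item = max(trials)
        if best_alpha - base < min_gain:
            break
        cols.remove(worst_item)
    return items[:, cols]

# Hypothetical 1-5 Likert responses (200 respondents, 6 items).
rng = np.random.default_rng(0)
data = rng.integers(1, 6, size=(200, 6)).astype(float)
print(round(cronbach_alpha(purify(data)), 3))
```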
NASA Astrophysics Data System (ADS)
Klees, R.; Slobbe, D. C.; Farahani, H. H.
2018-04-01
The paper is about a methodology to combine a noisy satellite-only global gravity field model (GGM) with other noisy datasets to estimate a local quasi-geoid model using weighted least-squares techniques. In this way, we attempt to improve the quality of the estimated quasi-geoid model and to complement it with a full noise covariance matrix for quality control and further data processing. The methodology goes beyond the classical remove-compute-restore approach, which does not account for the noise in the satellite-only GGM. We suggest and analyse three different approaches of data combination. Two of them are based on a local single-scale spherical radial basis function (SRBF) model of the disturbing potential, and one is based on a two-scale SRBF model. Using numerical experiments, we show that a single-scale SRBF model does not fully exploit the information in the satellite-only GGM. We explain this by a lack of flexibility of a single-scale SRBF model to deal with datasets of significantly different bandwidths. The two-scale SRBF model performs well in this respect, provided that the model coefficients representing the two scales are estimated separately. The corresponding methodology is developed in this paper. Using the statistics of the least-squares residuals and the statistics of the errors in the estimated two-scale quasi-geoid model, we demonstrate that the developed methodology provides a two-scale quasi-geoid model, which exploits the information in all datasets.
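The data combination described here rests on standard weighted least-squares estimation with full noise covariance matrices. A minimal, generic sketch of that estimation step is given below; the observation equations, the basis-function design matrix, and the covariances are all placeholders, so this illustrates only the algebra, not the authors' two-scale SRBF model.

```python
import numpy as np

def wls(A: np.ndarray, y: np.ndarray, C: np.ndarray):
    """Weighted least squares: x_hat = (A^T C^-1 A)^-1 A^T C^-1 y,
    returning the estimate and its covariance matrix."""
    W = np.linalg.inv(C)              # weight matrix = inverse noise covariance
    N = A.T @ W @ A                   # normal matrix
    Qx = np.linalg.inv(N)             # covariance of the estimated coefficients
    x_hat = Qx @ (A.T @ W @ y)
    return x_hat, Qx

# Hypothetical combination of two datasets (e.g. GGM-derived and terrestrial)
# observing the same 3 basis-function coefficients with different noise levels.
rng = np.random.default_rng(1)
A = np.vstack([rng.normal(size=(10, 3)), rng.normal(size=(10, 3))])
x_true = np.array([1.0, -0.5, 2.0])
C = np.diag([0.1] * 10 + [0.5] * 10)  # dataset 1 is less noisy than dataset 2
y = A @ x_true + rng.multivariate_normal(np.zeros(20), C)
x_hat, Qx = wls(A, y, C)
print(x_hat)
```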
[Shoulder disability questionnaires: a systematic review].
Fayad, F; Mace, Y; Lefevre-Colau, M M
2005-07-01
To identify all available shoulder disability questionnaires designed to measure physical functioning and to examine those with satisfactory clinimetric quality. We used the Medline database and the "Guide des outils de mesure de l'évaluation en médecine physique et de réadaptation" textbook to search for questionnaires. Analysis took into account the development methodology, clinimetric quality of the instruments and frequency of their utilization. We classified the instruments according to the International Classification of Functioning, Disability and Health. Thirty-eight instruments have been developed to measure disease-, shoulder- or upper extremity-specific outcome. Four scales assess upper-extremity disability and 3 others shoulder disability. We found 6 scales evaluating disability and shoulder pain, 7 scales measuring the quality of life in patients with various conditions of the shoulder, 14 scales combining objective and subjective measures, 2 pain scales and 2 unclassified scales. Older instruments developed before the advent of modern measurement development methodology usually combine objective and subjective measures. Recent instruments were designed with appropriate methodology. Most are self-administered questionnaires. Numerous shoulder outcome measure instruments are available. There is no "gold standard" for assessing shoulder function outcome in the general population.
ERIC Educational Resources Information Center
Randolph, Justus
2005-01-01
A high quality review of the distance learning literature from 1992-1999 concluded that most of the research on distance learning had serious methodological flaws. This paper presents the results of a small-scale replication of that review. From three leading distance education journals, a sample of 66 articles was categorized by study type and…
A strategic approach for Water Safety Plans implementation in Portugal.
Vieira, Jose M P
2011-03-01
Effective risk assessment and risk management approaches in public drinking water systems can benefit from a systematic process for hazards identification and effective management control based on the Water Safety Plan (WSP) concept. Good results from WSP development and implementation in a small number of Portuguese water utilities have shown that a more ambitious nationwide strategic approach to disseminate this methodology is needed. However, the establishment of strategic frameworks for systematic and organic scaling-up of WSP implementation at a national level requires major constraints to be overcome: lack of legislation and policies and the need for appropriate monitoring tools. This study presents a framework to inform future policy making by understanding the key constraints and needs related to institutional, organizational and research issues for WSP development and implementation in Portugal. This methodological contribution for WSP implementation can be replicated at a global scale. National health authorities and the Regulator may promote changes in legislation and policies. Independent global monitoring and benchmarking are adequate tools for measuring the progress over time and for comparing the performance of water utilities. Water utilities self-assessment must include performance improvement, operational monitoring and verification. Research and education and resources dissemination ensure knowledge acquisition and transfer.
Scaled Rocket Testing in Hypersonic Flow
NASA Technical Reports Server (NTRS)
Dufrene, Aaron; MacLean, Matthew; Carr, Zakary; Parker, Ron; Holden, Michael; Mehta, Manish
2015-01-01
NASA's Space Launch System (SLS) uses four clustered liquid rocket engines along with two solid rocket boosters. The interaction between all six rocket exhaust plumes will produce a complex and severe thermal environment in the base of the vehicle. This work focuses on a recent 2% scale, hot-fire SLS base heating test. These base heating tests are short-duration tests executed with chamber pressures near the full-scale values with gaseous hydrogen/oxygen engines and RSRMV analogous solid propellant motors. The LENS II shock tunnel/Ludwieg tube tunnel was used at or near flight duplicated conditions up to Mach 5. Model development was strongly based on the Space Shuttle base heating tests, with several improvements including doubling of the maximum chamber pressures and duplication of freestream conditions. Detailed base heating results are outside the scope of the current work; rather, test methodology and techniques are presented, along with broader applicability toward scaled rocket testing in supersonic and hypersonic flow.
Machtans, Craig S.; Thogmartin, Wayne E.
2014-01-01
The publication of a U.S. estimate of bird–window collisions by Loss et al. is an example of the somewhat contentious approach of using extrapolations to obtain large-scale estimates from small-scale studies. We review the approach by Loss et al. and other authors who have published papers on human-induced avian mortality and describe the drawbacks and advantages to publishing what could be considered imperfect science. The main drawback is the inherent and somewhat unquantifiable bias of using small-scale studies to scale up to a national estimate. The direct benefits include development of new methodologies for creating the estimates, an explicit treatment of known biases with acknowledged uncertainty in the final estimate, and the novel results. Other overarching benefits are that these types of papers are catalysts for improving all aspects of the science of estimates and for policies that must respond to the new information.
An engineering closure for heavily under-resolved coarse-grid CFD in large applications
NASA Astrophysics Data System (ADS)
Class, Andreas G.; Yu, Fujiang; Jordan, Thomas
2016-11-01
Even though high performance computation allows a very detailed description of a wide range of scales in scientific computations, engineering simulations used for design studies commonly resolve only the large scales, thus speeding up simulation time. The coarse-grid CFD (CGCFD) methodology is developed for flows with repeated flow patterns, as often observed in heat exchangers or porous structures. It is proposed to use inviscid Euler equations on a very coarse numerical mesh. This coarse mesh need not conform to the geometry in all details. To reinstate the physics of the smaller scales, inexpensive subgrid models are employed. Subgrid models are systematically constructed by analyzing well-resolved generic representative simulations. By varying the flow conditions in these simulations, correlations are obtained. These provide, for each individual coarse-mesh cell, a volume force vector and a volume porosity. Moreover, for all vertices, surface porosities are derived. CGCFD is related to the immersed boundary method, as both exploit volume forces and non-body-conformal meshes. Yet, CGCFD differs with respect to the coarser mesh and the use of Euler equations. We will describe the methodology based on a simple test case and the application of the method to a 127-pin wire-wrap fuel bundle.
High-Order Moving Overlapping Grid Methodology in a Spectral Element Method
NASA Astrophysics Data System (ADS)
Merrill, Brandon E.
A moving overlapping mesh methodology that achieves spectral accuracy in space and up to second-order accuracy in time is developed for solution of unsteady incompressible flow equations in three-dimensional domains. The targeted applications are in aerospace and mechanical engineering domains and involve problems in turbomachinery, rotary aircrafts, wind turbines and others. The methodology is built within the dual-session communication framework initially developed for stationary overlapping meshes. The methodology employs semi-implicit spectral element discretization of equations in each subdomain and explicit treatment of subdomain interfaces with spectrally-accurate spatial interpolation and high-order accurate temporal extrapolation, and requires few, if any, iterations, yet maintains the global accuracy and stability of the underlying flow solver. Mesh movement is enabled through the Arbitrary Lagrangian-Eulerian formulation of the governing equations, which allows for prescription of arbitrary velocity values at discrete mesh points. The stationary and moving overlapping mesh methodologies are thoroughly validated using two- and three-dimensional benchmark problems in laminar and turbulent flows. The spatial and temporal global convergence, for both methods, is documented and is in agreement with the nominal order of accuracy of the underlying solver. Stationary overlapping mesh methodology was validated to assess the influence of long integration times and inflow-outflow global boundary conditions on the performance. In a turbulent benchmark of fully-developed turbulent pipe flow, the turbulent statistics are validated against the available data. Moving overlapping mesh simulations are validated on the problems of two-dimensional oscillating cylinder and a three-dimensional rotating sphere. The aerodynamic forces acting on these moving rigid bodies are determined, and all results are compared with published data. Scaling tests, with both methodologies, show near linear strong scaling, even for moderately large processor counts. The moving overlapping mesh methodology is utilized to investigate the effect of an upstream turbulent wake on a three-dimensional oscillating NACA0012 extruded airfoil. A direct numerical simulation (DNS) at Reynolds Number 44,000 is performed for steady inflow incident upon the airfoil oscillating between angle of attack 5.6° and 25° with reduced frequency k=0.16. Results are contrasted with subsequent DNS of the same oscillating airfoil in a turbulent wake generated by a stationary upstream cylinder.
The Dispositions for Culturally Responsive Pedagogy Scale
ERIC Educational Resources Information Center
Whitaker, Manya C.; Valtierra, Kristina Marie
2018-01-01
Purpose: The purpose of this study is to develop and validate the dispositions for culturally responsive pedagogy scale (DCRPS). Design/methodology/approach: Scale development consisted of a six-step process including item development, expert review, exploratory factor analysis, factor interpretation, confirmatory factor analysis and convergent…
An Integrated Modeling and Simulation Methodology for Intelligent Systems Design and Testing
2002-08-01
…simulation and actual execution. KEYWORDS: Model Continuity, Modeling, Simulation, Experimental Frame, Real Time Systems, Intelligent Systems. …the methodology for a stand-alone real time system. Then it will scale up to distributed real time systems. For both systems, step-wise simulation… MODEL CONTINUITY: Intelligent real time systems monitor, respond to, or control, an external environment. This environment is connected to the digital…
Prat, P; Aulinas, M; Turon, C; Comas, J; Poch, M
2009-01-01
Current management of sanitation infrastructures (sewer systems, wastewater treatment plants, receiving waters, bypasses, deposits, etc.) is not fulfilling the objective of up-to-date legislation: to achieve a good ecological and chemical status of water bodies through integrated management. This has made it necessary to develop new methodologies that help decision makers improve management in order to achieve that status. Decision Support Systems (DSS) based on the Multi-Agent System (MAS) paradigm are promising tools to improve integrated management. When all the different agents involved interact, important new knowledge emerges. This knowledge can be used to build better DSS and improve wastewater infrastructure management, achieving the objectives set by legislation. The paper describes a methodology to acquire this knowledge through a Role Playing Game (RPG). It first introduces the wastewater problems, defines RPG, and relates RPG to MAS. It then explains how the RPG was built, with two examples of game sessions and their results, and finishes with a discussion of the uses of this methodology and future work.
Aqueous Two-Phase Systems at Large Scale: Challenges and Opportunities.
Torres-Acosta, Mario A; Mayolo-Deloisa, Karla; González-Valdez, José; Rito-Palomares, Marco
2018-06-07
Aqueous two-phase systems (ATPS) have proved to be an efficient and integrative operation to enhance recovery of industrially relevant bioproducts. Since the discovery of ATPS, a variety of works have been published regarding their scaling from 10 to 1000 L. Although ATPS have achieved high recovery and purity yields, there is still a gap between their bench-scale use and potential industrial applications. In this context, this review paper critically analyzes ATPS scale-up strategies to enhance their potential industrial adoption. In particular, large-scale operation considerations, different phase separation procedures, the available optimization techniques (univariate, response surface methodology, and genetic algorithms) to maximize recovery and purity, and economic modeling to predict large-scale costs are discussed. ATPS intensification to increase the amount of sample processed in each system, the development of recycling strategies, and the creation of highly efficient predictive models are still areas of great significance that can be further exploited with the use of high-throughput techniques. Moreover, the development of novel ATPS can maximize their specificity, increasing the possibilities for future industrial adoption of ATPS. This review attempts to present the areas of opportunity to increase the attractiveness of ATPS at industrial levels.
Evaluation of Tsunami Run-Up on Coastal Areas at Regional Scale
NASA Astrophysics Data System (ADS)
González, M.; Aniel-Quiroga, Í.; Gutiérrez, O.
2017-12-01
Tsunami hazard assessment is tackled by means of numerical simulations, giving as a result the areas flooded by the tsunami wave inland. To this end, some input data are required, e.g., the high-resolution topobathymetry of the study area, the earthquake focal mechanism parameters, etc. The computational cost of these kinds of simulations is still excessive. An important restriction for the elaboration of large-scale maps at national or regional scale is the reconstruction of high-resolution topobathymetry in the coastal zone. An alternative and traditional method consists of the application of empirical-analytical formulations to calculate run-up at several coastal profiles (e.g., Synolakis, 1987), combined with numerical simulations offshore that do not include coastal inundation. In this case, the numerical simulations are faster but some limitations are added, as the coastal bathymetric profiles are very simply idealized. In this work, we present a complementary methodology based on a hybrid numerical model, formed by 2 models that were coupled ad hoc for this work: a non-linear shallow water equations model (NLSWE) for the offshore part of the propagation and a Volume of Fluid model (VOF) for the areas near the coast and inland, applying each numerical scheme where it better reproduces the tsunami wave. The run-up of a tsunami scenario is obtained by applying the coupled model to an ad-hoc numerical flume. To design this methodology, hundreds of worldwide topobathymetric profiles have been parameterized, using 5 parameters (2 depths and 3 slopes). In addition, tsunami waves have also been parameterized by their height and period. As an application of the numerical flume methodology, the parameterized coastal profiles and tsunami waves have been combined to build a populated database of run-up calculations. The combination was tackled by means of numerical simulations in the numerical flume. The result is a tsunami run-up database that considers real profile shapes, realistic tsunami waves, and optimized numerical simulations. This database allows the calculation of the run-up of any new tsunami wave by interpolation on the database, in a short period of time, based on the tsunami wave characteristics provided as an output of the NLSWE model along the coast in a large-scale domain (regional or national scale).
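Once such a run-up database is populated, estimating run-up for a new case reduces to interpolation over the parameter space. The sketch below assumes a toy database keyed only by wave height and period; the real database uses the full set of profile parameters, so the table and the linear interpolator here are illustrative assumptions.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Hypothetical run-up database (m) on a grid of wave height (m) x period (min).
heights = np.array([0.5, 1.0, 2.0, 4.0])
periods = np.array([10.0, 20.0, 40.0])
runup_table = np.array([
    [0.8, 1.0, 1.3],
    [1.5, 1.9, 2.4],
    [2.7, 3.4, 4.1],
    [4.6, 5.8, 7.0],
])

interp = RegularGridInterpolator((heights, periods), runup_table)

# Run-up estimate for a new offshore wave (height 1.6 m, period 25 min)
# taken, e.g., from the NLSWE propagation output at the coast.
print(float(interp([[1.6, 25.0]])[0]))
```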
Probabilistic simulation of multi-scale composite behavior
NASA Technical Reports Server (NTRS)
Liaw, D. G.; Shiao, M. C.; Singhal, S. N.; Chamis, Christos C.
1993-01-01
A methodology is developed to computationally assess the probabilistic composite material properties at all composite scale levels due to the uncertainties in the constituent (fiber and matrix) properties and in the fabrication process variables. The methodology is computationally efficient for simulating the probability distributions of material properties. The sensitivity of the probabilistic composite material property to each random variable is determined. This information can be used to reduce undesirable uncertainties in material properties at the macro scale of the composite by reducing the uncertainties in the most influential random variables at the micro scale. This methodology was implemented into the computer code PICAN (Probabilistic Integrated Composite ANalyzer). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in the material properties of a typical laminate and comparing the results with the Monte Carlo simulation method. The experimental data of composite material properties at all scales fall within the scatters predicted by PICAN.
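The idea of propagating constituent-level scatter up to composite-level properties can be illustrated with a plain Monte Carlo sketch. The rule-of-mixtures relation and the normal distributions assumed below for fiber/matrix moduli and fiber volume fraction are textbook simplifications, not the micromechanics actually embedded in PICAN.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Assumed constituent distributions (means and scatter are illustrative only).
E_f = rng.normal(230e9, 0.05 * 230e9, n)   # fiber modulus, Pa
E_m = rng.normal(3.5e9, 0.08 * 3.5e9, n)   # matrix modulus, Pa
V_f = rng.normal(0.60, 0.03, n)            # fiber volume fraction

# Rule of mixtures for the longitudinal ply modulus.
E_1 = V_f * E_f + (1.0 - V_f) * E_m

print(f"mean E1 = {E_1.mean()/1e9:.1f} GPa, CV = {E_1.std()/E_1.mean():.3f}")
```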
2013-01-01
Background: Although desperate need and drug counterfeiting are linked in developing countries, little research has been carried out to address this link, and there is a lack of proper tools and methodology. This study addresses the need for a new methodological approach by developing a scale to aid in understanding the demand side of drug counterfeiting in a developing country. Methods: The study presents a quantitative, non-representative survey conducted in Sudan. A face-to-face structured interview survey methodology was employed to collect the data from the general population (people in the street) in two phases: pilot (n = 100) and final survey (n = 1003). Data were analyzed by examining means, variances, squared multiple correlations, item-to-total correlations, and the results of an exploratory factor analysis and a confirmatory factor analysis. Results: As an approach to scale purification, internal consistency was examined and improved. The scale was reduced from 44 to 41 items and Cronbach’s alpha improved from 0.818 to 0.862. Finally, scale items were assessed. The result was an eleven-factor solution. Convergent and discriminant validity were demonstrated. Conclusion: The results of this study indicate that the “Consumer Behavior Toward Counterfeit Drugs Scale” is a valid, reliable measure with a solid theoretical base. Ultimately, the study offers public health policymakers a valid measurement tool and, consequently, a new methodological approach with which to build a better understanding of the demand side of counterfeit drugs and to develop more effective strategies to combat the problem. PMID:24020730
Baumann, Pascal; Hahn, Tobias; Hubbuch, Jürgen
2015-10-01
Upstream processes are rather complex to design, and the productivity of cells under suitable cultivation conditions is hard to predict. The method of choice for examining the design space is to execute high-throughput cultivation screenings in micro-scale format. Various predictive in silico models have been developed for many downstream processes, leading to a reduction of time and material costs. This paper presents a combined optimization approach based on high-throughput micro-scale cultivation experiments and chromatography modeling. The overall optimum need not be the system with the highest product titers, but the one resulting in superior overall process performance upstream and downstream. The methodology is presented in a case study for the Cherry-tagged enzyme Glutathione-S-Transferase from Escherichia coli SE1. The Cherry-Tag™ (Delphi Genetics, Belgium), which can be fused to any target protein, allows for direct product analytics by simple VIS absorption measurements. High-throughput cultivations were carried out in a 48-well format in a BioLector micro-scale cultivation system (m2p-Labs, Germany). The downstream process optimization for a set of randomly picked upstream conditions producing high yields was performed in silico using chromatography modeling software developed in-house (ChromX). The suggested in silico-optimized operational modes for product capture were subsequently validated. The overall best system was chosen based on a combination of excellent upstream and downstream performance.
Optimizing fusion PIC code performance at scale on Cori Phase 2
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koskela, T. S.; Deslippe, J.
In this paper we present the results of optimizing the performance of the gyrokinetic full-f fusion PIC code XGC1 on the Cori Phase Two Knights Landing system. The code has undergone substantial development to enable the use of vector instructions in its most expensive kernels within the NERSC Exascale Science Applications Program. We study the single-node performance of the code on an absolute scale using the roofline methodology to guide optimization efforts. We have obtained 2x speedups in single node performance due to enabling vectorization and performing memory layout optimizations. On multiple nodes, the code is shown to scale well up to 4000 nodes, near half the size of the machine. We discuss some communication bottlenecks that were identified and resolved during the work.
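The roofline methodology mentioned above bounds attainable performance by the lesser of peak compute throughput and bandwidth-limited throughput. A minimal sketch follows; the peak FLOP rate, memory bandwidth, and kernel arithmetic intensities are placeholder numbers, not measured XGC1 or Knights Landing figures.

```python
def roofline(ai_flops_per_byte: float, peak_gflops: float, bw_gb_per_s: float) -> float:
    """Attainable GFLOP/s for a kernel with the given arithmetic intensity."""
    return min(peak_gflops, ai_flops_per_byte * bw_gb_per_s)

# Hypothetical node: 2000 GFLOP/s peak compute, 400 GB/s high-bandwidth memory.
for name, ai in [("charge deposition", 0.2), ("field interpolation", 1.5), ("particle push", 8.0)]:
    print(f"{name:20s} bound at {roofline(ai, 2000.0, 400.0):7.1f} GFLOP/s")
```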
NASA Astrophysics Data System (ADS)
Chen, J.; Wang, D.; Zhao, R. L.; Zhang, H.; Liao, A.; Jiu, J.
2014-04-01
Geospatial databases are an irreplaceable national treasure of immense importance. Their up-to-dateness, i.e., their consistency with respect to the real world, plays a critical role in their value and applications. The continuous updating of map databases at the 1:50,000 scale is a massive and difficult task for large countries covering more than several million square kilometers. This paper presents the research and technological development carried out to support national map updating at the 1:50,000 scale in China, including the development of updating models and methods, production tools and systems for large-scale and rapid updating, as well as the design and implementation of the continuous updating workflow. The use of many data sources and the integration of these data to form a high-accuracy, quality-checked product were required. This in turn required up-to-date techniques of image matching, semantic integration, generalization, database management and conflict resolution. Specific software tools and packages were designed and developed to support large-scale updating production with high-resolution imagery and large-scale data generalization, such as map generalization, GIS-supported change interpretation from imagery, DEM interpolation, image matching-based orthophoto generation, and data control at different levels. A national 1:50,000 database updating strategy and its production workflow were designed, including a full-coverage updating pattern characterized by all-element topographic data modeling, change detection in all related areas, and whole-process data quality control, a series of technical production specifications, and a network of updating production units in different geographic places across the country.
ERIC Educational Resources Information Center
Kansiime, Monica K.; Watiti, James; Mchana, Abigael; Jumah, Raymond; Musebe, Richard; Rware, Harrison
2018-01-01
Purpose: We assessed the effectiveness of Village-based Advisors (VBAs) as a novel approach for scaling up improved common bean technologies in southern highlands of Tanzania. Design/methodology/approach: Data were gathered through focus group discussions (FGDs) and interviews with 11 VBAs and 102 farmers (37% female). The effectiveness of VBAs…
Energy Conservation Using Dynamic Voltage Frequency Scaling for Computational Cloud
Florence, A. Paulin; Shanthi, V.; Simon, C. B. Sunil
2016-01-01
Cloud computing is a new technology that supports resource sharing on a “Pay as you go” basis around the world. It provides various services such as SaaS, IaaS, and PaaS. Computation is a part of IaaS, and all computational requests are to be served efficiently with optimal power utilization in the cloud. Recently, various algorithms have been developed to reduce power consumption, and the Dynamic Voltage and Frequency Scaling (DVFS) scheme is also used in this respect. In this paper we have devised a methodology that analyzes the behavior of a given cloud request and identifies the type of algorithm involved. Once the type of algorithm is identified, its time complexity is calculated from its asymptotic notation. Using a best-fit strategy the appropriate host is identified and the incoming job is allocated to the selected host. From the estimated time complexity, the required clock frequency of the host is determined. Accordingly, the CPU frequency is scaled up or down using the DVFS scheme, enabling energy savings of up to 55% of total power consumption. PMID:27239551
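The frequency-selection step described above can be sketched as follows: estimate the cycle demand of the request from its asymptotic complexity, then pick the lowest available CPU frequency that still meets the deadline. The complexity-to-cycles mapping, the frequency ladder, and the deadline below are illustrative assumptions, not the paper's calibrated values.

```python
import math

# Hypothetical cycles-per-operation cost model for recognised algorithm classes.
COMPLEXITY = {
    "O(n)":       lambda n: n,
    "O(n log n)": lambda n: n * math.log2(n),
    "O(n^2)":     lambda n: n ** 2,
}
FREQUENCIES_GHZ = [0.8, 1.2, 1.6, 2.0, 2.4]   # assumed DVFS states of the host

def pick_frequency(algo: str, n: int, deadline_s: float, cycles_per_op: float = 10.0) -> float:
    """Return the lowest frequency (GHz) that finishes the job within the deadline."""
    cycles = COMPLEXITY[algo](n) * cycles_per_op
    for f in FREQUENCIES_GHZ:
        if cycles / (f * 1e9) <= deadline_s:
            return f
    return FREQUENCIES_GHZ[-1]  # deadline infeasible: run at maximum frequency

print(pick_frequency("O(n log n)", n=5_000_000, deadline_s=2.0))
```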
Energy Conservation Using Dynamic Voltage Frequency Scaling for Computational Cloud.
Florence, A Paulin; Shanthi, V; Simon, C B Sunil
2016-01-01
Cloud computing is a new technology that supports resource sharing on a "Pay as you go" basis around the world. It provides various services such as SaaS, IaaS, and PaaS. Computation is a part of IaaS, and all computational requests are to be served efficiently with optimal power utilization in the cloud. Recently, various algorithms have been developed to reduce power consumption, and the Dynamic Voltage and Frequency Scaling (DVFS) scheme is also used in this respect. In this paper we have devised a methodology that analyzes the behavior of a given cloud request and identifies the type of algorithm involved. Once the type of algorithm is identified, its time complexity is calculated from its asymptotic notation. Using a best-fit strategy the appropriate host is identified and the incoming job is allocated to the selected host. From the estimated time complexity, the required clock frequency of the host is determined. Accordingly, the CPU frequency is scaled up or down using the DVFS scheme, enabling energy savings of up to 55% of total power consumption.
An Integrated Scale for Measuring an Organizational Learning System
ERIC Educational Resources Information Center
Jyothibabu, C.; Farooq, Ayesha; Pradhan, Bibhuti Bhusan
2010-01-01
Purpose: The purpose of this paper is to develop an integrated measurement scale for an organizational learning system by capturing the learning enablers, learning results and performance outcome in an organization. Design/methodology/approach: A new measurement scale was developed by integrating and modifying two existing scales, identified…
Development of the Comprehensive Cervical Dystonia Rating Scale: Methodology
Comella, Cynthia L.; Fox, Susan H.; Bhatia, Kailash P.; Perlmutter, Joel S.; Jinnah, Hyder A.; Zurowski, Mateusz; McDonald, William M.; Marsh, Laura; Rosen, Ami R.; Waliczek, Tracy; Wright, Laura J.; Galpern, Wendy R.; Stebbins, Glenn T.
2016-01-01
We present the methodology utilized for development and clinimetric testing of the Comprehensive Cervical Dystonia (CD) Rating scale, or CCDRS. The CCDRS includes a revision of the Toronto Western Spasmodic Torticollis Rating Scale (TWSTRS-2), a newly developed psychiatric screening tool (TWSTRS-PSYCH), and the previously validated Cervical Dystonia Impact Profile (CDIP-58). For the revision of the TWSTRS, the original TWSTRS was examined by a committee of dystonia experts at a dystonia rating scales workshop organized by the Dystonia Medical Research Foundation. During this workshop, deficiencies in the standard TWSTRS were identified and recommendations for revision of the severity and pain subscales were incorporated into the TWSTRS-2. Given that no scale currently evaluates the psychiatric features of cervical dystonia (CD), we used a modified Delphi methodology and a reiterative process of item selection to develop the TWSTRS-PSYCH. We also included the CDIP-58 to capture the impact of CD on quality of life. The three scales (TWSTRS2, TWSTRS-PSYCH, and CDIP-58) were combined to construct the CCDRS. Clinimetric testing of reliability and validity of the CCDRS are described. The CCDRS was designed to be used in a modular fashion that can measure the full spectrum of CD. This scale will provide rigorous assessment for studies of natural history as well as novel symptom-based or disease-modifying therapies. PMID:27088112
Development of the Comprehensive Cervical Dystonia Rating Scale: Methodology.
Comella, Cynthia L; Fox, Susan H; Bhatia, Kailash P; Perlmutter, Joel S; Jinnah, Hyder A; Zurowski, Mateusz; McDonald, William M; Marsh, Laura; Rosen, Ami R; Waliczek, Tracy; Wright, Laura J; Galpern, Wendy R; Stebbins, Glenn T
2015-06-01
We present the methodology utilized for development and clinimetric testing of the Comprehensive Cervical Dystonia (CD) Rating scale, or CCDRS. The CCDRS includes a revision of the Toronto Western Spasmodic Torticollis Rating Scale (TWSTRS-2), a newly developed psychiatric screening tool (TWSTRS-PSYCH), and the previously validated Cervical Dystonia Impact Profile (CDIP-58). For the revision of the TWSTRS, the original TWSTRS was examined by a committee of dystonia experts at a dystonia rating scales workshop organized by the Dystonia Medical Research Foundation. During this workshop, deficiencies in the standard TWSTRS were identified and recommendations for revision of the severity and pain subscales were incorporated into the TWSTRS-2. Given that no scale currently evaluates the psychiatric features of cervical dystonia (CD), we used a modified Delphi methodology and a reiterative process of item selection to develop the TWSTRS-PSYCH. We also included the CDIP-58 to capture the impact of CD on quality of life. The three scales (TWSTRS2, TWSTRS-PSYCH, and CDIP-58) were combined to construct the CCDRS. Clinimetric testing of reliability and validity of the CCDRS are described. The CCDRS was designed to be used in a modular fashion that can measure the full spectrum of CD. This scale will provide rigorous assessment for studies of natural history as well as novel symptom-based or disease-modifying therapies.
[Modeling continuous scaling of NDVI based on fractal theory].
Luan, Hai-Jun; Tian, Qing-Jiu; Yu, Tao; Hu, Xin-Li; Huang, Yan; Du, Ling-Tong; Zhao, Li-Min; Wei, Xi; Han, Jie; Zhang, Zhou-Wei; Li, Shao-Peng
2013-07-01
The scale effect is one of the most important scientific problems in remote sensing. The scale effect in quantitative remote sensing can be used to study the relationship between retrievals from images of different resolutions, and its study has become an effective way to confront challenges such as the validation of quantitative remote sensing products. Traditional up-scaling methods cannot describe the scale-changing features of retrievals across an entire series of scales; meanwhile, they face serious parameter-correction issues because imaging parameters vary between sensors (e.g., geometrical correction, spectral correction). Using a single-sensor image, a fractal methodology was employed to solve these problems. Taking NDVI (computed from land surface radiance) as an example and based on an Enhanced Thematic Mapper Plus (ETM+) image, a scheme was proposed to model the continuous scaling of retrievals. The experimental results indicated that: (1) for NDVI, a scale effect exists and can be described by a fractal model of continuous scaling; and (2) the fractal method is suitable for the validation of NDVI. All of these results show that fractal analysis is an effective methodology for studying scaling in quantitative remote sensing.
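The continuous-scaling idea can be illustrated by the standard fractal recipe: aggregate the retrieval to a series of coarser grids, regress the logarithm of a scale-dependent statistic against the logarithm of the scale, and use the fitted power law to predict intermediate scales. The synthetic NDVI field and the choice of the mean absolute spatial gradient as the scaling statistic below are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def aggregate(field: np.ndarray, factor: int) -> np.ndarray:
    """Block-average a 2-D field by an integer factor (simple up-scaling)."""
    n = (field.shape[0] // factor) * factor
    f = field[:n, :n]
    return f.reshape(n // factor, factor, n // factor, factor).mean(axis=(1, 3))

def mean_abs_gradient(field: np.ndarray) -> float:
    gy, gx = np.gradient(field)
    return float(np.mean(np.hypot(gx, gy)))

rng = np.random.default_rng(0)
ndvi = rng.random((240, 240)) * 0.6 + 0.2           # synthetic NDVI surface

scales = np.array([1, 2, 4, 8, 16])                 # relative pixel sizes
stats = np.array([mean_abs_gradient(aggregate(ndvi, s)) for s in scales])

# Log-log regression gives the scaling exponent of the fractal model.
slope, intercept = np.polyfit(np.log(scales), np.log(stats), 1)
predict = lambda s: np.exp(intercept) * s ** slope  # continuous-scale prediction
print(f"scaling exponent = {slope:.2f}, predicted statistic at scale 6 = {predict(6):.4f}")
```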
Chan, Chung-Hung; See, Tiam-You; Yusoff, Rozita; Ngoh, Gek-Cheng; Kow, Kien-Woh
2017-04-15
This work demonstrated the optimization and scale-up of microwave-assisted extraction (MAE) and ultrasonic-assisted extraction (UAE) of bioactive compounds from Orthosiphon stamineus using energy-based parameters, namely absorbed power density and absorbed energy density (APD-AED), and response surface methodology (RSM). The intensive optimum conditions of MAE, obtained at 80% EtOH, 50 mL/g, APD of 0.35 W/mL and AED of 250 J/mL, can be used to determine the optimum values of the scale-dependent parameters, i.e., microwave power and treatment time, at various extraction scales (100-300 mL solvent loading). The yields of the scaled-up conditions were consistent, with less than 8% discrepancy, and were about 91-98% of the Soxhlet extraction yield. By adapting the APD-AED method to UAE, the intensive optimum conditions of the extraction, i.e., 70% EtOH, 30 mL/g, APD of 0.22 W/mL and AED of 450 J/mL, achieve similar scale-up results.
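Because APD and AED are intensive (per-volume) quantities, translating the optimum to a new batch size amounts to multiplying by the solvent volume. The helper below sketches that conversion; the numbers reproduce the MAE optimum quoted above (0.35 W/mL, 250 J/mL), but the function itself is an illustrative reading of the APD-AED scale-up rule, not the authors' code.

```python
def scale_up(apd_w_per_ml: float, aed_j_per_ml: float, volume_ml: float):
    """Convert the intensive optimum (APD, AED) to equipment settings at a given volume."""
    power_w = apd_w_per_ml * volume_ml        # microwave/ultrasound power to apply
    energy_j = aed_j_per_ml * volume_ml       # total energy to deliver
    time_s = energy_j / power_w               # = AED / APD, independent of volume
    return power_w, time_s

for v in (100, 200, 300):                     # solvent loadings tested in the paper
    p, t = scale_up(0.35, 250.0, v)
    print(f"{v} mL -> {p:.0f} W for {t:.0f} s")
```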
Porfirif, María C; Milatich, Esteban J; Farruggia, Beatriz M; Romanini, Diana
2016-06-01
A one-step method for alpha-amylase concentration and purification was developed in this work. This methodology requires only a very low concentration of a biodegradable polyelectrolyte (Eudragit® E-PO) and represents a low-cost, fast, easily scalable and non-polluting technology. In addition, it allows the polymer to be recycled after precipitation. The formation of reversible soluble/insoluble complexes between alpha-amylase and the polymer Eudragit® E-PO was studied, and their precipitation under selected conditions was applied for bioseparation purposes. Turbidimetric assays allowed determination of the pH range where the complexes are insoluble (4.50-7.00); pH 5.50 yielded the highest turbidity of the system. The presence of NaCl (0.05 M) in the medium totally dissociates the protein-polymer complexes. When an adequate concentration of polymer was added under these conditions to a liquid culture of Aspergillus oryzae, purification factors of alpha-amylase up to 7.43 and recoveries of 88% were obtained in a single step without previous clarification. These results demonstrate that this methodology is suitable for the concentration and production of alpha-amylase from this source and could be applied at the beginning of downstream processing.
Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica
2016-06-01
To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of the dissipation and bioconversion. Methods with low recoveries could give a false impression of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its three main metabolites, and chlorpyrifos in increasingly complex matrices where the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first evaluated matrix has already been reported. The methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Based) media with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg(-1), yielding recoveries between 72% and 109% and RSDs <11% in all cases. The application of this methodology proved that A. biennis is able to dissipate 94% of endosulfan and 87% of chlorpyrifos after 90 days. Having assessed that A. biennis growing over bran can metabolize the studied pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium. From the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate allowed recoveries between 80% and 110% with RSDs <18%. Linearity, recovery, precision, matrix effect and LODs/LOQs of each method were studied for all the analytes: endosulfan isomers (α & β) and its metabolites (endosulfan sulfate, ether and diol), as well as chlorpyrifos. In the first laboratory evaluation of these biobeds, endosulfan was bioconverted up to 87% and chlorpyrifos more than 79% after 27 days.
Musabayane, N
2000-01-01
This paper focuses on the use of Participatory Hygiene and Sanitation Transformation (PHAST) and how the methodology can be taken to scale. It uses the Zimbabwe experience and highlights some of the benefits of applying PHAST, the conditions necessary for scaling up, and possible constraints. The PHAST initiative started off as a pilot process seeking to promote improved hygiene behaviour and sanitation. Having successfully piloted PHAST, Zimbabwe has scaled up the use of the methodology at a country level. While impact studies have not yet been conducted, reviews of the effects of the process have indicated positive behaviour change in such areas as management of water and construction and use of latrines. The process has also led to a change in institutional approaches to planning for improved water and sanitation, from supply-driven projects to demand-responsive approaches. Some lessons learnt have included the need for baseline surveys at the start of the use of PHAST, and the difficulty in developing monitoring indicators and hence in measuring impacts. Conclusions drawn from assessment studies are that the use of participatory approaches has led to improved hygiene behaviour, with communities being able to link causes and effects. The use of participatory methods also necessitates a change in institutional approaches from supply-driven approaches to demand responsiveness. Other lessons drawn were related to the creation of an enabling environment for the application of participatory processes. Such an enabling environment includes capacity building, resource allocation, and policy and institutional support.
Effective Rating Scale Development for Speaking Tests: Performance Decision Trees
ERIC Educational Resources Information Center
Fulcher, Glenn; Davidson, Fred; Kemp, Jenny
2011-01-01
Rating scale design and development for testing speaking is generally conducted using one of two approaches: the measurement-driven approach or the performance data-driven approach. The measurement-driven approach prioritizes the ordering of descriptors onto a single scale. Meaning is derived from the scaling methodology and the agreement of…
NASA Astrophysics Data System (ADS)
Liu, Yushi; Poh, Hee Joo
2014-11-01
Computational Fluid Dynamics analysis has become increasingly important in modern urban planning in order to create highly livable cities. This paper presents a multi-scale modeling methodology which couples the Weather Research and Forecasting (WRF) Model with the open-source CFD simulation tool OpenFOAM. This coupling enables the simulation of wind flow and pollutant dispersion in urban built-up areas with a high-resolution mesh. In this methodology the meso-scale WRF model provides the boundary conditions for the micro-scale CFD model OpenFOAM. The advantage is that realistic weather conditions are taken into account in the CFD simulation, and the complexity of the building layout can be handled with ease by the meshing utility of OpenFOAM. The result is validated against the Joint Urban 2003 Tracer Field Tests in Oklahoma City, and there is reasonably good agreement between the CFD simulation and the field observations. The WRF-OpenFOAM coupling provides urban planners with a reliable environmental modeling tool for actual urban built-up areas, and it can be further extended to consider future weather conditions in scenario studies on climate change impact.
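In such one-way coupling, a practical step is mapping the coarse WRF wind profile onto the fine CFD inlet patch. The sketch below interpolates a hypothetical WRF column of wind speeds onto CFD inlet face heights; the profile values and heights are placeholders, and the actual WRF-OpenFOAM interface involves full velocity vectors and time-varying boundaries.

```python
import numpy as np

# Hypothetical WRF column at the CFD inlet: heights (m AGL) and wind speed (m/s).
wrf_z = np.array([10.0, 50.0, 100.0, 200.0, 400.0, 800.0])
wrf_u = np.array([2.1, 3.4, 4.2, 5.0, 5.9, 6.8])

# Heights of the OpenFOAM inlet face centres (much finer near the ground).
cfd_z = np.linspace(1.0, 300.0, 60)

# Linear interpolation in height provides the inlet velocity boundary values;
# heights below the lowest WRF level are held at that level's speed.
cfd_u = np.interp(cfd_z, wrf_z, wrf_u)
print(cfd_u[:5])
```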
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dombroski, M; Melius, C; Edmunds, T
2008-09-24
This study uses the Multi-scale Epidemiologic Simulation and Analysis (MESA) system developed for foreign animal diseases to assess consequences of nationwide human infectious disease outbreaks. A literature review identified the state of the art in both small-scale regional models and large-scale nationwide models and characterized key aspects of a nationwide epidemiological model. The MESA system offers computational advantages over existing epidemiological models and enables a broader array of stochastic analyses of model runs to be conducted because of those computational advantages. However, it has only been demonstrated on foreign animal diseases. This paper applied the MESA modeling methodology to human epidemiology. The methodology divided 2000 US Census data at the census tract level into school-bound children, work-bound workers, elderly, and stay at home individuals. The model simulated mixing among these groups by incorporating schools, workplaces, households, and long-distance travel via airports. A baseline scenario with fixed input parameters was run for a nationwide influenza outbreak using relatively simple social distancing countermeasures. Analysis from the baseline scenario showed one of three possible results: (1) the outbreak burned itself out before it had a chance to spread regionally, (2) the outbreak spread regionally and lasted a relatively long time, although constrained geography enabled it to eventually be contained without affecting a disproportionately large number of people, or (3) the outbreak spread through air travel and lasted a long time with unconstrained geography, becoming a nationwide pandemic. These results are consistent with empirical influenza outbreak data. The results showed that simply scaling up a regional small-scale model is unlikely to account for all the complex variables and their interactions involved in a nationwide outbreak. There are several limitations of the methodology that should be explored in future work including validating the model against reliable historical disease data, improving contact rates, spread methods, and disease parameters through discussions with epidemiological experts, and incorporating realistic behavioral assumptions.
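As a much-simplified illustration of the mixing-group structure described above (and emphatically not the MESA system itself), the sketch below runs a chain-binomial SIR model over four population groups with a group-to-group contact matrix. Population sizes, contact rates, and disease parameters are invented.

```python
# Toy chain-binomial SIR over four mixing groups (children, workers, elderly, home).
# All parameters are invented for illustration; this is not the MESA methodology.
import numpy as np

rng = np.random.default_rng(0)
groups = ["children", "workers", "elderly", "home"]
N = np.array([20000, 45000, 10000, 25000], dtype=float)
beta = np.array([[0.60, 0.20, 0.05, 0.20],   # daily transmission from column-group
                 [0.20, 0.40, 0.10, 0.20],   # infectives to row-group susceptibles
                 [0.05, 0.10, 0.20, 0.20],
                 [0.20, 0.20, 0.20, 0.30]])
gamma = 0.25                                  # recovery rate (1/4 days)

S, I, R = N.copy(), np.array([0., 5., 0., 0.]), np.zeros(4)
for day in range(120):
    force = beta @ (I / N)                              # per-susceptible hazard by group
    new_inf = rng.binomial(S.astype(int), 1 - np.exp(-force))
    new_rec = rng.binomial(I.astype(int), 1 - np.exp(-gamma))
    S, I, R = S - new_inf, I + new_inf - new_rec, R + new_rec
print("final attack rate by group:", dict(zip(groups, np.round(R / N, 2))))
```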
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hamada, Yuki; Grippo, Mark A.
2015-01-01
A monitoring plan that incorporates regional datasets and integrates cost-effective data collection methods is necessary to sustain the long-term environmental monitoring of utility-scale solar energy development in expansive, environmentally sensitive desert environments. Using very high spatial resolution (VHSR; 15 cm) multispectral imagery collected in November 2012 and January 2014, an image processing routine was developed to characterize ephemeral streams, vegetation, and land surface in the southwestern United States where increased utility-scale solar development is anticipated. In addition to knowledge about desert landscapes, the methodology integrates existing spectral indices and transformation (e.g., visible atmospherically resistant index and principal components); a newly developed index, the erosion resistance index (ERI); and digital terrain and surface models, all of which were derived from a common VHSR image. The methodology identified fine-scale ephemeral streams with greater detail than the National Hydrography Dataset and accurately estimated vegetation distribution and fractional cover of various surface types. The ERI classified surface types that have a range of erosive potentials. The remote-sensing methodology could ultimately reduce uncertainty and monitoring costs for all stakeholders by providing a cost-effective monitoring approach that accurately characterizes the land resources at potential development sites.
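One of the named ingredients, the visible atmospherically resistant index (VARI), can be computed per pixel from the visible bands alone. The sketch below assumes reflectance arrays and an illustrative vegetation threshold; it is not the study's full processing routine.

```python
# Hedged sketch: per-pixel VARI = (G - R) / (G + R - B) from visible-band reflectance.
# The input array and the vegetation threshold are stand-ins, not the study's data.
import numpy as np

def vari(red, green, blue, eps=1e-6):
    red, green, blue = (np.asarray(b, dtype=float) for b in (red, green, blue))
    return (green - red) / (green + red - blue + eps)

rgb = np.random.default_rng(1).uniform(0.05, 0.6, size=(3, 4, 4))  # fake reflectance
veg_mask = vari(*rgb) > 0.1      # illustrative threshold for vegetated pixels
print(veg_mask.sum(), "of", veg_mask.size, "pixels flagged as vegetation")
```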
ERIC Educational Resources Information Center
Wine, Jennifer S.; Whitmore, Roy W.; Heuer, Ruth E.; Biber, Melissa; Pratt, Daniel J.
This report describes the methods and procedures used for the full-scale data collection effort of the Beginning Postsecondary Students Longitudinal Study First Follow-Up 1996-98 (BPS:96/98). These students, who started their postsecondary education during the 1995-96 academic year, were first interviewed during 1996 as part of the National…
Development of a large-scale transportation optimization course.
DOT National Transportation Integrated Search
2011-11-01
"In this project, a course was developed to introduce transportation and logistics applications of large-scale optimization to graduate students. This report details what : similar courses exist in other universities, and the methodology used to gath...
Barriers to Research Utilization Scale: psychometric properties of the Turkish version.
Temel, Ayla Bayik; Uysal, Aynur; Ardahan, Melek; Ozkahraman, Sukran
2010-02-01
This paper is a report of a study designed to assess the psychometric properties of the Turkish version of the Barriers to Research Utilization Scale. The original Barriers to Research Utilization Scale was developed by Funk et al. in the United States of America. Many researchers in various countries have used this scale to identify barriers to research utilization. A methodological study was carried out at four hospitals. The sample consisted of 300 nurses. Data were collected in 2005 using a socio-demographic form (12 questions) and the Turkish version of the Barriers to Research Utilization Scale. A Likert-type scale composed of four sub-factors and 29 items was used. Means and standard deviations were calculated for interval level data. A P value of <0.05 was considered statistically significant. Language equivalence and content validity were assessed by eight experts. Confirmatory factor analysis revealed that the Turkish version was made up of four subscales. The internal consistency reliability coefficient was 0.92 for the total scale and ranged from 0.73 to 0.80 for the subscales. Item-total correlation coefficients ranged from 0.37 to 0.60. The Turkish version of the scale is similar in structure to the original English language scale.
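The internal-consistency statistics reported above (Cronbach's alpha and item-total correlations) can be reproduced with a few lines of code. The sketch below uses a simulated response matrix of the same shape (300 respondents x 29 items); the numbers it prints are therefore illustrative only.

```python
# Hedged sketch of Cronbach's alpha and corrected item-total correlations on a
# simulated respondents-by-items matrix (not the study's data).
import numpy as np

def cronbach_alpha(X):
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def corrected_item_total(X):
    X = np.asarray(X, dtype=float)
    return np.array([np.corrcoef(X[:, j], np.delete(X, j, axis=1).sum(axis=1))[0, 1]
                     for j in range(X.shape[1])])

X = np.random.default_rng(2).integers(1, 5, size=(300, 29))   # fake 4-point responses
r = corrected_item_total(X)
print("alpha =", round(cronbach_alpha(X), 2))
print("item-total r range:", np.round([r.min(), r.max()], 2))
```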
Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.
2014-05-01
The development of effective mitigation strategies requires scientifically consistent estimates of seismic ground motion; recent analysis, however, showed that the performance of the classical probabilistic approach to seismic hazard assessment (PSHA) is very unsatisfactory in anticipating ground shaking from future large earthquakes. Moreover, due to their basic heuristic limitations, the standard PSHA estimates are by far unsuitable when dealing with the protection of critical structures (e.g. nuclear power plants) and cultural heritage, where it is necessary to consider extremely long time intervals. Nonetheless, the persistence in resorting to PSHA is often explained by the need to deal with uncertainties related to ground shaking and earthquake recurrence. We show that current computational resources and physical knowledge of the seismic wave generation and propagation processes, along with the improving quantity and quality of geophysical data, nowadays allow for viable numerical and analytical alternatives to the use of PSHA. The advanced approach considered in this study, namely NDSHA (neo-deterministic seismic hazard assessment), is based on the physically sound definition of a wide set of credible scenario events and accounts for uncertainties and earthquake recurrence in a substantially different way. The expected ground shaking due to a wide set of potential earthquakes is defined by means of full waveform modelling, based on the possibility of efficiently computing synthetic seismograms in complex laterally heterogeneous anelastic media. In this way a set of ground motion scenarios can be defined, both at the national and at the local scale, the latter considering the 2D and 3D heterogeneities of the medium travelled by the seismic waves. The efficiency of the NDSHA computational codes allows for the fast generation of hazard maps at the regional scale even on a modern laptop computer. At the scenario scale, quick parametric studies can be easily performed to understand the influence of the model characteristics on the computed ground shaking scenarios. For massive parametric tests, or for the repeated generation of large-scale hazard maps, the methodology can take advantage of more advanced computational platforms, ranging from GRID computing infrastructures to HPC dedicated clusters and Cloud computing. In such a way, scientists can deal efficiently with the variety and complexity of the potential earthquake sources, and perform parametric studies to characterize the related uncertainties. NDSHA provides realistic time series of expected ground motion readily applicable for seismic engineering analysis and other mitigation actions. The methodology has been successfully applied to strategic buildings, lifelines and cultural heritage sites, and for the purpose of seismic microzoning in several urban areas worldwide. A web application is currently being developed that facilitates access to the NDSHA methodology and the related outputs by end-users, who are interested in reliable territorial planning and in the design and construction of buildings and infrastructures in seismic areas. At the same time, the web application is also shaping up as an advanced educational tool to explore interactively how seismic waves are generated at the source, propagate inside structural models, and build up ground shaking scenarios.
We illustrate the preliminary results obtained from a multiscale application of the NDSHA approach to the territory of India, zooming from large-scale hazard maps of ground shaking at bedrock to the definition of local-scale earthquake scenarios for selected sites in the Gujarat state (NW India). The study aims to provide the community (e.g. authorities and engineers) with advanced information for earthquake risk mitigation, which is particularly relevant to Gujarat in view of the rapid development and urbanization of the region.
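As a purely illustrative sketch of the scenario-based logic (not the NDSHA codes), once a peak ground-motion value has been computed for every scenario event at every grid cell, a hazard map can be assembled as the cell-wise envelope over scenarios. All values below are synthetic.

```python
# Illustrative sketch only: build a hazard map as the cell-wise maximum (envelope)
# of peak ground acceleration over a set of scenario earthquakes. Synthetic values.
import numpy as np

rng = np.random.default_rng(3)
n_scenarios, ny, nx = 40, 20, 30
pga_per_scenario = rng.lognormal(mean=-2.0, sigma=0.7, size=(n_scenarios, ny, nx))  # g

hazard_map = pga_per_scenario.max(axis=0)        # envelope over credible scenarios
controlling = pga_per_scenario.argmax(axis=0)    # which scenario controls each cell
print("max PGA on map: %.2f g; scenarios controlling at least one cell: %d"
      % (hazard_map.max(), np.unique(controlling).size))
```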
Large-scale mapping of hard-rock aquifer properties applied to Burkina Faso.
Courtois, Nathalie; Lachassagne, Patrick; Wyns, Robert; Blanchin, Raymonde; Bougaïré, Francis D; Somé, Sylvain; Tapsoba, Aïssata
2010-01-01
A country-scale (1:1,000,000) methodology has been developed for hydrogeologic mapping of hard-rock aquifers (granitic and metamorphic rocks) of the type that underlie a large part of the African continent. The method is based on quantifying the "useful thickness" and hydrodynamic properties of such aquifers and uses a recent conceptual model developed for this hydrogeologic context. This model links hydrodynamic parameters (transmissivity, storativity) to lithology and the geometry of the various layers constituting a weathering profile. The country-scale hydrogeological mapping was implemented in Burkina Faso, where a recent 1:1,000,000-scale digital geological map and a database of some 16,000 water wells were used to evaluate the methodology.
Multi-scale structural community organisation of the human genome.
Boulos, Rasha E; Tremblay, Nicolas; Arneodo, Alain; Borgnat, Pierre; Audit, Benjamin
2017-04-11
Structural interaction frequency matrices between all genome loci are now experimentally achievable thanks to high-throughput chromosome conformation capture technologies. This poses a new methodological challenge for computational biology, which consists in objectively extracting from these data the structural motifs characteristic of genome organisation. We deployed the fast multi-scale community mining algorithm based on spectral graph wavelets to characterise the networks of intra-chromosomal interactions in human cell lines. We observed that there exist structural domains of all sizes up to chromosome length and demonstrated that the set of structural communities forms a hierarchy of chromosome segments. Hence, at all scales, chromosome folding predominantly involves interactions between neighbouring sites rather than the formation of links between distant loci. Multi-scale structural decomposition of human chromosomes provides an original framework to question structural organisation and its relationship to functional regulation across the scales. By construction the proposed methodology is independent of the precise assembly of the reference genome and is thus directly applicable to genomes whose assembly is not fully determined.
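The paper's method rests on spectral-graph-wavelet community mining; the sketch below is a far simpler stand-in that only conveys the idea of a hierarchy of chromosome segments, by agglomeratively clustering loci on their interaction profiles and cutting the tree at several scales. The interaction matrix is synthetic.

```python
# Simplified stand-in (NOT the paper's spectral-graph-wavelet method): hierarchical
# clustering of loci by interaction profile, cut at several scales. Synthetic matrix.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(4)
n = 60
contacts = np.exp(-np.abs(np.subtract.outer(np.arange(n), np.arange(n))) / 8.0)
contacts += 0.05 * rng.random((n, n))
contacts = (contacts + contacts.T) / 2            # symmetric interaction-frequency matrix

Z = linkage(pdist(contacts, metric="correlation"), method="average")
for n_domains in (2, 5, 12):                       # coarse to fine structural scales
    labels = fcluster(Z, t=n_domains, criterion="maxclust")
    print(f"{n_domains:2d} domains -> sizes {np.bincount(labels)[1:]}")
```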
[Methodologies for Ascertaining Local Education Needs and for Allocating and Developing Resources.
ERIC Educational Resources Information Center
Bellott, Fred
A survey of 125 school systems in the United States was conducted to investigate methodologies used for developing needs assessment programs at a local level. Schools were asked to reply to a questionnaire which attempted to detail and identify how needs assessment programs are set up, what methodologies are employed, the number of resultant…
Mittra, J; Tait, J; Mastroeni, M; Turner, M L; Mountford, J C; Bruce, K
2015-01-25
The creation of red blood cells for the blood transfusion markets represents a highly innovative application of regenerative medicine with a medium term (5-10 year) prospect for first clinical studies. This article describes a case study analysis of a project to derive red blood cells from human embryonic stem cells, including the systemic challenges arising from (i) the selection of appropriate and viable regulatory protocols and (ii) technological constraints related to stem cell manufacture and scale up to clinical Good Manufacturing Practice (GMP) standard. The method used for case study analysis (Analysis of Life Science Innovation Systems (ALSIS)) is also innovative, demonstrating a new approach to social and natural science collaboration to foresight product development pathways. Issues arising along the development pathway include cell manufacture and scale-up challenges, affected by regulatory demands emerging from the innovation ecosystem (preclinical testing and clinical trials). Our discussion reflects on the efforts being made by regulators to adapt the current pharmaceuticals-based regulatory model to an allogeneic regenerative medicine product and the broader lessons from this case study for successful innovation and translation of regenerative medicine therapies, including the role of methodological and regulatory innovation in future development in the field. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.
Carbon nanotubes (CNTs) based advanced dermal therapeutics: current trends and future potential.
Kuche, Kaushik; Maheshwari, Rahul; Tambe, Vishakha; Mak, Kit-Kay; Jogi, Hardi; Raval, Nidhi; Pichika, Mallikarjuna Rao; Kumar Tekade, Rakesh
2018-05-17
The search for effective and non-invasive delivery modules to transport therapeutic molecules across skin has led to the discovery of a number of nanocarriers (viz.: liposomes, ethosomes, dendrimers, etc.) in the last few decades. However, the available literature suggests that these delivery modules face several issues including poor stability, low encapsulation efficiency, and scale-up hurdles. Recently, carbon nanotubes (CNTs) emerged as a versatile tool to deliver therapeutics across skin. Superior stability, high loading capacity, well-developed synthesis protocols as well as ease of scale-up are some of the reasons for growing interest in CNTs. CNTs have a unique physical architecture and a large surface area with unique surface chemistry that can be tailored for a wide range of biomedical applications. CNTs have thus been largely engaged in the development of transdermal systems such as tuneable hydrogels, programmable nonporous membranes, electroresponsive skin modalities, protein channel mimetic platforms, reverse iontophoresis, microneedles, and dermal buckypapers. In addition, CNTs were also employed in the development of RNA interference (RNAi) based therapeutics for correcting defective dermal genes. This review expounds the state-of-the-art synthesis methodologies, skin penetration mechanism, drug liberation profile, loading potential, characterization techniques, and transdermal applications along with a summary on the patent/regulatory status and future scope of CNT-based skin therapeutics.
Oishi, Sana; Kimura, Shin-Ichiro; Noguchi, Shuji; Kondo, Mio; Kondo, Yosuke; Shimokawa, Yoshiyuki; Iwao, Yasunori; Itai, Shigeru
2018-01-15
A new scale-down methodology from commercial rotary die scale to laboratory scale was developed to optimize a plant-derived soft gel capsule formulation and eventually manufacture superior soft gel capsules on a commercial scale, in order to reduce the time and cost for formulation development. Animal-derived and plant-derived soft gel film sheets were prepared using an applicator on a laboratory scale and their physicochemical properties, such as tensile strength, Young's modulus, and adhesive strength, were evaluated. The tensile strength of the animal-derived and plant-derived soft gel film sheets was 11.7 MPa and 4.41 MPa, respectively. The Young's modulus of the animal-derived and plant-derived soft gel film sheets was 169 MPa and 17.8 MPa, respectively, and both sheets showed a similar adhesion strength of approximately 4.5-10 MPa. Using a D-optimal mixture design, plant-derived soft gel film sheets were prepared and optimized by varying their composition, including variations in the mass of κ-carrageenan, ι-carrageenan, oxidized starch and heat-treated starch. The physicochemical properties of the sheets were evaluated to determine the optimal formulation. Finally, plant-derived soft gel capsules were manufactured using the rotary die method and the prepared soft gel capsules showed equivalent or superior physical properties compared with pre-existing soft gel capsules. Therefore, we successfully developed a new scale-down methodology to optimize the formulation of plant-derived soft gel capsules on a commercial scale. Copyright © 2017 Elsevier B.V. All rights reserved.
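A hedged sketch of one step in such a mixture-design workflow: fitting a first-order Scheffé mixture model (no intercept) that relates sheet composition to a measured response such as tensile strength, and using it to predict a new blend. Compositions and responses below are invented, not the study's data.

```python
# Hedged sketch: first-order Scheffé mixture model fit by least squares.
# Compositions sum to 1; coefficients are blending values. Data are hypothetical.
import numpy as np

# columns: kappa-carrageenan, iota-carrageenan, oxidized starch, heat-treated starch
X = np.array([[0.40, 0.20, 0.20, 0.20],
              [0.20, 0.40, 0.20, 0.20],
              [0.20, 0.20, 0.40, 0.20],
              [0.20, 0.20, 0.20, 0.40],
              [0.25, 0.25, 0.25, 0.25]])
y = np.array([4.8, 4.1, 3.5, 3.9, 4.3])           # tensile strength, MPa (hypothetical)

coef, *_ = np.linalg.lstsq(X, y, rcond=None)       # Scheffé linear blending coefficients
print("blending coefficients:", np.round(coef, 2))
print("predicted strength of 30/30/20/20 blend:",
      round(float(coef @ np.array([0.3, 0.3, 0.2, 0.2])), 2), "MPa")
```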
Continuous flow nitration in miniaturized devices
2014-01-01
This review highlights the state of the art in the field of continuous flow nitration with miniaturized devices. Although nitration has been one of the oldest and most important unit reactions, the advent of miniaturized devices has paved the way for new opportunities to reconsider the conventional approach for exothermic and selectivity-sensitive nitration reactions. Four different approaches to flow nitration with microreactors are presented herein and discussed in view of their advantages, limitations and the applicability of the information towards scale-up. Selected recent patents that disclose scale-up methodologies for continuous flow nitration are also briefly reviewed. PMID:24605161
Prescription-event monitoring: methodology and recent progress.
Rawson, N S; Pearce, G L; Inman, W H
1990-01-01
Event monitoring was first suggested 25 years ago as a way of detecting adverse reactions to drugs. Prescription-event monitoring (PEM), which has been developed by the Drug Safety Research Unit, is the first large-scale systematic post-marketing surveillance method to use event monitoring in the U.K. PEM identifies patients who have been prescribed a particular drug, and their doctors, from photocopies of National Health Service prescriptions which are processed centrally in England. A personalized follow-up questionnaire ("green form") is mailed to each patient's general practitioner, usually on the first anniversary of the initial prescription, asking for information about the patient, especially any "events" that he or she may have experienced since beginning treatment with the drug. The methodology of PEM is presented, together with examples of analyses that can be performed using results from recent studies. The problems and benefits of PEM are discussed.
Aschner, Pablo M; Muñoz, Oscar Mauricio; Girón, Diana; García, Olga Milena; Fernández-Ávila, Daniel Gerardo; Casas, Luz Ángela; Bohórquez, Luisa Fernanda; Arango T, Clara María; Carvajal, Liliana; Ramírez, Doris Amanda; Sarmiento, Juan Guillermo; Colon, Cristian Alejandro; Correa G, Néstor Fabián; Alarcón R, Pilar; Bustamante S, Álvaro Andrés
2016-06-30
In Colombia, diabetes mellitus is a public health problem that requires those responsible to create and implement strategies for prevention, diagnosis, treatment, and follow-up that are applicable at all care levels, with the objective of establishing early and sustained control of diabetes. A clinical practice guide has been developed following the broad outline of the methodological guide from the Ministry of Health and Social Welfare, with the aim of systematically gathering scientific evidence and formulating recommendations using the GRADE (Grading of Recommendations Assessment, Development and Evaluation) methodology. The current document presents in summary form the results of this process, including the recommendations and the considerations taken into account in formulating them. In general terms, what is proposed here is a screening process using the Finnish Diabetes Risk Score questionnaire adapted to the Colombian population, which enables early diagnosis of the illness, and an algorithm for determining initial treatment that can be generalized to most patients with diabetes mellitus type 2 and that is simple to apply in a primary care context. In addition, several recommendations have been made to scale up pharmacological treatment in those patients who do not achieve the objectives or fail to maintain them during initial treatment. These recommendations also take into account the evolution of weight and the individualization of glycemic control goals for special populations. Finally, recommendations have been made for the opportune detection of micro- and macrovascular complications of diabetes.
VALFAST: Secure Probabilistic Validation of Hundreds of Kepler Planet Candidates
NASA Astrophysics Data System (ADS)
Morton, Tim; Petigura, E.; Johnson, J. A.; Howard, A.; Marcy, G. W.; Baranec, C.; Law, N. M.; Riddle, R. L.; Ciardi, D. R.; Robo-AO Team
2014-01-01
The scope, scale, and tremendous success of the Kepler mission have necessitated the rapid development of probabilistic validation as a new conceptual framework for analyzing transiting planet candidate signals. While several planet validation methods have been independently developed and presented in the literature, none has yet come close to addressing the entire Kepler survey. I present the results of applying VALFAST---a planet validation code based on the methodology described in Morton (2012)---to every Kepler Object of Interest. VALFAST is unique in its combination of detail, completeness, and speed. Using the transit light curve shape, realistic population simulations, and (optionally) diverse follow-up observations, it calculates the probability that a transit candidate signal is the result of a true transiting planet or any of a number of astrophysical false positive scenarios, all in just a few minutes on a laptop computer. In addition to efficiently validating the planetary nature of hundreds of new KOIs, this broad application of VALFAST also demonstrates its ability to reliably identify likely false positives. This extensive validation effort is also the first to incorporate data from all of the largest Kepler follow-up observing efforts: the CKS survey of ~1000 KOIs with Keck/HIRES, the Robo-AO survey of >1700 KOIs, and high-resolution images obtained through the Kepler Follow-up Observing Program. In addition to enabling the core science that the Kepler mission was designed for, this methodology will be critical to obtaining statistical results from future surveys such as TESS and PLATO.
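In the spirit of the Morton (2012) framework cited above (this is not the VALFAST code), the false positive probability of a candidate can be written as the false-positive share of prior-weighted scenario likelihoods. The priors and likelihoods below are placeholders.

```python
# Hedged sketch of probabilistic validation arithmetic: each scenario contributes
# prior x likelihood of the observed transit shape; FPP is the false-positive share.
scenarios = {                    # (prior probability, likelihood of observed light curve)
    "planet":            (2e-1, 0.80),
    "eclipsing binary":  (1e-3, 0.05),
    "hierarchical EB":   (5e-4, 0.10),
    "background EB":     (2e-4, 0.30),
}
weights = {name: prior * like for name, (prior, like) in scenarios.items()}
total = sum(weights.values())
fpp = 1.0 - weights["planet"] / total
print(f"FPP = {fpp:.4f}  (a candidate might be called validated below some threshold, e.g. 0.01)")
```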
Reference Model 6 (RM6): Oscillating Wave Energy Converter.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bull, Diana L; Smith, Chris; Jenne, Dale Scott
This report is an addendum to SAND2013-9040: Methodology for Design and Economic Analysis of Marine Energy Conversion (MEC) Technologies. This report describes an Oscillating Water Column Wave Energy Converter reference model design in a complementary manner to Reference Models 1-4 contained in the above report. In this report, a conceptual design for an Oscillating Water Column Wave Energy Converter (WEC) device appropriate for the modeled reference resource site was identified, and a detailed backward bent duct buoy (BBDB) device design was developed using a combination of numerical modeling tools and scaled physical models. Our team used the methodology in SAND2013-9040 for the economic analysis that included costs for designing, manufacturing, deploying, and operating commercial-scale MEC arrays, up to 100 devices. The methodology was applied to identify key cost drivers and to estimate levelized cost of energy (LCOE) for this RM6 Oscillating Water Column device in dollars per kilowatt-hour ($/kWh). Although many costs were difficult to estimate at this time due to the lack of operational experience, the main contribution of this work was to disseminate a detailed set of methodologies and models that allow for an initial cost analysis of this emerging technology. This project is sponsored by the U.S. Department of Energy's (DOE) Wind and Water Power Technologies Program Office (WWPTO), within the Office of Energy Efficiency & Renewable Energy (EERE). Sandia National Laboratories, the lead in this effort, collaborated with partners from National Laboratories, industry, and universities to design and test this reference model.
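A minimal sketch of the fixed-charge-rate LCOE arithmetic behind a $/kWh figure for an MEC array. All inputs (device cost, O&M, energy production, fixed charge rate) are placeholders rather than RM6 results.

```python
# Hedged sketch of a simple fixed-charge-rate LCOE calculation. Inputs are placeholders.
def lcoe(capex, opex_per_year, fcr, annual_energy_kwh):
    """capex, opex in $; fcr = fixed charge rate (1/yr); energy in kWh/yr."""
    return (capex * fcr + opex_per_year) / annual_energy_kwh

array_capex = 100 * 4.0e6     # 100 devices at a hypothetical $4M installed cost
array_opex = 100 * 0.2e6      # hypothetical annual O&M for the array
aep = 100 * 1.5e6             # hypothetical kWh per device-year
print(f"LCOE ~ {lcoe(array_capex, array_opex, fcr=0.108, annual_energy_kwh=aep):.2f} $/kWh")
```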
Grall-Bronnec, M; Sauvaget, A
2014-11-01
Repetitive transcranial magnetic stimulation (rTMS) is a potential therapeutic intervention for the treatment of addiction. This critical review aims to summarise the recent developments with respect to the efficacy of rTMS for all types of addiction and related disorders (including eating disorders), and concentrates on the associated methodological and technical issues. The bibliographic search consisted of a computerised screening of the Medline and ScienceDirect databases up to December 2013. Criteria for inclusion were that the target problem was an addiction, a related disorder, or craving; that the intervention was performed using rTMS; and that the study was a clinical trial. Of the potential 638 articles, 18 met the criteria for inclusion. Most of these (11 of the 18) supported the efficacy of rTMS, especially in the short term. In most cases, the main assessment criterion was the measurement of craving using a Visual Analogue Scale. The results are discussed with respect to the study limitations and, in particular, the many methodological and technical discrepancies that were identified. Key recommendations are provided.
HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases
NASA Technical Reports Server (NTRS)
Freeman, Michael S.
1987-01-01
The primary research objectives of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) are to develop a methodology for constructing and maintaining large scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.
Elliston, Adam; Wood, Ian P; Soucouri, Marie J; Tantale, Rachelle J; Dicks, Jo; Roberts, Ian N; Waldron, Keith W
2015-01-01
High-throughput (HTP) screening is becoming an increasingly useful tool for collating biological data which would otherwise require the employment of excessive resources. Second generation biofuel production is one such process. HTP screening allows the investigation of large sample sets to be undertaken with increased speed and cost effectiveness. This paper outlines a methodology that will enable solid lignocellulosic substrates to be hydrolyzed and fermented at a 96-well plate scale, facilitating HTP screening of ethanol production, whilst maintaining repeatability similar to that achieved at a larger scale. The results showed that utilizing sheets of biomass of consistent density (handbills) for paper, and slurries of pretreated biomass that could be pipetted, allowed standardized and accurate transfers to 96-well plates to be achieved (±3.1 and 1.7%, respectively). Processing these substrates by simultaneous saccharification and fermentation (SSF) at various volumes showed no significant difference in final ethanol yields, whether at the standard shake flask (200 mL), universal bottle (10 mL) or 96-well plate (1 mL) scale. Substrate concentrations of up to 10% (w/v) were trialed successfully for SSFs at 1 mL volume. The methodology was successfully tested by showing the effects of steam explosion pretreatment on both oilseed rape and wheat straws. This methodology could be used to replace large shake flask reactions with comparatively fast 96-well plate SSF assays, allowing for HTP experimentation. Additionally this method is compatible with a number of standardized assay techniques such as simple colorimetric assays, high-performance liquid chromatography (HPLC) and nuclear magnetic resonance (NMR) spectroscopy. Furthermore this research has practical uses in the biorefining of biomass substrates for second generation biofuels and novel biobased chemicals by allowing HTP SSF screening, which should allow selected samples to be scaled up or studied in more detail.
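A hedged sketch of the yield arithmetic that typically sits behind such SSF screens: percent of theoretical ethanol yield from the glucan loaded into a well, using the standard 1.111 (glucan to glucose) and 0.511 (glucose to ethanol) stoichiometric factors. The well volume, substrate loading, glucan fraction and measured ethanol are invented.

```python
# Hedged sketch of percent-of-theoretical ethanol yield for an SSF well.
# 1.111 converts glucan to glucose mass; 0.511 converts glucose to ethanol mass.
def percent_theoretical_yield(ethanol_g_per_l, volume_ml, substrate_g, glucan_fraction):
    ethanol_g = ethanol_g_per_l * volume_ml / 1000.0
    theoretical_g = substrate_g * glucan_fraction * 1.111 * 0.511
    return 100.0 * ethanol_g / theoretical_g

# 1 mL well, 10% (w/v) substrate = 0.1 g, assumed 40% glucan, 18 g/L ethanol measured
print(round(percent_theoretical_yield(18.0, 1.0, 0.1, 0.40), 1), "% of theoretical")
```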
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mata, Pedro; Fuente, Rafael de la; Iglesias, Javier
Iberdrola (Spanish utility) and Iberdrola Ingenieria (engineering branch) have been developing during the last two years the 110% Extended Power Up-rate Project (EPU 110%) for Cofrentes BWR-6. Iberdrola has an in-house design and licensing reload methodology available that has been approved by the Spanish Nuclear Regulatory Authority. This methodology has already been used to perform the nuclear design and the reload licensing analysis for Cofrentes cycles 12 to 14. The methodology has also been applied to develop a significant number of safety analyses of the Cofrentes Extended Power Up-rate, including: Reactor Heat Balance, Core and Fuel Performance, Thermal Hydraulic Stability, ECCS LOCA Evaluation, Transient Analysis, Anticipated Transient Without Scram (ATWS) and Station Blackout (SBO). Since the scope of the licensing process of the Cofrentes Extended Power Up-rate exceeds the range of analysis included in the Cofrentes generic reload licensing process, it has been necessary to extend the applicability of the Cofrentes licensing methodology to the analysis of new transients. This is the case of the TLFW (Total Loss of Feedwater) transient. The content of this paper shows the benefits of having an in-house design and licensing methodology, and describes the process to extend the applicability of the methodology to the analysis of new transients. The analysis of Total Loss of Feedwater with the Cofrentes RETRAN model is included as an example of this process. (authors)
Goodman, Angela; Hakala, J. Alexandra; Bromhal, Grant; Deel, Dawn; Rodosta, Traci; Frailey, Scott; Small, Michael; Allen, Doug; Romanov, Vyacheslav; Fazio, Jim; Huerta, Nicolas; McIntyre, Dustin; Kutchko, Barbara; Guthrie, George
2011-01-01
A detailed description of the United States Department of Energy (US-DOE) methodology for estimating CO2 storage potential for oil and gas reservoirs, saline formations, and unmineable coal seams is provided. The oil and gas reservoirs are assessed at the field level, while saline formations and unmineable coal seams are assessed at the basin level. The US-DOE methodology is intended for external users such as the Regional Carbon Sequestration Partnerships (RCSPs), future project developers, and governmental entities to produce high-level CO2 resource assessments of potential CO2 storage reservoirs in the United States and Canada at the regional and national scale; however, this methodology is general enough that it could be applied globally. The purpose of the US-DOE CO2 storage methodology, definitions of storage terms, and a CO2 storage classification are provided. The methodology for calculating CO2 storage resource estimates is outlined. The Log Odds Method, when applied with Monte Carlo sampling, is presented in detail for estimation of the CO2 storage efficiency needed for CO2 storage resource estimates at the regional and national scale. The CO2 storage potential estimates reported in the US-DOE's assessment are intended to be distributed online by a geographic information system in NatCarb and made available as hard copy in the Carbon Sequestration Atlas of the United States and Canada. US-DOE's methodology will be continuously refined, incorporating results of the Development Phase projects conducted by the RCSPs from 2008 to 2018. Estimates will be formally updated every two years in subsequent versions of the Carbon Sequestration Atlas of the United States and Canada.
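A simplified sketch of the volumetric saline-formation estimate with Monte Carlo sampling of the uncertain factors (the methodology's Log Odds treatment of the efficiency factor is more elaborate than this): G_CO2 = A x h x porosity x density x E. All inputs are illustrative, not Atlas values.

```python
# Hedged sketch: volumetric CO2 storage resource with Monte Carlo sampling of
# porosity and storage efficiency. G_CO2 = A * h * phi * rho * E. Inputs are invented.
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
A = 5.0e9                                             # basin area, m^2
h = 50.0                                              # average net thickness, m
rho = 700.0                                           # CO2 density at depth, kg/m^3
phi = rng.triangular(0.08, 0.15, 0.25, n)             # porosity (fraction)
E = rng.lognormal(np.log(0.02), 0.5, n)               # storage efficiency (fraction)

G_gt = A * h * phi * rho * E / 1e12                   # kg -> Gt CO2
p10, p50, p90 = np.percentile(G_gt, [10, 50, 90])
print(f"storage resource (Gt CO2): P10={p10:.2f}  P50={p50:.2f}  P90={p90:.2f}")
```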
NASA Technical Reports Server (NTRS)
Chapman, C. P.; Chapman, P. D.; Lewison, A. H.
1982-01-01
A low-power photovoltaic system was constructed with approximately 500 amp-hours of battery energy storage to provide power to an emergency amateur radio communications center. The system can power the communications center for about 72 hours of continuous no-sun operation. Complete construction details and a design methodology algorithm are given, with abundant engineering data and adequate theory to allow similar systems to be constructed, scaled up or down, with minimum design effort.
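A back-of-envelope sizing consistent with the figures quoted (roughly 500 amp-hours for about 72 hours of no-sun operation). The load power, bus voltage and depth of discharge below are assumptions for illustration, not values from the report.

```python
# Hedged sizing sketch: amp-hours of storage needed for a given load and autonomy.
# Load power, bus voltage and depth of discharge are assumed, not from the report.
def battery_ah(load_w, autonomy_h, bus_v, depth_of_discharge=0.8):
    """Amp-hours needed to carry load_w watts for autonomy_h hours at bus_v volts."""
    return load_w * autonomy_h / (bus_v * depth_of_discharge)

print(round(battery_ah(load_w=65, autonomy_h=72, bus_v=12)), "Ah")   # ~ 488 Ah
```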
Data Quality Assurance for Supersonic Jet Noise Measurements
NASA Technical Reports Server (NTRS)
Brown, Clifford A.; Henderson, Brenda S.; Bridges, James E.
2010-01-01
The noise created by a supersonic aircraft is a primary concern in the design of future high-speed planes. The jet noise reduction technologies required on these aircraft will be developed using scale models mounted to experimental jet rigs designed to simulate the exhaust gases from a full-scale jet engine. The jet noise data collected in these experiments must accurately predict the noise levels produced by the full-scale hardware in order to be a useful development tool. A methodology has been adopted at the NASA Glenn Research Center's Aero-Acoustic Propulsion Laboratory to ensure the quality of the supersonic jet noise data acquired from the facility's High Flow Jet Exit Rig so that it can be used to develop future nozzle technologies that reduce supersonic jet noise. The methodology relies on mitigating extraneous noise sources, examining the impact of measurement location on the acoustic results, and investigating the facility independence of the measurements. The methodology is documented here as a basis for validating future improvements, and its limitations are noted so that they do not affect the data analysis. Maintaining a high quality jet noise laboratory is an ongoing process. By carefully examining the data produced and continually following this methodology, data quality can be maintained and improved over time.
Apparatus and methodology for fire gas characterization by means of animal exposure
NASA Technical Reports Server (NTRS)
Marcussen, W. H.; Hilado, C. J.; Furst, A.; Leon, H. A.; Kourtides, D. A.; Parker, J. A.; Butte, J. C.; Cummins, J. M.
1976-01-01
While there is a great deal of information available from small-scale laboratory experiments and for relatively simple mixtures of gases, considerable uncertainty exists regarding appropriate bioassay techniques for the complex mixture of gases generated in full-scale fires. Apparatus and methodology have been developed based on current state of the art for determining the effects of fire gases in the critical first 10 minutes of a full-scale fire on laboratory animals. This information is presented for its potential value and use while further improvements are being made.
Adaptive Multi-scale Prognostics and Health Management for Smart Manufacturing Systems
Choo, Benjamin Y.; Adams, Stephen C.; Weiss, Brian A.; Marvel, Jeremy A.; Beling, Peter A.
2017-01-01
The Adaptive Multi-scale Prognostics and Health Management (AM-PHM) is a methodology designed to enable PHM in smart manufacturing systems. In practice, PHM information is not yet fully utilized in higher-level decision-making in manufacturing systems. AM-PHM leverages and integrates lower-level PHM information, such as from a machine or component, with hierarchical relationships across the component, machine, work cell, and assembly line levels in a manufacturing system. The AM-PHM methodology enables the creation of actionable prognostic and diagnostic intelligence up and down the manufacturing process hierarchy. Decisions are then made with knowledge of the current and projected health state of the system at decision points along the nodes of the hierarchical structure. To overcome the exponential explosion of complexity associated with describing a large manufacturing system, the AM-PHM methodology takes a hierarchical Markov Decision Process (MDP) approach to describing the system and solving for an optimized policy. A description of the AM-PHM methodology is followed by a simulated industry-inspired example to demonstrate the effectiveness of AM-PHM. PMID:28736651
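A toy sketch of the machine-level decision layer implied above: a small MDP over discretized health states with run/maintain actions, solved by value iteration. Transition probabilities, rewards and the discount factor are invented; AM-PHM composes such models hierarchically across component, machine, cell and line levels.

```python
# Toy MDP over health states ("healthy"/"degraded"/"failed") with run vs maintain,
# solved by value iteration. All numbers are invented; this is not the AM-PHM code.
import numpy as np

states = ["healthy", "degraded", "failed"]
P = {  # P[action][s, s'] transition probabilities
    "run":      np.array([[0.90, 0.09, 0.01],
                          [0.00, 0.80, 0.20],
                          [0.00, 0.00, 1.00]]),
    "maintain": np.array([[1.00, 0.00, 0.00],
                          [0.95, 0.05, 0.00],
                          [0.90, 0.10, 0.00]]),
}
R = {"run":      np.array([10.0, 6.0, -50.0]),    # reward of taking action in state
     "maintain": np.array([-5.0, -5.0, -20.0])}

V, gamma = np.zeros(3), 0.95
for _ in range(500):                               # value iteration to convergence
    V = np.max([R[a] + gamma * P[a] @ V for a in P], axis=0)
policy = [max(P, key=lambda a: R[a][s] + gamma * P[a][s] @ V) for s in range(3)]
print(dict(zip(states, policy)))
```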
Scaling an in situ network for high resolution modeling during SMAPVEX15
USDA-ARS?s Scientific Manuscript database
Among the greatest challenges within the field of soil moisture estimation is that of scaling sparse point measurements within a network to produce higher resolution map products. Large-scale field experiments present an ideal opportunity to develop methodologies for this scaling, by coupling in si...
NASA Technical Reports Server (NTRS)
Wang, Jianzhong Jay; Datta, Koushik; Landis, Michael R. (Technical Monitor)
2002-01-01
This paper describes the development of a life-cycle cost (LCC) estimating methodology for air traffic control Decision Support Tools (DSTs) under development by the National Aeronautics and Space Administration (NASA), using a combination of parametric, analogy, and expert opinion methods. There is no one standard methodology and technique that is used by NASA or by the Federal Aviation Administration (FAA) for LCC estimation of prospective Decision Support Tools. Some of the frequently used methodologies include bottom-up, analogy, top-down, parametric, expert judgement, and Parkinson's Law. The developed LCC estimating methodology can be visualized as a three-dimensional matrix where the three axes represent coverage, estimation, and timing. This paper focuses on the three characteristics of this methodology that correspond to the three axes.
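A hedged sketch of the parametric ingredient of such an estimate: fitting a power-law cost estimating relationship (cost = a x size^b) to analogous past efforts and applying it to a new tool. The historical data points are fabricated for illustration.

```python
# Hedged sketch of a parametric cost estimating relationship (CER): cost = a * size^b,
# fit in log space to analogous past tools. Historical points are fabricated.
import numpy as np

size_ksloc = np.array([20., 45., 80., 150.])       # size of analogous tools (KSLOC)
cost_m = np.array([3.1, 6.0, 9.8, 16.5])           # their development cost ($M)

b, log_a = np.polyfit(np.log(size_ksloc), np.log(cost_m), 1)
a = np.exp(log_a)
new_size = 100.0
print(f"CER: cost = {a:.2f} * size^{b:.2f}  ->  {a * new_size**b:.1f} $M for {new_size:.0f} KSLOC")
```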
Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul MA; Scharf, Thomas; ÓLaighin, Gearóid
2017-01-01
Background: Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. Objective: We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. Methods: We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. Results: We report a successful implementation of the methodology for the design and development of a system for detecting and predicting falls in older adults. We describe in detail what testing and evaluation activities we carried out to effectively test the system and overcome usability and human factors problems. Conclusions: We feel this methodology can be applied to a wide variety of connected health devices and systems. We consider this a methodology that can be scaled to different-sized projects accordingly. PMID:28302594
DOE Office of Scientific and Technical Information (OSTI.GOV)
Castillo, H.
1982-01-01
The Government of Costa Rica has stated the need for a formal procedure for the evaluation and categorization of an environmental program. Methodological studies were prepared as the basis for the development of the general methodology by which each government or institution can adapt and implement the procedure. The methodology was established by using different techniques according to their contribution to the evaluation process, such as the Systemic Approach, Delphi, and Saaty methods. The methodology consists of two main parts: 1) evaluation of the environmental aspects by using different techniques; 2) categorization of the environmental aspects by applying the methodology to Costa Rican environmental affairs, using questionnaire answers supplied by experts both inside and outside of the country. The second part of the research includes appendixes presenting general information concerning institutions related to environmental affairs; descriptions of the methods used; results of the current status evaluation and its scale; the final scale of categorization; and the questionnaires and a list of experts. The methodology developed in this research will have a beneficial impact on environmental concerns in Costa Rica. As a result of this research, a Commission Office of Environmental Affairs, providing links between consumers, engineers, scientists, and the Government, is recommended. There is also significant potential use of this methodology in developed countries for better balancing the budgets of major research programs such as cancer, heart, and other research areas.
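A minimal sketch of the Saaty (AHP) step named above: deriving priority weights for environmental aspects from a pairwise-comparison matrix via its principal eigenvector, with a consistency check. The comparison values are illustrative only.

```python
# Hedged AHP sketch: priority weights from a pairwise-comparison matrix (Saaty 1-9
# scale) via the principal eigenvector, plus the consistency ratio. Values invented.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],        # aspect 1 compared with aspects 1, 2, 3
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                           # priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)   # consistency index
cr = ci / 0.58                         # random index RI = 0.58 for n = 3
print("weights:", np.round(w, 3), " CR =", round(cr, 3), "(acceptable if < 0.1)")
```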
Mapping and monitoring High Nature Value farmlands: challenges in European landscapes.
Lomba, Angela; Guerra, Carlos; Alonso, Joaquim; Honrado, João Pradinho; Jongman, Rob; McCracken, David
2014-10-01
The importance of low intensity farming for the conservation of biodiversity throughout Europe was acknowledged early in the 1990s when the concept of 'High Nature Value farmlands' (HNVf) was devised. HNVf has subsequently been given high priority within the EU Rural Development Programme. This puts a requirement on each EU Member State not only to identify the extent and condition of HNVf within their borders but also to track trends in HNVf over time. However, the diversity of rural landscapes across the EU, the scarcity of (adequate) datasets on biodiversity, land cover and land use, and the lack of a common methodology for HNVf mapping currently represent obstacles to the implementation of the HNVf concept across Europe. This manuscript provides an overview of the characteristics of HNVf across Europe together with a description of the development of the HNVf concept. Current methodological approaches for the identification and mapping of HNVf across EU-27 and Switzerland are then reviewed, the main limitations of these approaches highlighted and recommendations made as to how the identification, mapping and reporting of HNVf state and trends across Europe can potentially be improved and harmonised. In particular, we propose a new framework that is built on the need for strategic HNVf monitoring based on a hierarchical, bottom-up structure of assessment units, coincident with the EU levels of political decision and devised indicators, and which is linked strongly to a collaborative European network that can provide the integration and exchange of data from different sources and scales under common standards. Such an approach is essential if the scale of the issues facing HNVf landscapes is to be identified and monitored properly at the European level. This would then allow relevant agri-environmental measures to be developed, implemented and evaluated at the scale(s) required to maintain the habitats and species of high nature conservation value that are intimately associated with those landscapes. Copyright © 2014 Elsevier Ltd. All rights reserved.
USDA-ARS?s Scientific Manuscript database
We developed a cost-based methodology to assess the value of forested watersheds to improve water quality in public water supplies. The developed methodology is applicable to other source watersheds to determine ecosystem services for water quality. We assess the value of forest land for source wate...
Dodeen, Hamzeh; Al-Darmaki, Fatima
2016-12-01
The aim of this study was to determine the feasibility of generating a shorter version of the Emirati Marital Satisfaction Scale (EMSS) using item response theory (IRT)-based methodology. The EMSS is the first national scale used to provide an understanding of family function and the level of marital satisfaction within the cultural context of the United Arab Emirates. A sample of 1,049 Emirati married individuals of different ages, genders, places of residence, and monthly incomes participated in this study. The IRT calibration was performed using X-Calibre 4.2 and the graded response model. The analysis resulted in a short form of the EMSS (7 items), which constitutes a promising alternative to the original scale for practitioners and researchers. This short version is reliable, valid, and gives results very similar to the original scale. The results of this study confirmed the usefulness of IRT-based methodology for developing psychological and counseling scales. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
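A hedged sketch of the graded-response-model machinery behind IRT-based short forms: the item information function across the latent trait for an item with assumed discrimination and thresholds (the actual EMSS item parameters are not reproduced here). Items that carry high information over the trait range are natural candidates for a short version.

```python
# Hedged sketch: item information for a graded response model item with assumed
# discrimination a and ordered thresholds b (not the EMSS item parameters).
import numpy as np

def grm_item_information(theta, a, b):
    """a: discrimination; b: ordered category thresholds."""
    theta = np.asarray(theta, dtype=float)
    # cumulative probabilities P(X >= k), with boundary curves 1 and 0
    P = np.vstack([np.ones_like(theta)]
                  + [1 / (1 + np.exp(-a * (theta - bk))) for bk in b]
                  + [np.zeros_like(theta)])
    info = np.zeros_like(theta)
    for k in range(len(b) + 1):                     # category probability P*_k - P*_{k+1}
        pk = P[k] - P[k + 1]
        dpk = a * (P[k] * (1 - P[k]) - P[k + 1] * (1 - P[k + 1]))
        info += dpk**2 / np.maximum(pk, 1e-12)      # Fisher information contribution
    return info

theta = np.linspace(-3, 3, 7)
print(np.round(grm_item_information(theta, a=1.8, b=[-1.0, 0.0, 1.2]), 2))
```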
USDA-ARS?s Scientific Manuscript database
There is a need to develop scale explicit understanding of erosion to overcome existing conceptual and methodological flaws in our modelling methods currently applied to understand the process of erosion, transport and deposition at the catchment scale. These models need to be based on a sound under...
Pollard, Beth; Johnston, Marie; Dixon, Diane
2007-01-01
Subjective measures involving clinician ratings or patient self-assessments have become recognised as an important tool for the assessment of health outcome. The value of a health outcome measure is usually assessed by a psychometric evaluation of its reliability, validity and responsiveness. However, psychometric testing involves an accumulation of evidence and has recognised limitations. It has been suggested that an evaluation of how well a measure has been developed would be a useful additional criterion in assessing the value of a measure. This paper explored the theoretical background and methodological development of subjective health status measures commonly used in osteoarthritis research. Fourteen subjective health outcome measures commonly used in osteoarthritis research were examined. Each measure was explored on the basis of its i) theoretical framework (was there a definition of what was being assessed and was it part of a theoretical model?) and ii) methodological development (what was the scaling strategy, how were the items generated and reduced, what was the response format and what was the scoring method?). Only the AIMS, SF-36 and WHOQOL defined what they were assessing (i.e. the construct of interest) and no measure assessed was part of a theoretical model. None of the clinician report measures appeared to have implemented a scaling procedure or described the rationale for the items selected or the scoring system. Of the patient self-report measures, the AIMS, MPQ, OXFORD, SF-36, WHOQOL and WOMAC appeared to follow a standard psychometric scaling method. The DRP and EuroQol used alternative scaling methods. The review highlighted the general lack of a theoretical framework for both clinician report and patient self-report measures. This review also drew attention to the wide variation in the methodological development of commonly used measures in OA. While, in general, the patient self-report measures had good methodological development, the clinician report measures appeared less well developed. It would be of value if new measures defined the construct of interest and if that construct were part of a theoretical model. By ensuring measures are both theoretically and empirically valid, improvements in subjective health outcome measures should be possible. PMID:17343739
NASA Technical Reports Server (NTRS)
Vangenderen, J. L. (Principal Investigator); Lock, B. F.
1976-01-01
The author has identified the following significant results. Results have shown that it is feasible to design a methodology that can provide suitable guidelines for operational production of small scale rural land use maps of semiarid developing regions from LANDSAT MSS imagery, using inexpensive and unsophisticated visual techniques. The suggested methodology provides immediate practical benefits to map makers attempting to produce land use maps in countries with limited budgets and equipment. Many preprocessing and interpretation techniques were considered, but rejected on the grounds that they were inappropriate mainly due to the high cost of imagery and/or equipment, or due to their inadequacy for use in operational projects in the developing countries. Suggested imagery and interpretation techniques, consisting of color composites and monocular magnification proved to be the simplest, fastest, and most versatile methods.
Multi-scale landslide hazard assessment: Advances in global and regional methodologies
NASA Astrophysics Data System (ADS)
Kirschbaum, Dalia; Peters-Lidard, Christa; Adler, Robert; Hong, Yang
2010-05-01
The increasing availability of remotely sensed surface data and precipitation provides a unique opportunity to explore how smaller-scale landslide susceptibility and hazard assessment methodologies may be applicable at larger spatial scales. This research first considers an emerging satellite-based global algorithm framework, which evaluates how the landslide susceptibility and satellite derived rainfall estimates can forecast potential landslide conditions. An analysis of this algorithm using a newly developed global landslide inventory catalog suggests that forecasting errors are geographically variable due to improper weighting of surface observables, resolution of the current susceptibility map, and limitations in the availability of landslide inventory data. These methodological and data limitation issues can be more thoroughly assessed at the regional level, where available higher resolution landslide inventories can be applied to empirically derive relationships between surface variables and landslide occurrence. The regional empirical model shows improvement over the global framework in advancing near real-time landslide forecasting efforts; however, there are many uncertainties and assumptions surrounding such a methodology that decreases the functionality and utility of this system. This research seeks to improve upon this initial concept by exploring the potential opportunities and methodological structure needed to advance larger-scale landslide hazard forecasting and make it more of an operational reality. Sensitivity analysis of the surface and rainfall parameters in the preliminary algorithm indicates that surface data resolution and the interdependency of variables must be more appropriately quantified at local and regional scales. Additionally, integrating available surface parameters must be approached in a more theoretical, physically-based manner to better represent the physical processes underlying slope instability and landslide initiation. Several rainfall infiltration and hydrological flow models have been developed to model slope instability at small spatial scales. This research investigates the potential of applying a more quantitative hydrological model to larger spatial scales, utilizing satellite and surface data inputs that are obtainable over different geographic regions. Due to the significant role that data and methodological uncertainties play in the effectiveness of landslide hazard assessment outputs, the methodology and data inputs are considered within an ensemble uncertainty framework in order to better resolve the contribution and limitations of model inputs and to more effectively communicate the model skill for improved landslide hazard assessment.
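As one common empirical ingredient in such frameworks (a hedged illustration, not the algorithm described above), a rainfall intensity-duration threshold of the form I = alpha x D^(-beta) can be checked against rainfall estimates for cells already flagged as susceptible. The coefficients, rainfall values and susceptibility flags are placeholders.

```python
# Hedged sketch: flag potential landslide conditions where satellite rainfall exceeds
# an intensity-duration threshold I = alpha * D**(-beta) on susceptible cells.
# Threshold coefficients, rainfall and susceptibility values are placeholders.
import numpy as np

def exceeds_threshold(intensity_mm_h, duration_h, alpha=12.0, beta=0.6):
    return intensity_mm_h > alpha * duration_h ** (-beta)

duration = np.array([3.0, 6.0, 24.0, 48.0])           # storm durations (h)
intensity = np.array([9.0, 6.5, 2.0, 1.6])            # mean intensities (mm/h)
susceptible = np.array([True, True, False, True])      # from a susceptibility map

alert = susceptible & exceeds_threshold(intensity, duration)
print("cells in potential landslide conditions:", np.where(alert)[0].tolist())
```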
A multi-scale modelling procedure to quantify hydrological impacts of upland land management
NASA Astrophysics Data System (ADS)
Wheater, H. S.; Jackson, B.; Bulygina, N.; Ballard, C.; McIntyre, N.; Marshall, M.; Frogbrook, Z.; Solloway, I.; Reynolds, B.
2008-12-01
Recent UK floods have focused attention on the effects of agricultural intensification on flood risk. However, quantification of these effects raises important methodological issues. Catchment-scale data have proved inadequate to support analysis of the impacts of land management change, due to climate variability, uncertainty in input and output data, spatial heterogeneity in land use and lack of data to quantify historical changes in management practices. Manipulation experiments to quantify the impacts of land management change have necessarily been limited and small scale, and in the UK mainly focused on the lowlands and arable agriculture. There is a need to develop methods to extrapolate from small scale observations to predict catchment-scale response, and to quantify impacts for upland areas. With assistance from a cooperative of Welsh farmers, a multi-scale experimental programme has been established at Pontbren, in mid-Wales, an area of intensive sheep production. The data have been used to support development of a multi-scale modelling methodology to assess impacts of agricultural intensification and the potential for mitigation of flood risk through land use management. Data are available from replicated experimental plots under different land management treatments, from instrumented field and hillslope sites, including tree shelter belts, and from first and second order catchments. Measurements include climate variables, soil water states and hydraulic properties at multiple depths and locations, tree interception, overland flow and drainflow, groundwater levels, and streamflow from multiple locations. Fine resolution physics-based models have been developed to represent soil and runoff processes, conditioned using experimental data. The detailed models are used to calibrate simpler 'meta-models' to represent individual hydrological elements, which are then combined in a semi-distributed catchment-scale model. The methodology is illustrated using field and catchment-scale simulations to demonstrate the response of improved and unimproved grassland, and the potential effects of land management interventions, including farm ponds, tree shelter belts and buffer strips. It is concluded that the methodology developed has the potential to represent and quantify catchment-scale effects of upland management; continuing research is extending the work to a wider range of upland environments and land use types, with the aim of providing generic simulation tools that can be used to provide strategic policy guidance.
Caudle, Susan E; Katzenstein, Jennifer M; Oghalai, John S; Lin, Jerry; Caudle, Donald D
2014-02-01
Methodologically, longitudinal assessment of cognitive development in young children has proven difficult because few measures span infancy through school age. This matter is further complicated when the child presents with a sensory deficit such as hearing loss. Few measures are validated in this population, and children who are evaluated for cochlear implantation are often reevaluated annually. The authors sought to evaluate the predictive validity of subscales of the Mullen Scales of Early Learning (MSEL) on Leiter International Performance Scales-Revised (LIPS-R) Full-Scale IQ scores. To further elucidate the relationship of these two measures, comparisons were also made with the Vineland Adaptive Behavior Scale-Second Edition (VABS), which provides a measure of adaptive functioning across the life span. Participants included 35 children (14 female, 21 male) who were evaluated both as part of the precandidacy process for cochlear implantation using the MSEL and VABS and following implantation with the LIPS-R and VABS. Hierarchical linear regression revealed that the MSEL Visual Reception subdomain score significantly predicted 52% of the variance in LIPS-R Full-Scale IQ scores at follow-up, F(1, 34) = 35.80, p < .0001, R² = .52, β = .72. This result suggests that the Visual Reception subscale offers predictive validity of later LIPS-R Full-Scale IQ scores. The VABS was also significantly correlated with cognitive variables at each time point.
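A minimal sketch of the kind of regression reported above, using statsmodels with a hypothetical data frame (the column names and values are invented; the study's hierarchical analysis added predictors in blocks, which this single-block sketch omits):

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical data: one row per child, with the MSEL Visual Reception score
# at pre-candidacy and the LIPS-R Full-Scale IQ at post-implantation follow-up.
df = pd.DataFrame({
    "msel_visual_reception": [42, 55, 38, 61, 47, 50, 35, 58],
    "lipsr_fsiq":            [88, 104, 79, 112, 95, 99, 74, 108],
})

X = sm.add_constant(df["msel_visual_reception"])
model = sm.OLS(df["lipsr_fsiq"], X).fit()
print(model.summary())  # reports R-squared, the F statistic and coefficients,
                        # analogous to the R² = .52 reported in the abstract
```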
2011-01-01
Background Kenya experienced rapid scale up of HIV testing and counselling services in government health services from 2001. We set out to examine the human resource policy implications of scaling up HIV testing and counselling in Kenya and to analyse the resultant policy against a recognised theoretical framework of health policy reform (policy analysis triangle). Methods Qualitative methods were used to gain in-depth insights from policy makers who shaped scale up. This included 22 in-depth interviews with Voluntary Counselling and Testing (VCT) task force members, critical analysis of 53 sets of minutes and diary notes. We explore points of consensus and conflict amongst policymakers in Kenya and analyse this content to assess who favoured and resisted new policies, how scale up was achieved and the importance of the local context in which scale up occurred. Results The scale up of VCT in Kenya had a number of human resource policy implications resulting from the introduction of lay counsellors and their authorisation to conduct rapid HIV testing using newly introduced rapid testing technologies. Our findings indicate that three key groups of actors were critical: laboratory professionals, counselling associations and the Ministry of Health. Strategic alliances between donors, NGOs and these three key groups underpinned the process. The process of reaching consensus required compromise and time commitment but was critical to a unified nationwide approach. Policies around quality assurance were integral in ensuring standardisation of content and approach. Conclusion The introduction and scale up of new health service initiatives such as HIV voluntary counselling and testing necessitates changes to existing health systems and modification of entrenched interests around professional counselling and laboratory testing. Our methodological approach enabled exploration of complexities of scale up of HIV testing and counselling in Kenya. We argue that a better understanding of the diverse actors, the context and the process, is required to mitigate risks and maximise impact. PMID:22008721
López-Serna, Rebeca; Marín-de-Jesús, David; Irusta-Mata, Rubén; García-Encina, Pedro Antonio; Lebrero, Raquel; Fdez-Polanco, María; Muñoz, Raúl
2018-08-15
The work presented here aimed to develop an analytical method for the simultaneous determination of 22 pharmaceuticals and personal care products, including 3 transformation products, in sewage and sludge. A meticulous method optimization, involving an experimental design, was carried out. The developed method was fully automated and consisted of the online extraction of 17 mL of water sample by Direct Immersion Solid Phase MicroExtraction followed by On-fiber Derivatization coupled to Gas Chromatography - Mass Spectrometry (DI-SPME - On-fiber Derivatization - GC - MS). This methodology was validated for 12 of the initial compounds as a reliable (relative recoveries above 90% for sewage and 70% for sludge; repeatability as %RSD below 10% in all cases), sensitive (LODs below 20 ng L⁻¹ in sewage and 10 ng g⁻¹ in sludge), versatile (sewage and sewage-sludge samples up to 15,000 ng L⁻¹ and 900 ng g⁻¹, respectively) and green analytical alternative for many medium-tech routine laboratories around the world to keep up with both current and forecast environmental regulatory requirements. The remaining 10 analytes initially considered showed insufficient suitability to be included in the final method. The methodology was successfully applied to real samples generated in a pilot scale sewage treatment reactor. Copyright © 2018 Elsevier B.V. All rights reserved.
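As a small illustration of two of the validation figures quoted above, the sketch below computes a relative recovery and a calibration-based LOD using one common convention (3.3 × the standard deviation of low-level signal divided by the calibration slope); the numbers are invented and the published method may define these quantities differently.

```python
import numpy as np

# Relative recovery: analyte measured in a spiked sewage sample vs. the same
# spike level measured in clean solvent (hypothetical values, ng/L).
measured_in_matrix = 92.0
measured_in_solvent = 100.0
relative_recovery = 100.0 * measured_in_matrix / measured_in_solvent
print(f"Relative recovery: {relative_recovery:.1f} %")

# Calibration-based LOD: 3.3 * (std. dev. of low-level replicates) / slope
conc = np.array([5, 10, 25, 50, 100], dtype=float)          # ng/L standards
signal = np.array([510, 1030, 2480, 5010, 10050], dtype=float)  # peak areas
slope, intercept = np.polyfit(conc, signal, 1)
sd_low = 30.0  # std. dev. of blank/low-level signal (hypothetical)
lod = 3.3 * sd_low / slope
print(f"Estimated LOD: {lod:.1f} ng/L")
```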
Messori, Stefano; Zilli, Romano; Mariano, Valeria; Bagni, Marina
2017-03-31
Diseases evolve constantly and research is needed to face emerging new threats. Evidence suggests that the impact of such threats will peak in the Mediterranean area. FORE-Med, the Foresight project for the Mediterranean, aims to identify the future challenges for livestock health and aquaculture in this area, to ensure effective coordination of research activities and the delivery of timely solutions to emerging issues. One hundred experts with multidisciplinary backgrounds, coming from countries all around the Mediterranean basin, were gathered in a think-tank to develop a Strategic Research Agenda on animal health for the Mediterranean up to 2030. A tailored foresight methodology was implemented, merging the best fit-for-purpose techniques (e.g. '7 questions', Social, Technological, Economical, Environmental, and Political (STEEP) analysis, scenario building, and backcasting). Both remote and face-to-face debates were held to ensure fruitful exchange and participation among experts. Research needs were identified and prioritised, both on relevance and on temporal scale. The participative approach allowed the definition of a research priority list for animal health and aquaculture in the Mediterranean, which served as the basis for a strategic research agenda. The latter is expected to satisfy the sectors' needs and guarantee a much-needed coordination of research activities in the Mediterranean area.
Romero, Mara C; Fogar, Ricardo A; Rolhaiser, Fabiana; Clavero, Verónica V; Romero, Ana M; Judis, María A
2018-05-01
The goal of this study was to develop a fish-based product suitable for people with celiac disease. Water and gluten-free flours (rice, corn, amaranth or quinoa) were added to improve cooking yield and texture parameters, and as an aid in improving quality attributes such as taste and juiciness. Cooking yields of patties containing gluten-free flours were higher than the control, with maximum values between 91 and 93%. Hardness was higher in patties made with amaranth or quinoa flour, whereas cohesiveness and springiness were higher in patties made with corn and rice flour, respectively. Response surface methodology was used to optimize the patty formulations. Optimized formulations were prepared and evaluated, showing good agreement between predicted and experimental responses. The nutritional value and consumer acceptance of the optimized formulations were also analysed. Flour addition affected the proximate composition, increasing carbohydrate, total fat and mineral content compared to the control. Sensory evaluation showed no differences in the aroma of the products. Addition of rice flour increased juiciness and tenderness, whereas taste, overall acceptance and buying intention were highest for the control patty, followed by patties made with corn flour. The present investigation shows good possibilities for further product development, including scale-up at an industrial level.
Effective strategies for scaling up evidence-based practices in primary care: a systematic review.
Ben Charif, Ali; Zomahoun, Hervé Tchala Vignon; LeBlanc, Annie; Langlois, Léa; Wolfenden, Luke; Yoong, Sze Lin; Williams, Christopher M; Lépine, Roxanne; Légaré, France
2017-11-22
While an extensive array of existing evidence-based practices (EBPs) has the potential to improve patient outcomes, little is known about how to implement EBPs on a larger scale. Therefore, we sought to identify effective strategies for scaling up EBPs in primary care. We conducted a systematic review with the following inclusion criteria: (i) study design: randomized and non-randomized controlled trials, before-and-after (with/without control), and interrupted time series; (ii) participants: primary care-related units (e.g., clinical sites, patients); (iii) intervention: any strategy used to scale up an EBP; (iv) comparator: no restrictions; and (v) outcomes: no restrictions. We searched MEDLINE, Embase, PsycINFO, Web of Science, CINAHL, and the Cochrane Library from database inception to August 2016 and consulted clinical trial registries and gray literature. Two reviewers independently selected eligible studies, then extracted and analyzed data following the Cochrane methodology. We extracted components of scaling-up strategies and classified them into five categories: infrastructure, policy/regulation, financial, human resources-related, and patient involvement. We extracted scaling-up process outcomes, such as coverage, and provider/patient outcomes. We validated data extraction with study authors. We included 14 studies. They were published since 2003 and primarily conducted in low-/middle-income countries (n = 11). Most were funded by governmental organizations (n = 8). The clinical area most represented was infectious diseases (HIV, tuberculosis, and malaria, n = 8), followed by newborn/child care (n = 4), depression (n = 1), and preventing seniors' falls (n = 1). Study designs were mostly before-and-after (without control, n = 8). The most frequently targeted unit of scaling up was the clinical site (n = 11). The component of a scaling-up strategy most frequently mentioned was human resource-related (n = 12). All studies reported patient/provider outcomes. Three studies reported scaling-up coverage, but no study quantitatively reported achieving a coverage of 80% in combination with a favorable impact. We found few studies assessing strategies for scaling up EBPs in primary care settings. It is uncertain whether any strategies were effective, as most studies focused more on patient/provider outcomes and less on scaling-up process outcomes. A minimal consensus on the metrics of scaling up is needed for assessing the scaling up of EBPs in primary care. This review is registered as PROSPERO CRD42016041461.
Development of the Scale for "Convergence Thinking" in Engineering
ERIC Educational Resources Information Center
Park, Sungmi
2016-01-01
Purpose: The purpose of this paper is to define the concept of "convergence thinking" as a trading zone for knowledge fusion in the engineering field, and to develop a scale for measuring it. Design/Methodology/Approach: Based on results from a literature review, this study clarifies a theoretical ground for "convergence thinking."…
Personal Accountability in Education: Measure Development and Validation
ERIC Educational Resources Information Center
Rosenblatt, Zehava
2017-01-01
Purpose: The purpose of this paper, a three-study research project, is to establish and validate a two-dimensional scale to measure teachers' and school administrators' accountability disposition. Design/methodology/approach: The scale items were developed in focus groups, and the final measure was tested on various samples of Israeli teachers and…
Initial Development and Validation of the Global Citizenship Scale
ERIC Educational Resources Information Center
Morais, Duarte B.; Ogden, Anthony C.
2011-01-01
The purpose of this article is to report on the initial development of a theoretically grounded and empirically validated scale to measure global citizenship. The methodology employed is multi-faceted, including two expert face validity trials, extensive exploratory and confirmatory factor analyses with multiple datasets, and a series of three…
Castanheira, Guilherme; Bragança, Luís
2014-01-01
This paper analyses the current trends in sustainability assessment. About 15 years after the launch of sustainability assessment tools focused on building evaluation, the paradigm of sustainability assessment tools is shifting from the building scale to the built environment scale. European cities, and cities around the world, are concerned with sustainable development and its evolution. Cities seek ways to adapt to contemporary changes in order to meet the required needs and ensure the population's well-being. Considering this, new generations of sustainability assessment tools are being developed to guide and help cities and urban areas become more sustainable. Following the trend of the most important sustainability assessment tools, the sustainability assessment tool SBTool(PT) is also developing a version for assessing the sustainability of the built environment, namely the urban planning projects and urban regeneration projects to be developed in Portugal: the SBTool(PT)-UP. The application of the methodology to three case studies will demonstrate its feasibility and, at the same time, identify the best practices which will serve as a reference for new projects, thereby assisting the development of the tool.
Experimental Methodology for Measuring Combustion and Injection-Coupled Responses
NASA Technical Reports Server (NTRS)
Cavitt, Ryan C.; Frederick, Robert A.; Bazarov, Vladimir G.
2006-01-01
A Russian scaling methodology for liquid rocket engines utilizing a single, full-scale element is reviewed. The scaling methodology exploits the supercritical phase of the full-scale propellants to simplify scaling requirements. Many assumptions are made in the derivation of the scaling criteria. A test apparatus design is presented to implement the Russian methodology and consequently verify these assumptions. This test apparatus will allow researchers to assess the usefulness of the scaling procedures and possibly enhance the methodology. A matrix of the apparatus capabilities for an RD-170 injector is also presented. Several methods to enhance the methodology have been generated through the design process.
Highly-Skilled Colombian Immigrants in Spain: Do They Have to Return Home to Start up in Business?
ERIC Educational Resources Information Center
Bulla, Francisco Javier Matiz; Hormiga, Esther
2011-01-01
Purpose: The purpose of this paper is to understand why high-skilled immigrants from a developing country (Colombia) are returning to their home country to create businesses instead of starting up in their host country (Spain). Design/methodology/approach: A case study methodology was used to present the experiences of three high-skilled…
Quantifying Stream-Aquifer Exchanges Over Scales: the Concept of Nested Interfaces
NASA Astrophysics Data System (ADS)
Flipo, N.; Mouhri, A.; Labarthe, B.; Saleh, F. S.
2013-12-01
Recent developments in hydrological modelling are based on a view of the interface as a single continuum through which water flows. These coupled hydrological-hydrogeological models, emphasizing the importance of the stream-aquifer interface (SAI), are increasingly used in the hydrological sciences for multidisciplinary studies addressing environmental issues. The notion of a single continuum comes from the historical modelling of hydrosystems based on the hypothesis of a homogeneous medium, which led to Darcy's law. Nowadays, there is a need first to bridge the gap between hydrological and eco-hydrological views of the SAIs, and second to rationalize the modelling of SAIs within a consistent framework that fully takes into account their multi-dimensionality. We first define the concept of nested SAIs as a key transitional component of continental hydrosystems. We then demonstrate the usefulness of the concept for the multi-dimensional study of the SAI, with a special emphasis on the stream network, which is identified as the key component for scaling hydrological processes occurring at the interface. Finally, we focus on SAI modelling at various scales with up-to-date methodologies and give some guidance for the multi-dimensional modelling of the interface using the innovative MIM methodology (Measurements-Interpolation-Modelling), which is developed graphically. MIM scales in space the three pools of methods needed to fully understand SAIs. The outcome of MIM is the localization in space of the type of SAI that can be studied by a given approach. The efficiency of the method is illustrated from the local (approx. 1 m) to the regional scale (> 10 000 km²) with two examples from the Paris basin (France). The first consists of the implementation of a sampling system for stream-aquifer exchanges, coupled with local 2D thermo-hydro models and a pseudo-3D hydro(geo)logical model at the watershed scale (40 km²). The quantification of monthly stream-aquifer exchanges over 14 000 km of river network in the Paris basin (74 000 km²) constitutes a unique regional-scale example.
ERIC Educational Resources Information Center
Powers, Donald; Schedl, Mary; Papageorgiou, Spiros
2017-01-01
The aim of this study was to develop, for the benefit of both test takers and test score users, enhanced "TOEFL ITP"® test score reports that go beyond the simple numerical scores that are currently reported. To do so, we applied traditional scale anchoring (proficiency scaling) to item difficulty data in order to develop performance…
Vicario, Ana; Aragón, Leslie; Wang, Chien C; Bertolino, Franco; Gomez, María R
2018-02-05
In this work, a novel molecularly imprinted polymer (MIP) proposed as a solid phase extraction sorbent was developed for the determination of propylparaben (PP) in diverse cosmetic samples. The use of parabens (PAs) is authorized by regulatory agencies as microbiological preservatives; however, several recent studies claim that large-scale use of these preservatives can be a potential health risk and harmful to the environment. Diverse factors that influence polymer synthesis were studied, including the template, functional monomer, porogen and crosslinker used. Morphological characterization of the MIP was performed using SEM and BET analysis. Parameters affecting the molecularly imprinted solid phase extraction (MISPE) and elution efficiency of PP were evaluated. After sample clean-up, the analyte was analyzed by high performance liquid chromatography (HPLC). The whole procedure was validated, showing satisfactory analytical parameters. After applying the MISPE methodology, the extraction recoveries were always better than 86.15%; the obtained precision, expressed as %RSD, was always lower than 2.19 for the corrected peak areas. A good linear relationship was obtained within the range 8-500 ng mL⁻¹ of PP (r² = 0.99985). Limits of detection and quantification after the MISPE procedure of 2.4 and 8 ng mL⁻¹, respectively, were reached, lower than those of previously reported methodologies. The developed MISPE-HPLC methodology provided a simple and economical way of accomplishing a clean-up/preconcentration step and the subsequent determination of PP in a complex matrix. The performance of the proposed method was compared against C-18 and silica solid phase extraction (SPE) cartridges. The recovery factors obtained after applying the extraction methods were 96.6, 64.8 and 0.79 for the MISPE, C18-SPE and silica-SPE procedures, respectively. The proposed methodology improves the retention capability of the SPE material, as well as its robustness and possibility of reuse, enabling it to be used for routine PP monitoring in diverse personal care products (PCP) and environmental samples. Copyright © 2017 Elsevier B.V. All rights reserved.
Combernoux, Nicolas; Schrive, Luc; Labed, Véronique; Wyart, Yvan; Carretier, Emilie; Moulin, Philippe
2017-10-15
The recent use of the reverse osmosis (RO) process at the damaged Fukushima-Daiichi nuclear power plant generated growing interest in the application of this process for decontamination purposes. This study focused on the development of a robust RO process for the decontamination of two kinds of liquid effluents: contaminated groundwater after a nuclear disaster and contaminated seawater during a nuclear accident. The SW30 HR membrane was selected among others in this study due to its higher retentions (96% for Cs and 98% for Sr) in a real groundwater. Significant fouling and scaling phenomena, attributed to calcium and strontium precipitation, were evidenced in this work, underscoring the importance of the lab-scale experiments in the process. Validation of the separation performance at trace radionuclide concentrations was performed, with similar retentions of around 96% between the surrogate Cs (inactive) and ¹³⁷Cs (radioactive). The scale-up to a 2.6 m² spiral-wound membrane led to equivalent retentions (around 96% for Cs and 99% for Sr) but lower flux values, underlining that the hydrodynamic parameters (flowrate/cross-flow velocity) should be optimized. This methodology was also applied to the reconstituted seawater effluent: retentions were slightly lower than for the groundwater and the same hydrodynamic effects were observed at the pilot scale. Ageing of the membrane was then assessed through irradiation experiments. Results showed that the composition of the membrane active layer influenced the membrane resistance towards γ irradiation: the SW30 HR membrane performances (retention and permeability) were better than those of the Osmonics SE at 1 MGy. Finally, to supplement the scale-up approach, the irradiation of a spiral-wound membrane revealed a limited effect on the permeability and retention. This indicated that irradiation conditions need to be controlled for further development of the process. Copyright © 2017 Elsevier Ltd. All rights reserved.
Molecular-Scale Electronics: From Concept to Function.
Xiang, Dong; Wang, Xiaolong; Jia, Chuancheng; Lee, Takhee; Guo, Xuefeng
2016-04-13
Creating functional electrical circuits using individual or ensemble molecules, often termed "molecular-scale electronics", not only meets the increasing technical demands of the miniaturization of traditional Si-based electronic devices, but also provides an ideal window for exploring the intrinsic properties of materials at the molecular level. This Review covers the major advances with the most general applicability and emphasizes new insights into the development of efficient platform methodologies for building reliable molecular electronic devices with desired functionalities through the combination of programmed bottom-up self-assembly and sophisticated top-down device fabrication. First, we summarize a number of different approaches to forming molecular-scale junctions and discuss various experimental techniques for examining these nanoscale circuits in detail. We then give a full introduction to characterization techniques and theoretical simulations for molecular electronics. Third, we highlight the major contributions and new concepts of integrating molecular functionalities into electrical circuits. Finally, we provide a critical discussion of the limitations and main challenges that still exist for the development of molecular electronics. These analyses should be valuable for a deep understanding of charge transport through molecular junctions, the device fabrication process, and the roadmap for future practical molecular electronics.
High-Bandwidth Dynamic Full-Field Profilometry for Nano-Scale Characterization of MEMS
NASA Astrophysics Data System (ADS)
Chen, Liang-Chia; Huang, Yao-Ting; Chang, Pi-Bai
2006-10-01
The article describes an innovative optical interferometric methodology to deliver dynamic surface profilometry with a measurement bandwidth up to 10 MHz or higher and a vertical resolution down to 1 nm. Previous work using stroboscopic microscopic interferometry for dynamic characterization of micro (opto)electromechanical systems (M(O)EMS) has been limited in measurement bandwidth, mainly to within a couple of MHz. For high resonant mode analysis, the stroboscopic light pulse is not sufficiently short to capture the moving fringes from the dynamic motion of the detected structure. In view of this need, a microscopic prototype based on white-light stroboscopic interferometry with an innovative light superposition strategy was developed to achieve dynamic full-field profilometry with a high measurement bandwidth up to 10 MHz or higher. The system primarily consists of an optical microscope, on which a Mirau interferometric objective with an embedded piezoelectric vertical translator, a high-power LED light module with dual operation modes, and a light-synchronizing electronics unit are integrated. A micro cantilever beam used in AFM was measured to verify the system's capability for accurate characterisation of the dynamic behaviour of the device. The full-field seventh-mode vibration at a vibratory frequency of 3.7 MHz was fully characterized, with nano-scale vertical measurement resolution and a vertical measurement range of tens of micrometers.
A Data Preparation Methodology in Data Mining Applied to Mortality Population Databases.
Pérez, Joaquín; Iturbide, Emmanuel; Olivares, Víctor; Hidalgo, Miguel; Martínez, Alicia; Almanza, Nelva
2015-11-01
It is known that the data preparation phase is the most time consuming in the data mining process, taking up to 50% or even 70% of the total project time. Currently, data mining methodologies are general purpose, and one of their limitations is that they do not provide guidance on which particular tasks to carry out in a specific domain. This paper presents a new data preparation methodology oriented to the epidemiological domain, in which we have identified two sets of tasks: General Data Preparation and Specific Data Preparation. For both sets, the Cross-Industry Standard Process for Data Mining (CRISP-DM) is adopted as a guideline. The main contribution of our methodology is fourteen specialized tasks concerning this domain. To validate the proposed methodology, we developed a data mining system and the entire process was applied to real mortality databases. The results were encouraging: the use of the methodology reduced some of the time-consuming tasks, and the data mining system revealed findings of unknown and potentially useful patterns for the public health services in Mexico.
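To illustrate the kind of specialized data preparation tasks such a methodology formalizes, here is a minimal pandas sketch for a mortality-style record set; the column names and cleaning rules are hypothetical and are not the tasks defined in the paper.

```python
import numpy as np
import pandas as pd

# Hypothetical raw mortality records
raw = pd.DataFrame({
    "age":        [34, 250, np.nan, 61, 5],          # 250 is an impossible value
    "sex":        ["M", "F", "f", "M", None],
    "cause_icd":  ["I21", "i21 ", "C34", "", "J18"],
    "death_date": ["2014-03-02", "02/03/2014", "2014-13-40", "2014-07-19", "2014-01-05"],
})

clean = raw.copy()
clean["age"] = clean["age"].where(clean["age"].between(0, 120))   # drop impossible ages
clean["sex"] = clean["sex"].str.upper()                           # harmonize categories
clean["cause_icd"] = clean["cause_icd"].str.strip().str.upper().replace("", np.nan)
clean["death_date"] = pd.to_datetime(clean["death_date"], errors="coerce")  # invalid -> NaT

# Keep only records usable for mining; report how much was filtered out
usable = clean.dropna(subset=["age", "cause_icd"])
print(f"Kept {len(usable)} of {len(raw)} records after general data preparation")
```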
NASA Astrophysics Data System (ADS)
Serra, Romain; Valette, Anne; Taji, Amine; Emsley, Stephen
2017-04-01
Building climate resilience (i.e. climate change adaptation or self-renewal of ecosystems), or planning environmental rehabilitation and nature-based solutions to address vulnerabilities to disturbances, has two prerequisites: (1) identify the disorder, i.e. stresses caused by events such as hurricanes, tsunamis, heavy rains, hailstone falls or smog, or accumulated over time such as warming, rainfall change, ocean acidification and soil salinization, and measured by trends; and (2) qualify its impact on the ecosystems, i.e. the resulting strains. Mitigation of threats is accordingly twofold: (i) on local temporal scales for protection, and (ii) on long scales for prevention and sustainability. Assessment and evaluation prior to designing future scenarios require concomitant acquisition of (a) climate data at global and local spatial scales that describe the changes at the various temporal scales of the phenomena without signal aliasing, and (b) the ecosystems' status at the scales of the forcing and of relaxation times, hysteresis lags, periodicities of orbits in chaotic systems, shifts from one ecosystem attractor to another, etc. Dissociating groups of timescales and spatial scales facilitates the analysis and helps set up monitoring schemes. The Sentinel-2 mission, with a revisit of the Earth every few days and a 10 m on-ground resolution, is a good automatic spectro-analytical monitoring system because it detects changes in numerous optical and IR bands at spatial scales appropriate for describing land parcels. Combined with photo-interpreted VHR data, which describe the environment more crudely but locate land parcel borders with high precision, it helps relate stresses and strains so that their relationships can be understood empirically. An example is provided for Tonga, courtesy of ESA support and an ADB request, with a focus on time-series consistency, which requires radiometric and geometric normalisation of EO data sets. The methodologies have been developed in the frame of ESA programmes and the EC programme (H2020 Co-Resyf).
Demonstration-Scale High-Cell-Density Fermentation of Pichia pastoris.
Liu, Wan-Cang; Zhu, Ping
2018-01-01
Pichia pastoris has been one of the most successful heterologous overexpression systems for generating proteins for large-scale production through high-cell-density fermentation. However, optimizing the conditions of large-scale high-cell-density fermentation for biochemistry and industrialization is usually a laborious and time-consuming process. Furthermore, it is often difficult to produce authentic proteins in large quantities, which is a major obstacle for the analysis of functional and structural features and for industrial application. For these reasons, we have developed a protocol for efficient demonstration-scale high-cell-density fermentation of P. pastoris, which employs a new methanol-feeding strategy (the biomass-stat strategy) and a strategy of increased air pressure instead of pure oxygen supplementation. The protocol includes the three typical stages of glycerol batch fermentation (initial culture phase), glycerol fed-batch fermentation (biomass accumulation phase), and methanol fed-batch fermentation (induction phase), and allows direct online monitoring of fermentation conditions, including broth pH, temperature, DO, anti-foam generation, and feeding of glycerol and methanol. Using this protocol, production of the recombinant β-xylosidase of Lentinula edodes origin in 1000-L scale fermentation can reach ~900 mg/L or 9.4 mg/g cells (dry cell weight, intracellular expression), with a specific production rate and average specific production of 0.1 mg/g/h and 0.081 mg/g/h, respectively. The methodology described in this protocol can be easily transferred to other systems and is amenable to scale-up for a large number of proteins used for either scientific studies or commercial purposes.
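A back-of-the-envelope check of the production figures quoted above; the harvest biomass and induction time used below are assumptions chosen so the arithmetic reproduces the reported values, not figures stated in the protocol.

```python
# Hypothetical worked example of the fermentation productivity figures quoted above.
titre_mg_per_L = 900.0        # recombinant beta-xylosidase titre (from the abstract)
biomass_g_dcw_per_L = 96.0    # assumed dry cell weight at harvest (hypothetical)
induction_time_h = 94.0       # assumed length of the methanol induction phase (hypothetical)

specific_production = titre_mg_per_L / biomass_g_dcw_per_L      # mg per g DCW
specific_rate = specific_production / induction_time_h          # mg per g DCW per hour

print(f"Specific production: {specific_production:.1f} mg/g DCW")   # ~9.4 mg/g
print(f"Specific production rate: {specific_rate:.3f} mg/g/h")      # ~0.1 mg/g/h
```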
Potyrailo, Radislav A; Chisholm, Bret J; Morris, William G; Cawse, James N; Flanagan, William P; Hassib, Lamyaa; Molaison, Chris A; Ezbiansky, Karin; Medford, George; Reitz, Hariklia
2003-01-01
Coupling combinatorial chemistry methods with high-throughput (HT) performance testing and measurements of the resulting properties has provided a powerful set of tools for the 10-fold accelerated discovery of new high-performance coating materials for automotive applications. Our approach replaces labor-intensive steps with automated systems for the evaluation of adhesion of 8 × 6 arrays of coating elements that are discretely deposited on a single 9 × 12 cm plastic substrate. Performance of the coatings is evaluated with respect to their resistance to adhesion loss, because this parameter is one of the primary considerations in end-use automotive applications. Our HT adhesion evaluation provides previously unavailable capabilities: high speed and reproducibility of testing by using robotic automation, an expanded range of types of tested coatings by using a coating tagging strategy, and improved quantitation by using high signal-to-noise automatic imaging. Upon testing, the coatings undergo changes that are impossible to predict quantitatively using existing knowledge. Using our HT methodology, we have developed several coating leads. The HT screening results for the best coating compositions have been validated on the traditional scales of coating formulation and adhesion loss testing. These validation results have confirmed the superb performance of the combinatorially developed coatings over conventional coatings on the traditional scale.
NASA Astrophysics Data System (ADS)
Rose, A.; McKee, J.; Weber, E.; Bhaduri, B. L.
2017-12-01
Leveraging decades of expertise in population modeling, and in response to growing demand for higher resolution population data, Oak Ridge National Laboratory is now generating LandScan HD at global scale. LandScan HD is conceived as a 90 m resolution population distribution in which modeling is tailored to the unique geography and data conditions of individual countries or regions by combining social, cultural, physiographic, and other information with novel geocomputation methods. Similarities among these areas are exploited in order to leverage existing training data and machine learning algorithms to rapidly scale development. Drawing on ORNL's unique set of capabilities, LandScan HD adapts highly mature population modeling methods developed for LandScan Global and LandScan USA, settlement mapping research and production in high-performance computing (HPC) environments, land use and neighborhood mapping through image segmentation, and facility-specific population density models. Adopting a flexible methodology to accommodate different geographic areas, LandScan HD accounts for the availability, completeness, and level of detail of relevant ancillary data. Beyond the core population and mapped settlement inputs, these factors determine the model complexity for an area: depending on the data available, a data-driven model may support a simple top-down approach, a more detailed bottom-up approach, or a hybrid approach.
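A minimal sketch of the top-down (dasymetric) disaggregation idea underlying this kind of population distribution modeling; it is not ORNL's LandScan HD algorithm, and the administrative unit, weights and land-use rules below are invented.

```python
import numpy as np

# Hypothetical 5x5 grid of 90 m cells covering one administrative unit.
# Weights combine a mapped settlement likelihood with land-use multipliers
# (all values invented for illustration).
settlement_likelihood = np.random.default_rng(0).random((5, 5))
landuse_multiplier = np.ones((5, 5))
landuse_multiplier[0, :] = 0.0  # e.g. water cells receive no population

weights = settlement_likelihood * landuse_multiplier
unit_population = 12_000  # census population of the administrative unit

# Top-down disaggregation: each cell receives a share proportional to its weight
cell_population = unit_population * weights / weights.sum()
print(cell_population.round(0))
print("Check total:", cell_population.sum())
```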
Putilov, Arcady A
2017-01-01
Differences between the so-called larks and owls representing the opposite poles of morningness-eveningness dimension are widely known. However, scientific consensus has not yet been reached on the methodology for ranking and typing people along other dimensions of individual variation in their sleep-wake pattern. This review focused on the history and state-of-the-art of the methodology for self-assessment of individual differences in more than one trait or adaptability of the human sleep-wake cycle. The differences between this and other methodologies for the self-assessment of trait- and state-like variation in the perceived characteristics of daily rhythms were discussed and the critical issues that remained to be addressed in future studies were highlighted. These issues include a) a failure to develop a unidimensional scale for scoring chronotypological differences, b) the inconclusive results of the long-lasting search for objective markers of chronotype, c) a disagreement on both number and content of scales required for multidimensional self-assessment of chronobiological differences, d) a lack of evidence for the reliability and/or external validity of most of the proposed scales and e) an insufficient development of conceptualizations, models and model-based quantitative simulations linking the differences between people in their sleep-wake pattern with the differences in the basic parameters of underlying chronoregulatory processes. It seems that, in the nearest future, the wide implementation of portable actigraphic and somnographic devices might lead to the development of objective methodologies for multidimensional assessment and classification of sleep-wake traits and adaptabilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, J. -J.; Chang, Y. -S.; Hartmann, H.
2013-09-01
This report presents a general methodology for obtaining preliminary estimates of the potential human health risks associated with developing a utility-scale solar energy facility on a contaminated site, based on potential exposures to contaminants in soils (including transport of those contaminants into the air).
Development of a Scale to Measure Faculty Attitude Towards Open Educational Resources
ERIC Educational Resources Information Center
Mishra, Sanjaya; Sharma, Meenu; Sharma, Ramesh Chander; Singh, Alka; Thakur, Atul
2016-01-01
This paper describes the entire methodology for the development of a scale to measure Attitude towards Open Educational Resources (ATOER). Traditionally, it is observed that some teachers are more willing to share their work than others, indicating the need to understand teachers' psychological and behavioural determinants that influence use of…
Nandi, Anirban; Pan, Sharadwata; Potumarthi, Ravichandra; Danquah, Michael K; Sarethy, Indira P
2014-01-01
Six Sigma methodology has been successfully applied to daily operations by several leading global private firms, including GE and Motorola, to boost their net profits. Comparatively few studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D). In the current study, we have reviewed and proposed a process for the possible integration of Six Sigma methodology into the large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA). It is anticipated that the important aspects of quality control and quality assurance will benefit greatly from the integration of Six Sigma methodology in the mass production of Penicillin G and/or its conversion to 6-APA.
The Disabled Student Experience: Does the SERVQUAL Scale Measure Up?
ERIC Educational Resources Information Center
Vaughan, Elizabeth; Woodruffe-Burton, Helen
2011-01-01
Purpose: The purpose of this paper is to empirically test a new disabled service user-specific service quality model ARCHSECRET against a modified SERVQUAL model in the context of disabled students within higher education. Design/methodology/approach: The application of SERVQUAL in the voluntary sector had raised serious issues on its portability…
An overview of key technology thrusts at Bell Helicopter Textron
NASA Technical Reports Server (NTRS)
Harse, James H.; Yen, Jing G.; Taylor, Rodney S.
1988-01-01
Insight is provided into several key technologies at Bell. Specific topics include the results of ongoing research and development in advanced rotors, methodology development, and new configurations. The discussion on advanced rotors highlights developments in the composite bearingless rotor, including the development and testing of full-scale flight hardware as well as some of the design support analyses and verification testing. The discussion on methodology development concentrates on analytical developments in aeromechanics, including correlation studies and design applications. The discussion of new configurations presents the results of some advanced configuration studies, including hardware development.
Zhang, Yun-jian; Li, Qiang; Zhang, Yu-xiu; Wang, Dan; Xing, Jian-min
2012-01-01
Succinic acid is considered as an important platform chemical. Succinic acid fermentation with Actinobacillus succinogenes strain BE-1 was optimized by central composite design (CCD) using a response surface methodology (RSM). The optimized production of succinic acid was predicted and the interactive effects between glucose, yeast extract, and magnesium carbonate were investigated. As a result, a model for predicting the concentration of succinic acid production was developed. The accuracy of the model was confirmed by the analysis of variance (ANOVA), and the validity was further proved by verification experiments showing that percentage errors between actual and predicted values varied from 3.02% to 6.38%. In addition, it was observed that the interactive effect between yeast extract and magnesium carbonate was statistically significant. In conclusion, RSM is an effective and useful method for optimizing the medium components and investigating the interactive effects, and can provide valuable information for succinic acid scale-up fermentation using A. succinogenes strain BE-1. PMID:22302423
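A minimal sketch of fitting a second-order response-surface model to central-composite-design runs, as described above; the coded factor levels and responses below are synthetic stand-ins rather than the study's fermentation data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)

# Hypothetical coded CCD runs: glucose (x1), yeast extract (x2), MgCO3 (x3)
runs = pd.DataFrame(
    [(a, b, c) for a in (-1, 1) for b in (-1, 1) for c in (-1, 1)]      # factorial points
    + [(l, 0, 0) for l in (-1.682, 1.682)]                              # axial points
    + [(0, l, 0) for l in (-1.682, 1.682)]
    + [(0, 0, l) for l in (-1.682, 1.682)]
    + [(0, 0, 0)] * 6,                                                  # centre replicates
    columns=["x1", "x2", "x3"])

# Invented "true" surface plus noise, standing in for measured succinic acid titre (g/L)
runs["y"] = (40 + 5*runs.x1 + 3*runs.x2 + 2*runs.x3 - 2*runs.x1**2
             - 1.5*runs.x2**2 - runs.x3**2 + 1.2*runs.x2*runs.x3
             + rng.normal(0, 0.8, len(runs)))

# Full second-order RSM model; the summary provides ANOVA-style significance tests
model = smf.ols("y ~ (x1 + x2 + x3)**2 + I(x1**2) + I(x2**2) + I(x3**2)", data=runs).fit()
print(model.summary())
```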
Participatory approaches to understanding practices of flood management across borders
NASA Astrophysics Data System (ADS)
Bracken, L. J.; Forrester, J.; Oughton, E. A.; Cinderby, S.; Donaldson, A.; Anness, L.; Passmore, D.
2012-04-01
The aim of this paper is to outline and present initial results from a study designed to identify principles of and practices for adaptive co-management strategies for resilience to flooding in borderlands using participatory methods. Borderlands are the complex and sometimes undefined spaces existing at the interface of different territories, and the concept draws attention towards messy connections and disconnections (Strathern 2004; Sassen 2006). For this project the borderlands concerned are those between professional and lay knowledge, between responsible agencies, and between one nation and another. Research was focused on the River Tweed catchment, located on the Scottish-English border. This catchment is subject to complex environmental designations and rural development regimes that make integrated management of the whole catchment difficult. A multi-method approach was developed using semi-structured interviews, Q methodology and participatory GIS in order to capture wide-ranging practices for managing flooding and the judgements behind these practices, and to 'scale up' participation in the study. Professionals and local experts were involved in the research. The methodology generated a useful set of options for flood management, with research outputs easily understood by key management organisations and the wider public alike. There was wide endorsement of alternative flood management solutions from both managers and local experts. The role of location was particularly important for ensuring communication and data sharing between flood managers from different organisations and a wider range of stakeholders. There were complex issues around scale: both the mismatch between communities and evidence of flooding, and the mismatch between governance and the scale of intervention for natural flood management. The multi-method approach was essential in capturing practice and the complexities around the governance of flooding. The involvement of key flood management organisations was integral to making the research relevant to professionals.
NASA Astrophysics Data System (ADS)
Pathak, Maharshi
City administrators and real-estate developers have been setting rather aggressive energy efficiency targets. This, in turn, has led building science research groups across the globe to focus on urban-scale building performance studies and the level of abstraction associated with simulating them. The increasing maturity of stakeholders towards energy efficiency and creating a comfortable working environment has led researchers to develop methodologies and tools for addressing policy-driven interventions, whether urban-level energy systems, buildings' operational optimization or retrofit guidelines. Typically, these large-scale simulations are carried out by grouping buildings based on their design similarities, i.e. standardization of the buildings. Such an approach does not necessarily yield working inputs that make decision-making effective. To address this, a novel approach is proposed in the present study. The principal objective of this study is to propose, define and evaluate a methodology for utilizing machine learning algorithms to define representative building archetypes for Stock-level Building Energy Modeling (SBEM) based on an operational parameter database. The study uses the Phoenix-climate subset of the CBECS-2012 survey microdata for analysis and validation. Using the database, parameter correlations are studied to understand the relation between input parameters and energy performance. Contrary to precedent, the study establishes that energy performance is better explained by non-linear models, and this non-linear behavior is captured by advanced learning algorithms. Based on these algorithms, the buildings under study are grouped into meaningful clusters. The cluster medoids (the actual buildings that best represent the centre of each cluster) are established statistically to identify the level of abstraction that is acceptable for whole-building energy simulations and, subsequently, for retrofit decision-making. Further, the methodology is validated by conducting Monte Carlo simulations on 13 key input simulation parameters. The sensitivity analysis of these 13 parameters is used to identify the optimum retrofits. From the sample analysis, the EUI of the buildings is found to be most sensitive to the envelope parameters, and thus retrofit packages should be directed at these to maximize the reduction in energy usage.
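A minimal sketch of the clustering-and-archetype idea described above: buildings are clustered on operational parameters and the real building nearest each cluster centre is taken as the representative archetype. The feature table is synthetic, and the study's actual algorithms (and its use of CBECS-2012 microdata) may differ.

```python
import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical operational-parameter table (one row per surveyed building)
rng = np.random.default_rng(7)
buildings = pd.DataFrame({
    "floor_area_m2":    rng.uniform(500, 50_000, 200),
    "weekly_op_hours":  rng.uniform(40, 168, 200),
    "occupant_density": rng.uniform(5, 40, 200),
    "eui_kwh_m2":       rng.uniform(80, 400, 200),
})

X = StandardScaler().fit_transform(buildings)
km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)

# For each cluster, the archetype is the real building closest to the centroid
archetypes = []
for k, centre in enumerate(km.cluster_centers_):
    members = np.where(km.labels_ == k)[0]
    nearest = members[np.argmin(np.linalg.norm(X[members] - centre, axis=1))]
    archetypes.append(nearest)

print(buildings.iloc[archetypes])  # representative buildings for stock-level simulation
```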
Kubelka, Jan
2009-04-01
Many important biochemical processes occur on the time-scales of nanoseconds and microseconds. The introduction of the laser temperature-jump (T-jump) to biophysics more than a decade ago opened these previously inaccessible time regimes up to direct experimental observation. Since then, laser T-jump methodology has evolved into one of the most versatile and generally applicable methods for studying fast biomolecular kinetics. This perspective is a review of the principles and applications of the laser T-jump technique in biophysics. A brief overview of the T-jump relaxation kinetics and the historical development of laser T-jump methodology is presented. The physical principles and practical experimental considerations that are important for the design of the laser T-jump experiments are summarized. These include the Raman conversion for generating heating pulses, considerations of size, duration and uniformity of the temperature jump, as well as potential adverse effects due to photo-acoustic waves, cavitation and thermal lensing, and their elimination. The laser T-jump apparatus developed at the NIH Laboratory of Chemical Physics is described in detail along with a brief survey of other laser T-jump designs in use today. Finally, applications of the laser T-jump in biophysics are reviewed, with an emphasis on the broad range of problems where the laser T-jump methodology has provided important new results and insights into the dynamics of the biomolecular processes.
Clean Water for Developing Countries.
Pandit, Aniruddha B; Kumar, Jyoti Kishen
2015-01-01
Availability of safe drinking water, a vital natural resource, is still a distant dream to many around the world, especially in developing countries. Increasing human activity and industrialization have led to a wide range of physical, chemical, and biological pollutants entering water bodies and affecting human lives. Efforts to develop efficient, economical, and technologically sound methods to produce clean water for developing countries have increased worldwide. We focus on solar disinfection, filtration, hybrid filtration methods, treatment of harvested rainwater, herbal water disinfection, and arsenic removal technologies. Simple yet innovative water treatment devices are also described, ranging from the use of plant xylem as filters and terafilters to indigenously designed hand pumps and tippy taps. By describing the technical aspects of the major water disinfection methods relevant to developing countries at medium to small scales, and emphasizing their merits, demerits, economics, and scalability, we highlight the current scenario and pave the way for further research, development, and scaling up of these processes. This review focuses on clean drinking water, especially for rural populations in developing countries. It describes various water disinfection techniques that are not only economically viable and energy efficient but also employ simple methodologies that are effective in reducing the physical, chemical, and biological pollutants found in drinking water to acceptable limits.
Structural Health Monitoring of Nuclear Spent Fuel Storage Facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Lingyu
Interim storage of spent nuclear fuel from reactor sites has gained additional importance and urgency for resolving waste-management-related technical issues. To ensure that nuclear power remains clean energy, monitoring has been identified by DOE as a high-priority cross-cutting need, necessary to determine and predict the degradation state of the systems, structures, and components (SSCs) important to safety (ITS). Nondestructive structural condition monitoring therefore needs to be installed on existing storage systems, or integrated into future ones, to quantify the state of health and guarantee the safe operation of nuclear power plants (NPPs) during their extended life span. In this project, the lead university and the collaborating national laboratory teamed to develop a nuclear structural health monitoring (n-SHM) system based on in-situ piezoelectric sensing technologies that can monitor structural degradation and aging of spent nuclear fuel dry cask storage systems (DCSS) and similar structures. We also aimed to identify and quantify possible influences of the spent nuclear fuel environment (temperature and radiation) on the piezoelectric sensor system and to devise adequate solutions and guidelines accordingly. We developed analytical models for piezoelectric-based n-SHM methods, taking into account the influence of temperature and irradiation on the sensing models and algorithms for acoustic emission (AE), guided ultrasonic waves (GUW), and electromechanical impedance spectroscopy (EMIS). Experimentally, the influence of temperature and irradiation on the piezoelectric sensors and their sensing capabilities was investigated, and both short-term and long-term irradiation studies were performed with our collaborating national laboratory. Moreover, we developed multi-modal sensing and carried out verification and validation tests on complex structures, including a medium-scale vacuum drying chamber and a small-scale mockup canister available for the desired testing. Our work established a potential candidate for the long-term structural health monitoring of spent fuel canisters through piezoelectric wafer sensors and provided sensing methodologies based on AE and GUW. Overall, it provides an innovative system and methodology for enhancing the safe operation of nuclear power plants. All major accomplishments planned in the original proposal were successfully achieved.
Boron-rich benzene and pyrene derivatives for the detection of thermal neutrons
Yemam, Henok A.; Mahl, Adam; Koldemir, Unsal; Remedes, Tyler; Parkin, Sean; Greife, Uwe; Sellinger, Alan
2015-01-01
A synthetic methodology is developed to generate boron-rich aromatic small molecules based on benzene and pyrene moieties for the detection of thermal neutrons. The prepared aromatic compounds have a relatively high boron content, up to 7.4 wt%, which is important for application in neutron detection because ¹⁰B (20% of natural-abundance boron) has a large neutron-induced reaction cross-section. This is demonstrated by preparing blends of the synthesized molecules with fluorescent dopants in poly(vinyltoluene) matrices, resulting in scintillation light output and neutron capture comparable to state-of-the-art commercial scintillators, but at much lower cost. The boron-rich benzene and pyrene derivatives are prepared under Suzuki conditions using both microwave and traditional heating, affording yields of 40–93%. This new procedure is simple and straightforward, and has the potential to be scaled up. PMID:26334111
Marek, Ryan J; Tarescavage, Anthony M; Ben-Porath, Yossef S; Ashton, Kathleen; Merrell Rish, Julie; Heinberg, Leslie J
2015-01-01
Previous studies suggest that presurgical psychopathology accounts for some of the variance in suboptimal weight loss outcomes among Roux-en-Y gastric bypass (RYGB) patients, but research has been equivocal. The present study seeks to extend the past literature by examining associations between presurgical scale scores on the broadband Minnesota Multiphasic Personality Inventory-2-Restructured Form (MMPI-2-RF) and suboptimal weight loss and poor adherence to follow-up 1 year postoperatively after accounting for several methodologic considerations. Cleveland Clinic Bariatric and Metabolic Institute, Cleveland, Ohio, USA. The sample consisted of 498 RYGB patients, who produced a valid presurgical MMPI-2-RF protocol at program intake. The sample was primarily female (72.9%), Caucasian (64.9%), and middle-aged (mean = 46.4 years old; standard deviation [SD] = 11.6). The mean presurgical body mass index (BMI) was 47.4 kg/m² (SD = 8.2) and mean percent weight loss (%WL) at 1 year postoperatively was 31.18 %WL (SD = 8.44). As expected, scales from the Behavioral/Externalizing Dysfunction (BXD) domain of the MMPI-2-RF were associated with worse weight loss outcomes and poor adherence to follow-up, particularly after accounting for range restriction due to underreporting. Individuals producing elevated scores on these scales were at greater risk for achieving suboptimal weight loss (<50% excess weight loss) and not following up with their appointment compared with those who scored below cut-offs. Patients who are more likely to engage in undercontrolled behavior (e.g., poor impulse control), as indicated by presurgical MMPI-2-RF findings, are at greater risk for suboptimal weight loss and poor adherence to follow-up following RYGB. Objective psychological assessments should also be conducted postoperatively to ensure that intervention is administered in a timely manner. Future research in the area of presurgical psychological screening should consider the impact of underreporting and other discussed methodologic issues in predictive analyses. Copyright © 2015 American Society for Bariatric Surgery. Published by Elsevier Inc. All rights reserved.
Ion Torrent sequencing as a tool for mutation discovery in the flax (Linum usitatissimum L.) genome.
Galindo-González, Leonardo; Pinzón-Latorre, David; Bergen, Erik A; Jensen, Dustin C; Deyholos, Michael K
2015-01-01
Detection of induced mutations is valuable for inferring gene function and for developing novel germplasm for crop improvement. Many reverse genetics approaches have been developed to identify mutations in genes of interest within a mutagenized population, including some approaches that rely on next-generation sequencing (e.g. exome capture, whole genome resequencing). As an alternative to these genome or exome-scale methods, we sought to develop a scalable and efficient method for detection of induced mutations that could be applied to a small number of target genes, using Ion Torrent technology. We developed this method in flax (Linum usitatissimum), to demonstrate its utility in a crop species. We used an amplicon-based approach in which DNA samples from an ethyl methanesulfonate (EMS)-mutagenized population were pooled and used as template in PCR reactions to amplify a region of each gene of interest. Barcodes were incorporated during PCR, and the pooled amplicons were sequenced using an Ion Torrent PGM. A pilot experiment with known SNPs showed that they could be detected at a frequency > 0.3% within the pools. We then selected eight genes for which we wanted to discover novel mutations, and applied our approach to screen 768 individuals from the EMS population, using either the Ion 314 or Ion 316 chips. Out of 29 potential mutations identified after processing the NGS reads, 16 mutations were confirmed using Sanger sequencing. The methodology presented here demonstrates the utility of Ion Torrent technology in detecting mutation variants in specific genome regions for large populations of a species such as flax. The methodology could be scaled-up to test >100 genes using the higher capacity chips now available from Ion Torrent.
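The core computational step described above is scanning per-position base counts from the pooled amplicons for non-reference alleles above a detection threshold (>0.3% in the pilot). The sketch below illustrates that filtering step only; the function name, pool depths, and the minimum-coverage cutoff are hypothetical, and this is not the authors' pipeline.

```python
# Minimal sketch (not the authors' pipeline): flag candidate EMS-induced
# mutations in pooled amplicon sequencing from per-position base counts.
# Pool depths, counts, and the coverage cutoff are illustrative.

def candidate_variants(base_counts, reference, min_freq=0.003, min_reads=500):
    """base_counts: list of dicts {base: count}, one per amplicon position.
    reference: reference base per position. Returns (position, base, frequency)."""
    hits = []
    for pos, (counts, ref) in enumerate(zip(base_counts, reference)):
        depth = sum(counts.values())
        if depth < min_reads:            # skip poorly covered positions
            continue
        for base, n in counts.items():
            if base != ref and n / depth >= min_freq:
                hits.append((pos, base, n / depth))
    return hits

# Hypothetical pileup over three positions of one amplicon
pileup = [{"A": 11940, "G": 60}, {"C": 12000}, {"T": 11800, "C": 45}]
ref = ["A", "C", "T"]
print(candidate_variants(pileup, ref))   # positions 0 and 2 exceed the 0.3% threshold
```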
Jing, Liang; Chen, Bing; Wen, Diya; Zheng, Jisi; Zhang, Baiyu
2017-12-01
This study shed light on removing atrazine from pesticide production wastewater using a pilot-scale UV/O₃/ultrasound flow-through system. A significant quadratic polynomial prediction model with an adjusted R² of 0.90 was obtained from central composite design with response surface methodology. The optimal atrazine removal rate (97.68%) was obtained at the conditions of 75 W UV power, 10.75 g h⁻¹ O₃ flow rate and 142.5 W ultrasound power. A Monte Carlo simulation-aided artificial neural network model was further developed to quantify the importance of O₃ flow rate (40%), UV power (30%) and ultrasound power (30%). Their individual and interaction effects were also discussed in terms of reaction kinetics. UV and ultrasound could both enhance the decomposition of O₃ and promote hydroxyl radical (OH·) formation. Nonetheless, the dose of O₃ was the dominant factor and must be optimized because excess O₃ can react with OH·, thereby reducing the rate of atrazine degradation. The presence of other organic compounds in the background matrix appreciably inhibited the degradation of atrazine, while the effects of Cl⁻, CO₃²⁻ and HCO₃⁻ were comparatively negligible. It was concluded that the optimization of system performance using response surface methodology and neural networks would be beneficial for scaling up the treatment by UV/O₃/ultrasound at the industrial level. Copyright © 2017 Elsevier Ltd. All rights reserved.
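The response-surface step above amounts to fitting a full quadratic polynomial in the coded factor levels and reporting an adjusted R². The sketch below shows that fit with ordinary least squares on synthetic data; the factor names, levels, and coefficients are invented for illustration and are not the study's data.

```python
import numpy as np

# Illustrative quadratic response-surface fit (synthetic data, not the paper's):
# y = b0 + sum(bi*xi) + sum(bii*xi^2) + sum(bij*xi*xj), for three coded factors.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(20, 3))               # coded UV, O3, ultrasound levels
y = 90 + 3*X[:, 0] + 5*X[:, 1] + 2*X[:, 2] - 4*X[:, 1]**2 + rng.normal(0, 0.5, 20)

def quad_design(X):
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)] \
         + [X[:, i]**2 for i in range(k)] \
         + [X[:, i]*X[:, j] for i in range(k) for j in range(i + 1, k)]
    return np.column_stack(cols)

D = quad_design(X)
beta, *_ = np.linalg.lstsq(D, y, rcond=None)
resid = y - D @ beta
ss_res, ss_tot = np.sum(resid**2), np.sum((y - y.mean())**2)
n, p = D.shape
r2 = 1 - ss_res / ss_tot
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p)          # adjusted R^2, as reported in RSM studies
print(f"R2 = {r2:.3f}, adjusted R2 = {adj_r2:.3f}")
```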
Garcia-Ortega, Xavier; Reyes, Cecilia; Montesinos, José Luis; Valero, Francisco
2015-01-01
The most commonly used cell disruption procedures may present a lack of reproducibility, which introduces significant errors in the quantification of intracellular components. In this work, an approach consisting of the definition of an overall key performance indicator (KPI) was implemented for a lab-scale high-pressure homogenizer (HPH) in order to determine the disruption settings that allow the reliable quantification of a wide range of intracellular components. This innovative KPI was based on the combination of three independent reporting indicators: decrease of absorbance, release of total protein, and release of alkaline phosphatase activity. The yeast Pichia pastoris growing on methanol was selected as the model microorganism because it presents an important widening of the cell wall, requiring more severe methods and operating conditions than Escherichia coli and Saccharomyces cerevisiae. From the outcome of the reporting indicators, the cell disruption efficiency achieved using HPH was about fourfold higher than that of other standard lab cell disruption methodologies, such as bead milling and cell permeabilization. This approach was also applied to a pilot-plant-scale HPH, validating the methodology in a scale-up of the disruption process. This straightforward approach, developed to evaluate the efficacy of a disruption procedure or equipment, can be easily applied to optimize the most common disruption processes, in order to reach not only reliable quantification but also recovery of intracellular components from cell factories of interest.
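The abstract states that the overall KPI combines three reporting indicators but does not give the combination rule. The sketch below assumes a simple mean of indicators normalized to their best observed values; the function name, reference values, and normalization scheme are assumptions for illustration only.

```python
# Sketch of an overall disruption KPI (assumed here to be the mean of three
# indicators, each normalised to its best observed value; the paper's exact
# combination rule is not stated in the abstract).
def disruption_kpi(abs_decrease, protein_release, alp_activity,
                   ref_abs, ref_protein, ref_alp):
    indicators = (abs_decrease / ref_abs,
                  protein_release / ref_protein,
                  alp_activity / ref_alp)
    return sum(indicators) / len(indicators)

# Hypothetical values for one homogenizer setting vs. the best setting observed
print(disruption_kpi(0.42, 8.1, 950.0, ref_abs=0.45, ref_protein=9.0, ref_alp=1000.0))
```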
A demand-centered, hybrid life-cycle methodology for city-scale greenhouse gas inventories.
Ramaswami, Anu; Hillman, Tim; Janson, Bruce; Reiner, Mark; Thomas, Gregg
2008-09-01
Greenhouse gas (GHG) accounting for individual cities is confounded by spatial scale and boundary effects that impact the allocation of regional material and energy flows. This paper develops a demand-centered, hybrid life-cycle-based methodology for conducting city-scale GHG inventories that incorporates (1) spatial allocation of surface and airline travel across colocated cities in larger metropolitan regions, and (2) life-cycle assessment (LCA) to quantify the embodied energy of key urban materials: food, water, fuel, and concrete. The hybrid methodology enables cities to separately report the GHG impact associated with direct end-use of energy by cities (consistent with EPA and IPCC methods), as well as the impact of extra-boundary activities such as air travel and production of key urban materials (consistent with Scope 3 protocols recommended by the World Resources Institute). Application of this hybrid methodology to Denver, Colorado, yielded a more holistic GHG inventory that approaches a GHG footprint computation, with consistency of inclusions across spatial scale as well as convergence of city-scale per capita GHG emissions (approximately 25 mt CO2e/person/year) with state and national data. The method is shown to have significant policy impacts, and also demonstrates the utility of benchmarks in understanding energy use in various city sectors.
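The accounting structure described above adds extra-boundary, Scope-3-style items to direct end-use emissions before normalizing per capita. The sketch below shows that bookkeeping; every figure and category split is invented for illustration and does not reproduce the Denver inventory.

```python
# Structure of a demand-centered hybrid inventory (illustrative figures only,
# not Denver's data): direct end-use emissions plus extra-boundary items.
direct_end_use = {            # kt CO2e/yr, reported as in EPA/IPCC methods
    "electricity": 6500, "natural_gas": 2200, "on_road_fuel": 4100,
}
extra_boundary = {            # Scope-3-style items
    "allocated_air_travel": 1200,
    "embodied_food": 900, "embodied_water": 150,
    "embodied_fuel_upstream": 700, "embodied_concrete": 300,
}
population = 600_000          # hypothetical city population

total = sum(direct_end_use.values()) + sum(extra_boundary.values())
print(f"total: {total} kt CO2e/yr, per capita: {1000 * total / population:.1f} t CO2e")
```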
Polańska, Kinga; Hanke, Wojciech; Król, Anna; Potocka, Adrianna; Waszkowska, Małgorzata; Jacukowicz, Aleksandra; Gromadzińska, Jolanta; Wąsowicz, Wojciech; Jerzyńska, Joanna; Stelmach, Włodzimierz; Stelmach, Iwona
2016-11-18
The effects of environmental exposures in utero and in the first years of life on early-life health and development are a growing research area with major public health implications. The main aim of this work has been to provide an overview of the next step of the Polish Mother and Child Cohort Study (REPRO_PL), covering exposure, health and neurodevelopment assessments of children at 7 years of age. Details regarding the methodology of the follow-up of the children are crucial for cross-cohort collaboration and a full understanding of the future research questions. Phase III of the REPRO_PL cohort covers a follow-up of 900 children at the age of 7 years. The questionnaire filled in by the mothers is composed of socio-demographic, child exposure and home environment information, and nutritional status and health data. In the case of 400 children, environmental (including collection of urine, saliva and buccal cells), health status and psychomotor assessments are performed. The health and development check consists of physical measurements, child health status assessment (including lung function tests, skin prick testing, an interview/examination by an allergist) and psychomotor development tests (the Strength and Difficulties Questionnaire and the Intelligence and Development Scales). The results of the study will become available within the next few years. Extension of the REPRO_PL cohort with examinations of children at the age of 7 years may provide a better understanding of the relationship between environmental and lifestyle-related factors and children's health and neurodevelopment, and may further strengthen the scientific base for policies and interventions promoting a healthy lifestyle. Int J Occup Med Environ Health 2016;29(6):883-893. This work is available in the Open Access model and licensed under a CC BY-NC 3.0 PL license.
Kayal, Arivudainambi; Mohan, Viswanathan; Malanda, Belma; Anjana, Ranjit Mohan; Bhavadharini, Balaji; Mahalakshmi, Manni Mohanraj; Maheswari, Kumar; Uma, Ram; Unnikrishnan, Ranjit; Kalaiyarasi, Gunasekaran; Ninov, Lyudmil; Belton, Anne
2016-01-01
Aim: The Women In India with GDM Strategy (WINGS) project was conducted with the aim of developing a model of care (MOC) suitable for women with gestational diabetes mellitus (GDM) in low- and middle-income countries. Methodology: The WINGS project was carried out in Chennai, Southern India, in two phases. In Phase I, a situational analysis was conducted to understand the practice patterns of health-care professionals and to determine the best screening criteria through a pilot screening study. Results: Phase II involved developing a MOC based on findings from the situational analysis and evaluating its effectiveness. The model focused on diagnosis, management, and follow-up of women with GDM who were followed prospectively throughout their pregnancy. An educational booklet was provided to all women with GDM, offering guidance on self-management of GDM including sample meal plans and physical activity tips. A pedometer was provided to all women to monitor step count. Medical nutrition therapy (MNT) was the first line of treatment given to women with GDM. Women were advised to undergo fasting blood glucose and postprandial blood glucose testing every fortnight. Insulin was indicated when the target blood glucose levels were not achieved with MNT. Women were evaluated for pregnancy outcomes and postpartum glucose tolerance status. Conclusions: The WINGS MOC offers a comprehensive package at every level of care for women with GDM. If successful, this MOC will be scaled up to other resource-constrained settings with the hope of improving the lives of women with GDM. PMID:27730085
ERIC Educational Resources Information Center
Rutherford, Teomara; Kibrick, Melissa; Burchinal, Margaret; Richland, Lindsey; Conley, AnneMarie; Osborne, Keara; Schneider, Stephanie; Duran, Lauren; Coulson, Andrew; Antenore, Fran; Daniels, Abby; Martinez, Michael E.
2010-01-01
This paper describes the background, methodology, preliminary findings, and anticipated future directions of a large-scale multi-year randomized field experiment addressing the efficacy of ST Math [Spatial-Temporal Math], a fully-developed math curriculum that uses interactive animated software. ST Math's unique approach minimizes the use of…
Managing design for manufacture and assembly in the development of MEMS-based products
NASA Astrophysics Data System (ADS)
Hsu, Hung-Yao; Narasimhan, Nachchinarkkinian; Hariz, Alex J.
2006-12-01
Design for manufacturability, assembly and reliability of MEMS products is being applied to a multitude of novel MEMS products to make up for the lack of a "Standard Process for MEMS" concept. The latter has proved a major handicap in the commercialization of MEMS devices compared to integrated circuit products. Furthermore, an examination of recent engineering literature seems to suggest convergence towards the development of the design for manufacturability and reliability of MEMS products. This paper will highlight the advantages and disadvantages of conventional techniques that have been pursued up to this point to achieve commercialization of MEMS products, identify some of the problems slowing down development, and explore measures that could be taken to address those problems. Successful commercialization critically depends on packaging and assembly, manufacturability, and reliability for micro-scale products. However, a methodology that appropriately shadows next-generation knowledge management will undoubtedly address most of the critical problems that are hampering the development of MEMS industries. Finally, this paper will also identify contemporary issues that are challenging the industry with regard to product commercialization and will recommend appropriate measures based on knowledge flow to address those shortcomings and lay out plans for expedient and successful paths to market.
Adaptive Multi-scale PHM for Robotic Assembly Processes
Choo, Benjamin Y.; Beling, Peter A.; LaViers, Amy E.; Marvel, Jeremy A.; Weiss, Brian A.
2017-01-01
Adaptive multiscale prognostics and health management (AM-PHM) is a methodology designed to support PHM in smart manufacturing systems. As a rule, PHM information is not used in high-level decision-making in manufacturing systems. AM-PHM leverages and integrates component-level PHM information with hierarchical relationships across the component, machine, work cell, and production line levels in a manufacturing system. The AM-PHM methodology enables the creation of actionable prognostic and diagnostic intelligence up and down the manufacturing process hierarchy. Decisions are made with the knowledge of the current and projected health state of the system at decision points along the nodes of the hierarchical structure. A description of the AM-PHM methodology with a simulated canonical robotic assembly process is presented. PMID:28664161
Building laboratory capacity to support HIV care in Nigeria: Harvard/APIN PEPFAR, 2004-2012.
Hamel, Donald J; Sankalé, Jean-Louis; Samuels, Jay Osi; Sarr, Abdoulaye D; Chaplin, Beth; Ofuche, Eke; Meloni, Seema T; Okonkwo, Prosper; Kanki, Phyllis J
From 2004-2012, the Harvard/AIDS Prevention Initiative in Nigeria, funded through the US President's Emergency Plan for AIDS Relief programme, scaled up HIV care and treatment services in Nigeria. We describe the methodologies and collaborative processes developed to improve laboratory capacity significantly in a resource-limited setting. These methods were implemented at 35 clinic and laboratory locations. Systems were established and modified to optimise numerous laboratory processes. These included strategies for clinic selection and management, equipment and reagent procurement, supply chains, laboratory renovations, equipment maintenance, electronic data management, quality development programmes and trainings. Over the eight-year programme, laboratories supported 160 000 patients receiving HIV care in Nigeria, delivering over 2.5 million test results, including regular viral load quantitation. External quality assurance systems were established for CD4+ cell count enumeration, blood chemistries and viral load monitoring. Laboratory equipment platforms were improved and standardised and use of point-of-care analysers was expanded. Laboratory training workshops supported laboratories toward increasing staff skills and improving overall quality. Participation in a World Health Organisation-led African laboratory quality improvement system resulted in significant gains in quality measures at five laboratories. Targeted implementation of laboratory development processes, during simultaneous scale-up of HIV treatment programmes in a resource-limited setting, can elicit meaningful gains in laboratory quality and capacity. Systems to improve the physical laboratory environment, develop laboratory staff, create improvements to reduce costs and increase quality are available for future health and laboratory strengthening programmes. We hope that the strategies employed may inform and encourage the development of other laboratories in resource-limited settings.
The Scaling of Performance and Losses in Miniature Internal Combustion Engines
2010-01-01
...reliable measurements of engine performance and losses in these small engines. Methodologies are also developed for measuring volumetric, heat transfer ... the most important challenge as it accounts for 60-70% of total energy losses. Combustion losses are followed in order of importance by heat transfer...
The Development of Methodologies for Determining Non-Linear Effects in Infrasound Sensors
Hart, Darren M.; Parks, Harold V.; Rembold, Randy K.
2010-09-01
...the past year, four new infrasound sensor designs were evaluated for common performance characteristics, i.e., power consumption, response (amplitude and phase), noise, full-scale, and dynamic range. In the process of evaluating a fifth infrasound sensor, which is an update of an original design...
Harte, Richard; Glynn, Liam; Rodríguez-Molinero, Alejandro; Baker, Paul Ma; Scharf, Thomas; Quinlan, Leo R; ÓLaighin, Gearóid
2017-03-16
Design processes such as human-centered design, which involve the end user throughout the product development and testing process, can be crucial in ensuring that the product meets the needs and capabilities of the user, particularly in terms of safety and user experience. The structured and iterative nature of human-centered design can often present a challenge when design teams are faced with the necessary, rapid, product development life cycles associated with the competitive connected health industry. We wanted to derive a structured methodology that followed the principles of human-centered design that would allow designers and developers to ensure that the needs of the user are taken into account throughout the design process, while maintaining a rapid pace of development. In this paper, we present the methodology and its rationale before outlining how it was applied to assess and enhance the usability, human factors, and user experience of a connected health system known as the Wireless Insole for Independent and Safe Elderly Living (WIISEL) system, a system designed to continuously assess fall risk by measuring gait and balance parameters associated with fall risk. We derived a three-phase methodology. In Phase 1 we emphasized the construction of a use case document. This document can be used to detail the context of use of the system by utilizing storyboarding, paper prototypes, and mock-ups in conjunction with user interviews to gather insightful user feedback on different proposed concepts. In Phase 2 we emphasized the use of expert usability inspections such as heuristic evaluations and cognitive walkthroughs with small multidisciplinary groups to review the prototypes born out of the Phase 1 feedback. Finally, in Phase 3 we emphasized classical user testing with target end users, using various metrics to measure the user experience and improve the final prototypes. We report a successful implementation of the methodology for the design and development of a system for detecting and predicting falls in older adults. We describe in detail what testing and evaluation activities we carried out to effectively test the system and overcome usability and human factors problems. We feel this methodology can be applied to a wide variety of connected health devices and systems. We consider this a methodology that can be scaled to different-sized projects accordingly. ©Richard Harte, Liam Glynn, Alejandro Rodríguez-Molinero, Paul MA Baker, Thomas Scharf, Leo R Quinlan, Gearóid ÓLaighin. Originally published in JMIR Human Factors (http://humanfactors.jmir.org), 16.03.2017.
Method for the Direct Solve of the Many-Body Schrödinger Wave Equation
NASA Astrophysics Data System (ADS)
Jerke, Jonathan; Tymczak, C. J.; Poirier, Bill
We report on theoretical and computational developments towards a computationally efficient direct solve of the many-body Schrödinger wave equation for electronic systems. This methodology relies on two recent developments pioneered by the authors: 1) the development of a Cardinal Sine basis for electronic structure calculations; and 2) the development of a highly efficient and compact representation of multidimensional functions using the Canonical tensor rank representation developed by Beylkin et al., which we have adapted to electronic structure problems. We then show several relevant examples of the utility and accuracy of this methodology, its scaling with system size, and relevant convergence issues.
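The cardinal sine (sinc) basis named above is, in its simplest one-dimensional form, the Whittaker cardinal expansion on a uniform grid. The sketch below illustrates only that basis idea on a smooth test function; it is not the authors' electronic-structure implementation, and the grid spacing and test function are arbitrary choices.

```python
import numpy as np

# Whittaker cardinal (sinc) expansion on a uniform grid: a sufficiently
# band-limited function is recovered as f(x) ~ sum_j f(x_j) * sinc((x - x_j)/h).
# Purely illustrative of the basis; not the authors' electronic-structure code.
h = 0.5
grid = np.arange(-20, 20 + h, h)                 # uniform collocation points
f = lambda x: np.exp(-x**2 / 2.0)                # smooth, rapidly decaying test function

def sinc_interp(x, grid, fvals, h):
    # np.sinc is the normalized sinc, sin(pi*x)/(pi*x), which matches spacing h
    return np.sum(fvals * np.sinc((x - grid) / h))

fvals = f(grid)
x0 = 0.731
print(sinc_interp(x0, grid, fvals, h), f(x0))    # the two values agree to high accuracy
```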
Emerging freeze-drying process development and scale-up issues.
Patel, Sajal Manubhai; Pikal, Michael J
2011-03-01
Although several guidelines do exist for freeze-drying process development and scale-up, there are still a number of issues that require additional attention. The objective of this review article is to discuss some emerging process development and scale-up issues, with emphasis on the effect of load condition and freeze-drying in novel container systems such as syringes, Lyoguard trays, ampoules, and 96-well plates. Understanding the heat and mass transfer under different load conditions and for freeze-drying in these novel container systems will help in developing a robust freeze-drying process that is also easier to scale up. Further research and development needs in these emerging areas have also been addressed. © 2011 American Association of Pharmaceutical Scientists
Teacher Self-Efficacy and Occupational Stress: A Major Australian Curriculum Reform Revisited
ERIC Educational Resources Information Center
McCormick, John; Ayres, Paul L.
2009-01-01
Purpose: The purpose of this research was to study teachers' self-efficacy and occupational stress in the context of a large-scale curriculum reform in New South Wales, Australia. The study aims to follow up and replicate a study carried out approximately one year earlier. Design/methodology/approach: A theoretical framework, primarily based on…
Increases in Global and Domain Specific Self-Esteem Following a 10 Day Developmental Voyage
ERIC Educational Resources Information Center
Grocott, Andrew C.; Hunter, John A.
2009-01-01
Although positive effects are often reported, research assessing the impact of Adventure Education and Outward Bound programmes on self-esteem is fraught with methodological weaknesses pertaining to an emphasis on scales assessing global self-esteem, a lack of follow-up measures to assess the potential long-term benefits of such programmes and…
Scaling dimensions in spectroscopy of soil and vegetation
NASA Astrophysics Data System (ADS)
Malenovský, Zbyněk; Bartholomeus, Harm M.; Acerbi-Junior, Fausto W.; Schopfer, Jürg T.; Painter, Thomas H.; Epema, Gerrit F.; Bregt, Arnold K.
2007-05-01
The paper revises and clarifies definitions of the term scale and scaling conversions for imaging spectroscopy of soil and vegetation. We demonstrate a new four-dimensional scale concept that includes not only spatial but also the spectral, directional and temporal components. Three scaling remote sensing techniques are reviewed: (1) radiative transfer, (2) spectral (un)mixing, and (3) data fusion. Relevant case studies are given in the context of their up- and/or down-scaling abilities over the soil/vegetation surfaces and a multi-source approach is proposed for their integration. Radiative transfer (RT) models are described to show their capacity for spatial, spectral up-scaling, and directional down-scaling within a heterogeneous environment. Spectral information and spectral derivatives, like vegetation indices (e.g. TCARI/OSAVI), can be scaled and even tested by their means. Radiative transfer of an experimental Norway spruce ( Picea abies (L.) Karst.) research plot in the Czech Republic was simulated by the Discrete Anisotropic Radiative Transfer (DART) model to prove relevance of the correct object optical properties scaled up to image data at two different spatial resolutions. Interconnection of the successive modelling levels in vegetation is shown. A future development in measurement and simulation of the leaf directional spectral properties is discussed. We describe linear and/or non-linear spectral mixing techniques and unmixing methods that demonstrate spatial down-scaling. Relevance of proper selection or acquisition of the spectral endmembers using spectral libraries, field measurements, and pure pixels of the hyperspectral image is highlighted. An extensive list of advanced unmixing techniques, a particular example of unmixing a reflective optics system imaging spectrometer (ROSIS) image from Spain, and examples of other mixture applications give insight into the present status of scaling capabilities. Simultaneous spatial and temporal down-scaling by means of a data fusion technique is described. A demonstrative example is given for the moderate resolution imaging spectroradiometer (MODIS) and LANDSAT Thematic Mapper (TM) data from Brazil. Corresponding spectral bands of both sensors were fused via a pyramidal wavelet transform in Fourier space. New spectral and temporal information of the resultant image can be used for thematic classification or qualitative mapping. All three described scaling techniques can be integrated as the relevant methodological steps within a complex multi-source approach. We present this concept of combining numerous optical remote sensing data and methods to generate inputs for ecosystem process models.
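Of the three scaling techniques reviewed above, spectral unmixing has the most compact algorithmic core: a mixed pixel spectrum is expressed as a constrained linear combination of endmember spectra. The sketch below shows a non-negative least-squares unmixing of one pixel with a simple sum-to-one renormalisation; the endmember spectra and fractions are hypothetical, not library or ROSIS values.

```python
import numpy as np
from scipy.optimize import nnls

# Linear spectral unmixing sketch: a mixed pixel spectrum is modelled as a
# non-negative combination of endmember spectra. The 5-band soil, vegetation
# and shade endmembers below are hypothetical.
E = np.array([[0.30, 0.35, 0.40, 0.45, 0.50],     # soil
              [0.05, 0.08, 0.06, 0.45, 0.50],     # green vegetation
              [0.02, 0.02, 0.02, 0.03, 0.03]]).T  # shade; shape (bands, endmembers)

true_fractions = np.array([0.5, 0.4, 0.1])
pixel = E @ true_fractions + 0.002 * np.random.default_rng(1).normal(size=5)

fractions, resid = nnls(E, pixel)                 # non-negativity enforced
fractions /= fractions.sum()                      # simple sum-to-one renormalisation
print(fractions)                                  # close to the true abundances
```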
NASA Astrophysics Data System (ADS)
Moradi, Saed; Moallem, Payman; Sabahi, Mohamad Farzan
2018-03-01
False alarm rate and detection rate are still two contradictory metrics for infrared small target detection in an infrared search and track (IRST) system, despite the development of new detection algorithms. In certain circumstances, not detecting true targets is more tolerable than detecting false items as true targets. Hence, considering background clutter and detector noise as the sources of false alarms in an IRST system, in this paper a false-alarm-aware methodology is presented to reduce the false alarm rate while the detection rate remains undegraded. To this end, the advantages and disadvantages of each detection algorithm are investigated and the sources of the false alarms are determined. Two target detection algorithms having independent false alarm sources are chosen in such a way that the disadvantages of one algorithm can be compensated by the advantages of the other. In this work, multi-scale average absolute gray difference (AAGD) and Laplacian of point spread function (LoPSF) are utilized as the cornerstones of the desired algorithm of the proposed methodology. After presenting a conceptual model for the desired algorithm, it is implemented through the most straightforward mechanism. The desired algorithm effectively suppresses background clutter and eliminates detector noise. Also, since the input images are processed through just four different scales, the desired algorithm has good capability for real-time implementation. Simulation results in terms of signal-to-clutter ratio and background suppression factor on real and simulated images prove the effectiveness and the performance of the proposed methodology. Since the desired algorithm was developed based on independent false alarm sources, our proposed methodology is expandable to any pair of detection algorithms that have different false alarm sources.
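A simplified version of the multi-scale gray-difference idea used above is easy to illustrate: compare a small local mean against a larger neighbourhood mean at several window sizes and keep the strongest response. The sketch below is only that simplification; it is not the paper's exact AAGD operator or its fusion with LoPSF, and the window sizes and toy frame are arbitrary.

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Simplified multi-scale "average absolute gray difference" map (illustrative,
# not the exact AAGD/LoPSF combination of the paper): at each scale the local
# mean over a small window is compared with the mean over a larger window,
# and the per-pixel maximum across scales is kept as a target-saliency map.
def multi_scale_aagd(image, scales=(3, 5, 7, 9)):
    image = image.astype(float)
    maps = []
    for s in scales:
        inner = uniform_filter(image, size=s)
        outer = uniform_filter(image, size=3 * s)   # larger neighbourhood (includes inner)
        maps.append(np.abs(inner - outer))
    return np.max(maps, axis=0)

# Toy frame: flat background plus one dim point-like target
frame = np.full((64, 64), 20.0)
frame[32, 40] += 30.0
saliency = multi_scale_aagd(frame)
print(np.isclose(saliency[32, 40], saliency.max()))  # True: the target pixel attains the maximum
```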
Applying systems engineering methodologies to the micro- and nanoscale realm
NASA Astrophysics Data System (ADS)
Garrison Darrin, M. Ann
2012-06-01
Micro-scale and nano-scale technology developments have the potential to revolutionize smart and small systems. The application of systems engineering methodologies that integrate standalone, small-scale technologies and interface them with macro technologies to build useful systems is critical to realizing the potential of these technologies. This paper covers the expanding knowledge base on systems engineering principles for micro and nano technology integration, starting with a discussion of the drivers for applying a systems approach. Technology development on the micro and nano scale has transitioned from laboratory curiosity to the realization of products in the health, automotive, aerospace, communication, and numerous other arenas. This paper focuses on the maturity (or lack thereof) of the field of nanosystems, which is emerging in a third generation having transitioned from completing active structures to creating systems. The emphasis of applying a systems approach focuses on successful technology development based on the lack of maturity of current nano-scale systems. Therefore, the discussion includes details relating to enabling roles such as product systems engineering and technology development. Classical roles such as acquisition systems engineering are not covered. The results are also targeted towards small-scale technology developers, who need to take into account systems engineering processes such as requirements definition, verification and validation, interface management, and risk management in the concept phase of technology development in order to maximize the likelihood of delivering successful, cost-effective micro and nano technology that increases the capability of emerging deployed systems and supports long-term growth and profits.
Probabilistic Simulation of Multi-Scale Composite Behavior
NASA Technical Reports Server (NTRS)
Chamis, Christos C.
2012-01-01
A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties, laminate and structural responses. Bi-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in composite typical laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatters predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and to simulate the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness--a unique feature of structural composites. The old reference denotes that nothing fundamental has been done since that time.
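The abstract compares its probabilistic methodology against brute-force Monte Carlo simulation. The sketch below shows what that baseline looks like for one ply property, propagating fiber/matrix scatter through the rule of mixtures; the distributions, values, and the rule-of-mixtures model are illustrative and are not the PICAN/IPACS formulation.

```python
import numpy as np

# Brute-force Monte Carlo propagation of constituent (fiber/matrix) scatter to a
# ply property, here the longitudinal modulus via the rule of mixtures
# E1 = Vf*Ef + (1 - Vf)*Em. Values and distributions are illustrative only; the
# paper's codes compute such distributions far more efficiently than sampling.
rng = np.random.default_rng(42)
n = 100_000
Ef = rng.normal(230.0, 10.0, n)     # fiber modulus, GPa
Em = rng.normal(3.5, 0.2, n)        # matrix modulus, GPa
Vf = rng.normal(0.60, 0.02, n)      # fiber volume fraction

E1 = Vf * Ef + (1.0 - Vf) * Em
print(f"mean E1 = {E1.mean():.1f} GPa, std = {E1.std():.1f} GPa")
print("2.5/97.5 percentiles:", np.percentile(E1, [2.5, 97.5]).round(1))
```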
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bostelmann, Friederike; Strydom, Gerhard; Reitsma, Frederik
2016-01-11
The quantification of uncertainties in the design and safety analysis of reactors is today not only broadly accepted, but in many cases has become the preferred way to replace traditional conservative analysis for safety and licensing analysis. The use of a more fundamental methodology is also consistent with the reliable high-fidelity physics models and robust, efficient, and accurate codes available today. To facilitate uncertainty analysis applications, a comprehensive approach and methodology must be developed and applied, in contrast to the historical approach in which sensitivity analyses were performed and uncertainties were then determined by a simplified statistical combination of a few important input parameters. New methodologies are currently under development in the OECD/NEA Light Water Reactor (LWR) Uncertainty Analysis in Best-Estimate Modelling (UAM) benchmark activity. High Temperature Gas-cooled Reactor (HTGR) designs require specific treatment of the double heterogeneous fuel design and large graphite quantities at high temperatures. The IAEA has therefore launched a Coordinated Research Project (CRP) on HTGR Uncertainty Analysis in Modelling (UAM) in 2013 to study uncertainty propagation specifically in the HTGR analysis chain. Two benchmark problems are defined, with the prismatic design represented by the General Atomics (GA) MHTGR-350 and a 250 MW modular pebble bed design similar to the Chinese HTR-PM. Work has started on the first phase and the current CRP status is reported in the paper. A comparison of the Serpent and SCALE/KENO-VI reference Monte Carlo results for Ex. I-1 of the MHTGR-350 design is also included. It was observed that the SCALE/KENO-VI Continuous Energy (CE) k∞ values were 395 pcm (Ex. I-1a) to 803 pcm (Ex. I-1b) higher than the respective Serpent lattice calculations, and that within the set of the SCALE results, the KENO-VI 238 Multi-Group (MG) k∞ values were up to 800 pcm lower than the KENO-VI CE values. The use of the latest ENDF-B-VII.1 cross section library in Serpent led to ~180 pcm lower k∞ values compared to the older ENDF-B-VII.0 dataset, caused by the modified graphite neutron capture cross section. Furthermore, the fourth beta release of SCALE 6.2 likewise produced lower CE k∞ values when compared to SCALE 6.1, and the improved performance of the new 252-group library available in SCALE 6.2 is especially noteworthy. A SCALE/TSUNAMI uncertainty analysis of the Hot Full Power variant for Ex. I-1a furthermore concluded that the 238U(n,γ) (capture) and 235U cross-section covariance matrices contributed the most to the total k∞ uncertainty of 0.58%.
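Two small conventions used in comparisons like the one above are easy to make explicit: expressing code-to-code multiplication-factor differences in pcm and combining independent covariance contributions in quadrature. The sketch below assumes 1 pcm = 1e-5 in k and invents the per-reaction contributions; it does not reproduce the benchmark's numbers.

```python
import math

# Two illustrations of conventions assumed here (not taken from the paper):
# (1) a code-to-code k difference expressed in pcm, taking 1 pcm = 1e-5 in k;
# (2) quadrature combination of independent covariance contributions into a
#     total k-infinity uncertainty.
def diff_pcm(k_a, k_b):
    return (k_a - k_b) * 1e5

print(round(diff_pcm(1.06395, 1.06000)))        # -> 395 pcm for a 0.00395 difference in k

# Hypothetical per-reaction contributions (%); only the quadrature sum matters here
contributions = {"U-238 capture": 0.42, "U-235": 0.33, "all others": 0.21}
total = math.sqrt(sum(c**2 for c in contributions.values()))
print(f"total k-inf uncertainty ~ {total:.2f}%")
```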
Fajans, Peter; Simmons, Ruth; Ghiron, Laura
2006-03-01
Public sector health systems that provide services to poor and marginalized populations in developing countries face great challenges. Change associated with health sector reform and structural adjustment often leaves these already-strained institutions with fewer resources and insufficient capacity to relieve health burdens. The Strategic Approach to Strengthening Reproductive Health Policies and Programs is a methodological innovation developed by the World Health Organization and its partners to help countries identify and prioritize their reproductive health service needs, test appropriate interventions, and scale up successful innovations to a subnational or national level. The participatory, interdisciplinary, and country-owned process can set in motion much-needed change. We describe key features of this approach, provide illustrations from country experiences, and use insights from the diffusion of innovation literature to explain the approach's dissemination and sustainability.
Millions Learning: Scaling up Quality Education in Developing Countries
ERIC Educational Resources Information Center
Robinson, Jenny Perlman; Winthrop, Rebecca
2016-01-01
"Millions Learning: Scaling up Quality Education in Developing Countries" tells the story of where and how quality education has scaled in low- and middle-income countries. The story emerges from wide-ranging research on scaling and learning, including 14 in-depth case studies from around the globe. Ultimately, "Millions…
Wycisk, Peter; Stollberg, Reiner; Neumann, Christian; Gossel, Wolfgang; Weiss, Holger; Weber, Roland
2013-04-01
A large-scale groundwater contamination characterises the Pleistocene groundwater system of the former industrial and abandoned mining region Bitterfeld/Wolfen, Eastern Germany. For more than a century, local chemical production and extensive lignite mining caused a complex contaminant release from local production areas and related dump sites. Today, organic pollutants (mainly organochlorines) are present in all compartments of the environment at high concentration levels. An integrated methodology for characterising the current situation of pollution as well as the future fate development of hazardous substances is highly required to decide on further management and remediation strategies. Data analyses have been performed on regional groundwater monitoring data from about 10 years, containing approximately 3,500 samples, and up to 180 individual organic parameters from almost 250 observation wells. Run-off measurements as well as water samples were taken biweekly from local creeks during a period of 18 months. A kriging interpolation procedure was applied on groundwater analytics to generate continuous distribution patterns of the nodal contaminant samples. High-resolution geological 3-D modelling serves as a database for a regional 3-D groundwater flow model. Simulation results support the future fate assessment of contaminants. A first conceptual model of the contamination has been developed to characterise the contamination in regional surface waters and groundwater. A reliable explanation of the variant hexachlorocyclohexane (HCH) occurrence within the two local aquifer systems has been derived from the regionalised distribution patterns. Simulation results from groundwater flow modelling provide a better understanding of the future pollutant migration paths and support the overall site characterisation. The presented case study indicates that an integrated assessment of large-scale groundwater contaminations often needs more data than only from local groundwater monitoring. The developed methodology is appropriate to assess POP-contaminated mega-sites including, e.g. HCH deposits. Although HCH isomers are relevant groundwater pollutants at this site, further organochlorine pollutants are present at considerably higher levels. The study demonstrates that an effective evaluation of the current situation of contamination as well as of the related future fate development requires detailed information of the entire observed system.
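The regionalization step named above is kriging interpolation of the monitoring-well analytics. The sketch below is a minimal ordinary kriging estimator with an exponential variogram; the well coordinates, concentrations, and variogram parameters are hypothetical and do not reflect the Bitterfeld/Wolfen data or settings.

```python
import numpy as np

def exp_variogram(h, nugget=0.0, sill=1.0, rang=500.0):
    # Exponential variogram model gamma(h); parameters are illustrative
    return nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rang))

def ordinary_kriging(xy, z, xy0, **vario_kw):
    """Estimate z at location xy0 from sampled points (xy, z) by ordinary kriging."""
    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)   # pairwise distances
    K = np.empty((n + 1, n + 1))
    K[:n, :n] = exp_variogram(d, **vario_kw)
    K[n, :], K[:, n] = 1.0, 1.0
    K[n, n] = 0.0                                # unbiasedness (Lagrange multiplier) row/column
    d0 = np.linalg.norm(xy - xy0, axis=1)
    rhs = np.append(exp_variogram(d0, **vario_kw), 1.0)
    w = np.linalg.solve(K, rhs)
    return float(w[:n] @ z)                      # kriged estimate at xy0

# Hypothetical monitoring-well coordinates (m) and HCH concentrations (µg/L)
wells = np.array([[0., 0.], [100., 30.], [40., 120.], [160., 90.]])
conc = np.array([12.0, 3.5, 8.1, 1.2])
print(ordinary_kriging(wells, conc, np.array([80., 60.])))
```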
Fundamental Issues Concerning the Sustainment and Scaling Up of Professional Development Programs
ERIC Educational Resources Information Center
Tirosh, Dina; Tsamir, Pessia; Levenson, Esther
2015-01-01
The issue of sustaining and scaling up professional development for mathematics teachers raises several fundamental issues for researchers. This commentary addresses various definitions for sustainability and scaling up and how these definitions may affect the design of programs as well as the design of research. We consider four of the papers in…
Dickson, Kim E; Kinney, Mary V; Moxon, Sarah G; Ashton, Joanne; Zaka, Nabila; Simen-Kapeu, Aline; Sharma, Gaurav; Kerber, Kate J; Daelmans, Bernadette; Gülmezoglu, A; Mathai, Matthews; Nyange, Christabel; Baye, Martina; Lawn, Joy E
2015-01-01
The Every Newborn Action Plan (ENAP) and Ending Preventable Maternal Mortality targets cannot be achieved without high quality, equitable coverage of interventions at and around the time of birth. This paper provides an overview of the methodology and findings of a nine paper series of in-depth analyses which focus on the specific challenges to scaling up high-impact interventions and improving quality of care for mothers and newborns around the time of birth, including babies born small and sick. The bottleneck analysis tool was applied in 12 countries in Africa and Asia as part of the ENAP process. Country workshops engaged technical experts to complete a tool designed to synthesise "bottlenecks" hindering the scale up of maternal-newborn intervention packages across seven health system building blocks. We used quantitative and qualitative methods and literature review to analyse the data and present priority actions relevant to different health system building blocks for skilled birth attendance, emergency obstetric care, antenatal corticosteroids (ACS), basic newborn care, kangaroo mother care (KMC), treatment of neonatal infections and inpatient care of small and sick newborns. The 12 countries included in our analysis account for the majority of global maternal (48%) and newborn (58%) deaths and stillbirths (57%). Our findings confirm previously published results that the interventions with the most perceived bottlenecks are facility-based where rapid emergency care is needed, notably inpatient care of small and sick newborns, ACS, treatment of neonatal infections and KMC. Health systems building blocks with the highest rated bottlenecks varied for different interventions. Attention needs to be paid to the context specific bottlenecks for each intervention to scale up quality care. Crosscutting findings on health information gaps inform two final papers on a roadmap for improvement of coverage data for newborns and indicate the need for leadership for effective audit systems. Achieving the Sustainable Development Goal targets for ending preventable mortality and provision of universal health coverage will require large-scale approaches to improving quality of care. These analyses inform the development of systematic, targeted approaches to strengthening of health systems, with a focus on overcoming specific bottlenecks for the highest impact interventions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDaniel, Dwayne; Dulikravich, George; Cizmas, Paul
2017-11-27
This report summarizes the objectives, tasks, and accomplishments made during the three-year duration of this research project. The report presents the results obtained by applying advanced computational techniques to develop reduced-order models (ROMs) in the case of reacting multiphase flows, based on high-fidelity numerical simulation of gas-solids flow structures in risers and vertical columns obtained by the Multiphase Flow with Interphase eXchanges (MFIX) software. The research includes a numerical investigation of reacting and non-reacting gas-solids flow systems and computational analysis that will involve model development to accelerate the scale-up process for the design of fluidization systems by providing accurate solutions that match the full-scale models. The computational work contributes to the development of a methodology for obtaining ROMs that is applicable to the system of gas-solid flows. Finally, the validity of the developed ROMs is evaluated by comparing the results against those obtained using the MFIX code. Additionally, the robustness of existing POD-based ROMs for multiphase flows is improved by avoiding non-physical solutions of the gas void fraction and ensuring that the reduced kinetics models used for reactive flows in fluidized beds are thermodynamically consistent.
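The POD step underlying the ROMs mentioned above reduces to a singular value decomposition of a snapshot matrix followed by truncation to the leading modes. The sketch below shows only that generic step on synthetic snapshots; it is not the MFIX-specific workflow, and the energy threshold and field are arbitrary.

```python
import numpy as np

# Core step behind POD-based reduced-order models: collect solution snapshots
# as columns, take an SVD, and keep the leading modes as a reduced basis.
# Snapshots here are synthetic; MFIX-specific handling (e.g. keeping the gas
# void fraction physical) is not shown.
rng = np.random.default_rng(0)
n_cells, n_snap = 2000, 60
t = np.linspace(0.0, 1.0, n_snap)
x = np.linspace(0.0, 1.0, n_cells)[:, None]
snapshots = np.sin(2*np.pi*x) @ np.cos(2*np.pi*t)[None, :] \
          + 0.3 * np.sin(6*np.pi*x) @ np.sin(4*np.pi*t)[None, :] \
          + 0.01 * rng.normal(size=(n_cells, n_snap))

U, s, Vt = np.linalg.svd(snapshots, full_matrices=False)
energy = np.cumsum(s**2) / np.sum(s**2)
r = int(np.searchsorted(energy, 0.999) + 1)      # modes capturing 99.9% of the "energy"
basis = U[:, :r]
reduced = basis.T @ snapshots                    # reduced-order coordinates
reconstruction = basis @ reduced
err = np.linalg.norm(snapshots - reconstruction) / np.linalg.norm(snapshots)
print(f"{r} modes, relative reconstruction error {err:.2e}")
```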
Causal Attribution: A New Scale Developed to Minimize Existing Methodological Problems.
ERIC Educational Resources Information Center
Bull, Kay Sather; Feuquay, Jeffrey P.
In order to facilitate research on the construct of causal attribution, this paper details developmental procedures used to minimize previous deficiencies and proposes a new scale. The first version of the scale was in ipsative form and provided two basic sets of indices: (1) ability, effort, luck, and task difficulty indices in success and…
Large Uncertainties in Urban-Scale Carbon Emissions
NASA Astrophysics Data System (ADS)
Gately, C. K.; Hutyra, L. R.
2017-10-01
Accurate estimates of fossil fuel carbon dioxide (FFCO2) emissions are a critical component of local, regional, and global climate agreements. Current global inventories of FFCO2 emissions do not directly quantify emissions at local scales; instead, spatial proxies like population density, nighttime lights, and power plant databases are used to downscale emissions from national totals. We have developed a high-resolution (hourly, 1 km²) bottom-up Anthropogenic Carbon Emissions System (ACES) for FFCO2, based on local activity data for the year 2011 across the northeastern U.S. We compare ACES with three widely used global inventories, finding significant differences at regional (20%) and city scales (50-250%). At a spatial resolution of 0.1°, inventories differ by over 100% for half of the grid cells in the domain, with the largest differences in urban areas and oil and gas production regions. Given recent U.S. federal policy pullbacks regarding greenhouse gas emissions reductions, inventories like ACES are crucial for U.S. actions, as the impetus for climate leadership has shifted to city and state governments. The development of a robust carbon monitoring system to track carbon fluxes is critical for emissions benchmarking and verification. We show that existing downscaled inventories are not suitable for urban emissions monitoring, as they do not consider important local activity patterns. The ACES methodology is designed for easy updating, making it suitable for emissions monitoring under most city, regional, and state greenhouse gas mitigation initiatives, in particular for the small- and medium-sized cities that lack the resources to regularly perform their own bottom-up emissions inventories.
NASA Astrophysics Data System (ADS)
Malbéteau, Y.; Lopez, O.; Houborg, R.; McCabe, M.
2017-12-01
Agriculture places considerable pressure on water resources, with the relationship between water availability and food production being critical for sustaining population growth. Monitoring water resources is particularly important in arid and semi-arid regions, where irrigation can represent up to 80% of the consumptive uses of water. In this context, it is necessary to optimize on-farm irrigation management by adjusting irrigation to crop water requirements throughout the growing season. However, in situ point measurements are not routinely available over extended areas and may not be representative at the field scale. Remote sensing approaches present as a cost-effective technique for mapping and monitoring broad areas. By taking advantage of multi-sensor remote sensing methodologies, such as those provided by MODIS, Landsat, Sentinel and Cubesats, we propose a new method to estimate irrigation input at pivot-scale. Here we explore the development of crop-water use estimates via these remote sensing data and integrate them into a land surface modeling framework, using a farm in Saudi Arabia as a demonstration of what can be achieved at larger scales.
Nicholas S. Skowronski; Scott Haag; Jim Trimble; Kenneth L. Clark; Michael R. Gallagher; Richard G. Lathrop
2015-01-01
Large-scale fuel assessments are useful for developing policy aimed at mitigating wildfires in the wildland-urban interface (WUI), while finer-scale characterisation is necessary for maximising the effectiveness of fuel reduction treatments and directing suppression activities. We developed and tested an objective, consistent approach for characterising hazardous fuels...
Geometric scaling of artificial hair sensors for flow measurement under different conditions
NASA Astrophysics Data System (ADS)
Su, Weihua; Reich, Gregory W.
2017-03-01
Artificial hair sensors (AHSs) have been developed for prediction of the local flow speed and aerodynamic force around an airfoil and subsequent application in vibration control of the airfoil. Usually, a specific sensor design is only sensitive to the flow speeds within its operating flow measurement region. This paper aims at expanding this flow measurement concept of using AHSs to different flow speed conditions by properly sizing the parameters of the sensors, including the dimensions of the artificial hair, capillary, and carbon nanotubes (CNTs) that make up the sensor design, based on a baseline sensor design and its working flow condition. In doing so, the glass-fiber hair is modeled as a cantilever beam with an elastic foundation, subject to the distributed aerodynamic drag over the length of the hair. Hair length and diameter, capillary depth, and CNT height are scaled by keeping the maximum compressive strain of the CNTs constant for different sensors under different speed conditions. Numerical studies demonstrate the feasibility of the geometric scaling methodology by designing AHSs for aircraft with different dimensions and flight conditions, starting from the same baseline sensor. Finally, the operating bandwidth of the scaled sensors is explored.
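A stripped-down version of the structural model above makes the scaling intuition concrete: a uniform cantilever hair under uniform drag loading develops a root bending strain that grows with the square of the flow speed. The sketch below omits the elastic foundation from the capillary/CNT support and uses hypothetical dimensions, so it is only a qualitative illustration of why the geometry must be rescaled for different speed ranges.

```python
import math

# Simplified check of how root bending strain in a hair scales with flow speed
# (hypothetical values; the paper's model also includes the elastic foundation
# from the capillary/CNT support, which is ignored here).
def root_strain(U, L, d, E, rho=1.225, Cd=1.0):
    q = 0.5 * rho * U**2 * Cd * d          # drag per unit length of the hair, N/m
    M = q * L**2 / 2.0                     # bending moment at the clamped root, N*m
    I = math.pi * d**4 / 64.0              # second moment of area, circular section
    return M * (d / 2.0) / (E * I)         # bending strain at the root surface

# Glass-fiber hair, E ~ 70 GPa, 2 mm long, 100 micron diameter
for U in (5.0, 10.0, 20.0):                # flow speed, m/s
    print(U, f"{root_strain(U, L=2e-3, d=100e-6, E=70e9):.2e}")
# Strain grows with U^2, so a sensor sized for one speed range saturates in
# another, which is why the geometry is rescaled for different flight conditions.
```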
Deb, Kalyanmoy; Sinha, Ankur
2010-01-01
Bilevel optimization problems involve two optimization tasks (upper and lower level), in which every feasible upper level solution must correspond to an optimal solution to a lower level optimization problem. These problems commonly appear in many practical problem solving tasks including optimal control, process optimization, game-playing strategy developments, transportation problems, and others. However, they are commonly converted into a single level optimization problem by using an approximate solution procedure to replace the lower level optimization task. Although there exist a number of theoretical, numerical, and evolutionary optimization studies involving single-objective bilevel programming problems, not many studies look at the context of multiple conflicting objectives in each level of a bilevel programming problem. In this paper, we address certain intricate issues related to solving multi-objective bilevel programming problems, present challenging test problems, and propose a viable and hybrid evolutionary-cum-local-search based algorithm as a solution methodology. The hybrid approach performs better than a number of existing methodologies and scales well up to 40-variable difficult test problems used in this study. The population sizing and termination criteria are made self-adaptive, so that no additional parameters need to be supplied by the user. The study indicates a clear niche of evolutionary algorithms in solving such difficult problems of practical importance compared to their usual solution by a computationally expensive nested procedure. The study opens up many issues related to multi-objective bilevel programming and hopefully this study will motivate EMO and other researchers to pay more attention to this important and difficult problem solving activity.
de Paz, José-Miguel; Sánchez, Juan; Visconti, Fernando
2006-04-01
Soil is one of the main non-renewable natural resources in the world. In the Valencian Community (Mediterranean coast of Spain), it is especially important because agriculture and forest biomass exploitation are two of the main economic activities in the region. More than 44% of the total area is under agriculture and 52% is forested. The frequently arid or semi-arid climate with rainfall concentrated in few events, usually in the autumn and spring, scarcity of vegetation cover, and eroded and shallow soils in several areas lead to soil degradation processes. These processes, mainly water erosion and salinization, can be intense in many locations within the Valencian Community. Evaluation of soil degradation on a regional scale is important because degradation is incompatible with sustainable development. Policy makers involved in land use planning require tools to evaluate soil degradation so they can go on to develop measures aimed at protecting and conserving soils. In this study, a methodology to evaluate physical, chemical and biological soil degradation in a GIS-based approach was developed for the Valencian Community on a 1/200,000 scale. The information used in this study was obtained from two different sources: (i) a soil survey with more than 850 soil profiles sampled within the Valencian Community, and (ii) the environmental information implemented in the Geo-scientific map of the Valencian Community digitised on an Arc/Info GIS. Maps of physical, chemical and biological soil degradation in the Valencian Community on a 1/200,000 scale were obtained using the methodology devised. These maps can be used to make a cost-effective evaluation of soil degradation on a regional scale. Around 29% of the area corresponding to the Valencian Community is affected by high to very high physical soil degradation, 36% by high to very high biological degradation, and 6% by high to very high chemical degradation. It is, therefore, necessary to draw up legislation and to establish the policy framework for actions focused on preventing soil degradation and conserving its productive potential.
NASA Astrophysics Data System (ADS)
Teixeira, Filipe; Melo, André; Cordeiro, M. Natália D. S.
2010-09-01
A linear least-squares methodology was used to determine the vibrational scaling factors for the X3LYP density functional. Uncertainties for these scaling factors were calculated according to the method devised by Irikura et al. [J. Phys. Chem. A 109, 8430 (2005)]. The calibration set was systematically partitioned according to several of its descriptors and the scaling factors for X3LYP were recalculated for each subset. The results show that the scaling factors are only significant up to the second digit, irrespective of the calibration set used. Furthermore, multivariate statistical analysis allowed us to conclude that the scaling factors and the associated uncertainties are independent of the size of the calibration set and strongly suggest the practical impossibility of obtaining vibrational scaling factors with more than two significant digits.
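The least-squares scale factor itself has a simple closed form: minimising the sum of (lambda*omega_i - nu_i)^2 over a calibration set of calculated harmonic frequencies omega_i and experimental fundamentals nu_i gives lambda = sum(omega_i*nu_i)/sum(omega_i^2). A minimal sketch follows; the frequency pairs are invented placeholders rather than the X3LYP calibration data, and the uncertainty treatment of Irikura et al. is not reproduced here.

```python
import numpy as np

def scaling_factor(calc, expt):
    """Least-squares vibrational scaling factor lambda = sum(w*v)/sum(w^2),
    plus the RMS residual of the scaled frequencies (cm^-1)."""
    calc, expt = np.asarray(calc, float), np.asarray(expt, float)
    lam = np.dot(calc, expt) / np.dot(calc, calc)
    rms = np.sqrt(np.mean((lam * calc - expt) ** 2))
    return lam, rms

# placeholder calibration pairs (calculated harmonic, experimental fundamental), cm^-1
calc = [3150.0, 1680.0, 1210.0, 980.0, 640.0]
expt = [3020.0, 1615.0, 1172.0, 951.0, 622.0]
lam, rms = scaling_factor(calc, expt)
# the abstract argues that only two digits of lam are significant in practice
print(f"scale factor = {lam:.4f}, RMS residual = {rms:.1f} cm^-1")
```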
Pathways for scaling up public health interventions.
Indig, Devon; Lee, Karen; Grunseit, Anne; Milat, Andrew; Bauman, Adrian
2017-08-01
To achieve population-wide health improvement, public health interventions found effective in selected samples need to be 'scaled up' and implemented more widely. The pathways through which interventions are scaled up are not well characterised. The aim of this paper is to identify examples of public health interventions which have been scaled up and to develop a conceptual framework which quantifies and describes this process. A multi-stage international literature search was undertaken to identify examples of public health interventions in high income countries that have been scaled up or implemented at scale. Initial abstract review identified articles which met all the criteria of being a: 1) public health intervention; 2) chronic disease prevention focus; 3) program delivered at a wide geographical scale (state, national or international). Interventions were reviewed and coded into a conceptual framework pathway to document their scaling up process. For each program, an in-depth review of the identified articles was undertaken along with a broad internet based search to determine the outcomes of the dissemination process. A conceptual framework of scaling up pathways was developed that involved four stages (development, efficacy testing, real world trial and dissemination) to which the 40 programs were mapped. The search identified 40 public health interventions that showed evidence of being scaled up. Four pathways were identified to capture the different scaling up trajectories taken which included: 'Type I - Comprehensive' (55%) which passed through all four stages, 'Type II - Efficacy omitters' (5%) which did not conduct efficacy testing, 'Type III - Trial omitters' (25%) which did not conduct a real world trial, and 'Type IV - At scale dissemination' (15%) which skipped both efficacy testing and a real world trial. This is the first study to classify and quantify the potential pathways through which public health interventions in high income countries are scaled up to reach the broader population. Mapping these pathways not only demonstrates the different trajectories that occur in scaling up public health interventions, but also allows the variation across scaling up pathways to be classified. The policy and practice determinants leading to each pathway remain for future study, especially to identify the conditions under which efficacy and replication stages are missing.
Caudle, Susan E.; Katzenstein, Jennifer M.; Oghalai, John S.; Lin, Jerry; Caudle, Donald D.
2013-01-01
Methodologically, longitudinal assessment of cognitive development in young children has proven difficult because few measures span infancy through school age. This matter is further complicated when the child presents with a sensory deficit such as hearing loss. Few measures are validated in this population, and children who are evaluated for cochlear implantation are often reevaluated annually. The authors sought to evaluate the predictive validity of subscales of the Mullen Scales of Early Learning (MSEL) on Leiter International Performance Scales–Revised (LIPS-R) Full-Scale IQ scores. To further elucidate the relationship of these two measures, comparisons were also made with the Vineland Adaptive Behavior Scale–Second Edition (VABS), which provides a measure of adaptive functioning across the life span. Participants included 35 children (14 female, 21 male) who were evaluated both as part of the precandidacy process for cochlear implantation using the MSEL and VABS and following implantation with the LIPS-R and VABS. Hierarchical linear regression revealed that the MSEL Visual Reception subdomain score significantly predicted 52% of the variance in LIPS-R Full-Scale IQ scores at follow-up, F(1, 34) = 35.80, p < .0001, R2 = .52, β = .72. This result suggests that the Visual Reception subscale offers predictive validity of later LIPS-R Full-Scale IQ scores. The VABS was also significantly correlated with cognitive variables at each time point. PMID:22353228
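A minimal sketch of the kind of regression reported above, fitting later Full-Scale IQ on an earlier subscale score by ordinary least squares; the score arrays are invented placeholders rather than the study data, and the published analysis used a hierarchical model with additional predictors.

```python
import numpy as np
import statsmodels.api as sm

# placeholder data: MSEL Visual Reception scores and later LIPS-R Full-Scale IQ
msel_vr = np.array([38, 45, 52, 41, 60, 35, 48, 55, 50, 43], float)
lips_iq = np.array([82, 90, 101, 85, 112, 78, 95, 106, 99, 88], float)

X = sm.add_constant(msel_vr)        # intercept + predictor
fit = sm.OLS(lips_iq, X).fit()      # ordinary least squares
print(fit.summary().tables[1])      # coefficient, standard error, t, p
print(f"R^2 = {fit.rsquared:.2f}")  # proportion of variance explained
```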
Costs and Impacts of Scaling up Voluntary Medical Male Circumcision in Tanzania
Menon, Veena; Gold, Elizabeth; Godbole, Ramona; Castor, Delivette; Mahler, Hally; Forsythe, Steven; Ally, Mariam; Njeuhmeli, Emmanuel
2014-01-01
Background Given the proven effectiveness of voluntary medical male circumcision (VMMC) in preventing the spread of HIV, Tanzania is scaling up VMMC as an HIV prevention strategy. This study will inform policymakers about the potential costs and benefits of scaling up VMMC services in Tanzania. Methodology The analysis first assessed the unit costs of delivering VMMC at the facility level in three regions—Iringa, Kagera, and Mbeya—via three currently used VMMC service delivery models (routine, campaign, and mobile/island outreach). Subsequently, using these unit cost data estimates, the study used the Decision Makers' Program Planning Tool (DMPPT) to estimate the costs and impact of a scaled-up VMMC program. Results Increasing VMMC could substantially reduce HIV infection. Scaling up adult VMMC to reach 87.9% coverage by 2015 would avert nearly 23,000 new adult HIV infections through 2015 and an additional 167,500 from 2016 through 2025—at an additional cost of US$253.7 million through 2015 and US$302.3 million from 2016 through 2025. Average cost per HIV infection averted would be US$11,300 during 2010–2015 and US$3,200 during 2010–2025. Scaling up VMMC in Tanzania will yield significant net benefits (benefits of treatment costs averted minus the cost of performing circumcisions) in the long run—around US$4,200 in net benefits for each infection averted. Conclusion VMMC could have an immediate impact on HIV transmission, but the full impact on prevalence and deaths will only be apparent in the longer term because VMMC averts infections some years into the future among people who have been circumcised. Given the health and economic benefits of investing in VMMC, the scale-up of services should continue to be a central component of the national HIV prevention strategy in Tanzania. PMID:24802022
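The headline cost-effectiveness ratios can be reproduced approximately from the figures quoted above; the sketch below is back-of-the-envelope arithmetic only, since the DMPPT discounts costs and impacts over time and the published values therefore differ slightly.

```python
# figures quoted in the abstract (US$, undiscounted back-of-the-envelope check)
cost_2010_2015 = 253.7e6
cost_2016_2025 = 302.3e6
averted_2010_2015 = 23_000
averted_2016_2025 = 167_500

cpia_short = cost_2010_2015 / averted_2010_2015
cpia_long = (cost_2010_2015 + cost_2016_2025) / (averted_2010_2015 + averted_2016_2025)
print(f"cost per infection averted, 2010-2015: ~US${cpia_short:,.0f}")  # ~11,000 (abstract: 11,300)
print(f"cost per infection averted, 2010-2025: ~US${cpia_long:,.0f}")   # ~2,900 (abstract: 3,200)
```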
ERIC Educational Resources Information Center
Wall, Kate; Higgins, Steve; Remedios, Richard; Rafferty, Victoria; Tiplady, Lucy
2013-01-01
A key challenge of visual methodology is how to combine large-scale qualitative data sets with epistemologically acceptable and rigorous analysis techniques. The authors argue that a pragmatic approach drawing on ideas from mixed methods is helpful to open up the full potential of visual data. However, before one starts to "mix" the…
Standard Methodology for Assessment of Range of Motion While Wearing Body Armor
2013-09-30
Fragmentary excerpt (figure captions only): measurement aids included a 20 cm block (Figure 12), a measuring block (Figure 13), meter sticks (Figure 14) for the tester to place against the measurement scale, and T-square rulers (Figure 15); all rulers and meter sticks carried centimetre and millimetre markings.
REPRESENTATION OF ATMOSPHERIC MOTION IN MODELS OF REGIONAL-SCALE AIR POLLUTION
A method is developed for generating ensembles of wind fields for use in regional scale (1000 km) models of transport and diffusion. The underlying objective is a methodology for representing atmospheric motion in applied air pollution models that permits explicit treatment of th...
An Analysis of Methods Used to Examine Gender Differences in Computer-Related Behavior.
ERIC Educational Resources Information Center
Kay, Robin
1992-01-01
Review of research investigating gender differences in computer-related behavior examines statistical and methodological flaws. Issues addressed include sample selection, sample size, scale development, scale quality, the use of univariate and multivariate analyses, regressional analysis, construct definition, construct testing, and the…
Aerodynamic characteristics of cruciform missiles at high angles of attack
NASA Technical Reports Server (NTRS)
Lesieutre, Daniel J.; Mendenhall, Michael R.; Nazario, Susana M.; Hemsch, Michael J.
1987-01-01
An aerodynamic prediction method for missile aerodynamic performance and preliminary design has been developed to utilize a newly available systematic fin data base and an improved equivalent angle of attack methodology. The method predicts total aerodynamic loads and individual fin forces and moments for body-tail (wing-body) and canard-body-tail configurations with cruciform fin arrangements. The data base and the prediction method are valid for angles of attack up to 45 deg, arbitrary roll angles, fin deflection angles between -40 deg and 40 deg, Mach numbers between 0.6 and 4.5, and fin aspect ratios between 0.25 and 4.0. The equivalent angle of attack concept is employed to include the effects of vorticity and geometric scaling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arrieta, Gabriela, E-mail: tonina1903@hotmail.com; Requena, Ignacio, E-mail: requena@decsai.ugr.es; Toro, Javier, E-mail: jjtoroca@unal.edu.co
Treatment and final disposal of Municipal Solid Waste can have a significant role in the generation of negative environmental impacts. As a prevention strategy, such activities are subjected to the process of Environmental Impact Assessment (EIA). Still, the follow-up of Environmental Management Plans or mitigation measures is limited, partly due to a lack of methodological approaches. In searching for possibilities, the University of Granada (Spain) developed a diagnostic methodology named EVIAVE, which allows one to quantify, by means of indexes, the environmental impact of landfills in view of their location and the conditions of exploitation. EVIAVE is applicable within the legal framework of the European Union and can be adapted to the environmental and legal conditions of other countries. This study entails its adaptation in Colombia, for the follow-up and control of the EIA process for landfills. Modifications involved inclusion of the environmental elements flora and fauna, and the evaluation of the environmental descriptors in agreement with the concept of vulnerability. The application of the modified EVIAVE in Colombian landfills allowed us to identify the elements affected by the operating conditions and maintenance. It may be concluded that this methodology is viable and effective for the follow-up and environmental control of EIA processes for landfills, and to analyze the associated risks, as it takes into account related environmental threats and vulnerabilities. - Highlights: • A modified methodology is used to monitor and follow up environmental impacts in landfills. • The improved methodology includes the Vulnerability of Flora and Fauna to evaluate environmental impact of landfills. • The methodology serves to identify and evaluate the sources of risk generated in the construction and siting of landfills. • Environmental vulnerability indicators improve effectiveness of the control and follow-up phases of landfill management. • The follow-up of environmental management plans may help diminish the implementation gap in Environmental Impact Assessment.
NASA Astrophysics Data System (ADS)
Desai, Darshak A.; Kotadiya, Parth; Makwana, Nikheel; Patel, Sonalinkumar
2015-03-01
Indian industries need overall operational excellence for sustainable profitability and growth in the present age of global competitiveness. Among different quality and productivity improvement techniques, Six Sigma has emerged as one of the most effective breakthrough improvement strategies. Though Indian industries are exploring this improvement methodology to their advantage and reaping the benefits, not much has been presented and published regarding the experience of Six Sigma in the food-processing industries. This paper is an effort to exemplify the application of a Six Sigma quality improvement drive to one of the large-scale food-processing sectors in India. The paper discusses the phase-wise implementation of define, measure, analyze, improve, and control (DMAIC) on one of the chronic problems, variations in the weight of milk powder pouches. The paper wraps up with the improvements achieved and the projected bottom-line gain to the unit by application of the Six Sigma methodology.
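In a pouch-filling problem of this kind, the Measure and Control phases typically track process capability against the fill-weight specification. A minimal sketch follows; the specification limits and simulated weights are hypothetical, not the plant's data.

```python
import numpy as np

def capability(x, lsl, usl):
    """Short-term process capability indices Cp and Cpk for pouch fill weights."""
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# hypothetical pouch weights (g) and specification limits
weights = np.random.default_rng(1).normal(loc=500.8, scale=1.2, size=200)
cp, cpk = capability(weights, lsl=498.0, usl=503.0)
# a "six sigma" process corresponds to Cp = 2 and Cpk = 1.5 after a 1.5-sigma shift allowance
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```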
ERIC Educational Resources Information Center
Wombacher, Jorg; Tagg, Stephen K.; Burgi, Thomas; MacBryde, Jillian
2010-01-01
In this article, the authors present a German Sense of Community (SOC) Scale for use in military settings. The scale is based on the translation and field-testing of an existing U.S.-based measure of neighborhood SOC (Peterson, Speer, & McMillan, 2008). The methodological intricacies underlying cross-cultural scale development are highlighted, as…
Zeng, Xiantao; Zhang, Yonggang; Kwong, Joey S W; Zhang, Chao; Li, Sheng; Sun, Feng; Niu, Yuming; Du, Liang
2015-02-01
To systematically review the methodological assessment tools for pre-clinical and clinical studies, systematic reviews and meta-analyses, and clinical practice guidelines. We searched PubMed, the Cochrane Handbook for Systematic Reviews of Interventions, Joanna Briggs Institute (JBI) Reviewers Manual, Centre for Reviews and Dissemination, Critical Appraisal Skills Programme (CASP), Scottish Intercollegiate Guidelines Network (SIGN), and the National Institute for Clinical Excellence (NICE) up to May 20th, 2014. Two authors selected studies and extracted data; quantitative analysis was performed to summarize the characteristics of included tools. We included a total of 21 assessment tools for analysis. A number of tools were developed by academic organizations, and some were developed by only a small group of researchers. The JBI developed the highest number of methodological assessment tools, with CASP coming second. Tools for assessing the methodological quality of randomized controlled studies were most abundant. The Cochrane Collaboration's tool for assessing risk of bias is the best available tool for assessing RCTs. For cohort and case-control studies, we recommend the use of the Newcastle-Ottawa Scale. The Methodological Index for Non-Randomized Studies (MINORS) is an excellent tool for assessing non-randomized interventional studies, and the Agency for Healthcare Research and Quality (AHRQ) methodology checklist is applicable for cross-sectional studies. For diagnostic accuracy test studies, the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) tool is recommended; the SYstematic Review Centre for Laboratory animal Experimentation (SYRCLE) risk of bias tool is available for assessing animal studies; Assessment of Multiple Systematic Reviews (AMSTAR) is a measurement tool for systematic reviews/meta-analyses; an 18-item tool has been developed for appraising case series studies, and the Appraisal of Guidelines, Research and Evaluation (AGREE)-II instrument is widely used to evaluate clinical practice guidelines. We have successfully identified a variety of methodological assessment tools for different types of study design. However, further efforts in the development of critical appraisal tools are warranted since there is currently a lack of such tools for other fields, e.g. genetic studies, and some existing tools (those for nested case-control studies and case reports, for example) are in need of updating to be in line with current research practice and rigor. In addition, it is very important that all critical appraisal tools are applied objectively and that performance bias is effectively avoided. © 2015 Chinese Cochrane Center, West China Hospital of Sichuan University and Wiley Publishing Asia Pty Ltd.
Spicer, Neil; Bhattacharya, Dipankar; Dimka, Ritgak; Fanta, Feleke; Mangham-Jefferies, Lindsay; Schellenberg, Joanna; Tamire-Woldemariam, Addis; Walt, Gill; Wickremasinghe, Deepthi
2014-11-01
Donors and other development partners commonly introduce innovative practices and technologies to improve health in low and middle income countries. Yet many innovations that are effective in improving health and survival are slow to be translated into policy and implemented at scale. Understanding the factors influencing scale-up is important. We conducted a qualitative study involving 150 semi-structured interviews with government, development partners, civil society organisations and externally funded implementers, professional associations and academic institutions in 2012/13 to explore scale-up of innovative interventions targeting mothers and newborns in Ethiopia, the Indian state of Uttar Pradesh and the six states of northeast Nigeria, which are settings with high burdens of maternal and neonatal mortality. Interviews were analysed using a common analytic framework developed for cross-country comparison and themes were coded using Nvivo. We found that programme implementers across the three settings require multiple steps to catalyse scale-up. Advocating for government to adopt and finance health innovations requires: designing scalable innovations; embedding scale-up in programme design and allocating time and resources; building implementer capacity to catalyse scale-up; adopting effective approaches to advocacy; presenting strong evidence to support government decision making; involving government in programme design; invoking policy champions and networks; strengthening harmonisation among external programmes; aligning innovations with health systems and priorities. Other steps include: supporting government to develop policies and programmes and strengthening health systems and staff; promoting community uptake by involving media, community leaders, mobilisation teams and role models. We conclude that scale-up has no magic bullet solution - implementers must embrace multiple activities, and require substantial support from donors and governments in doing so. Copyright © 2014 The Authors. Published by Elsevier Ltd.. All rights reserved.
Verma, Nishant; Beretvas, S Natasha; Pascual, Belen; Masdeu, Joseph C; Markey, Mia K
2015-11-12
As currently used, the Alzheimer's Disease Assessment Scale-Cognitive subscale (ADAS-Cog) has low sensitivity for measuring Alzheimer's disease progression in clinical trials. A major reason behind the low sensitivity is its sub-optimal scoring methodology, which can be improved to obtain better sensitivity. Using item response theory, we developed a new scoring methodology (ADAS-CogIRT) for the ADAS-Cog, which addresses several major limitations of the current scoring methodology. The sensitivity of the ADAS-CogIRT methodology was evaluated using clinical trial simulations as well as a negative clinical trial, which had shown evidence of a treatment effect. The ADAS-Cog was found to measure impairment in three cognitive domains of memory, language, and praxis. The ADAS-CogIRT methodology required significantly fewer patients and shorter trial durations as compared to the current scoring methodology when both were evaluated in simulated clinical trials. When validated on data from a real clinical trial, the ADAS-CogIRT methodology had higher sensitivity than the current scoring methodology in detecting the treatment effect. The proposed scoring methodology significantly improves the sensitivity of the ADAS-Cog in measuring progression of cognitive impairment in clinical trials focused on the mild-to-moderate Alzheimer's disease stage. This provides a boost to the efficiency of clinical trials, requiring fewer patients and shorter durations for investigating disease-modifying treatments.
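A minimal sketch of item-response-theory scoring, here a maximum-likelihood estimate of a single latent impairment score under a two-parameter logistic model with known item parameters. The ADAS-CogIRT model is multidimensional and uses polytomous items, so this is illustrative only; all item parameters and responses below are invented.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# hypothetical 2PL item parameters: discrimination a, difficulty b
a = np.array([1.2, 0.8, 1.5, 1.0, 0.9])
b = np.array([-1.0, -0.2, 0.3, 0.8, 1.5])
responses = np.array([1, 1, 1, 0, 0])  # one patient's item scores (1 = item failed)

def neg_log_lik(theta):
    """Negative log-likelihood of the response pattern given latent score theta."""
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))  # P(item failed | theta)
    return -np.sum(responses * np.log(p) + (1 - responses) * np.log(1 - p))

theta_hat = minimize_scalar(neg_log_lik, bounds=(-4, 4), method="bounded").x
print(f"estimated latent impairment score theta = {theta_hat:.2f}")
```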
NASA Technical Reports Server (NTRS)
Vangenderen, J. L. (Principal Investigator); Lock, B. F.
1976-01-01
The author has identified the following significant results. Techniques of preprocessing, interpretation, classification, and ground truth sampling were studied. The study showed the need for a low-cost, low-technology, viable, operational methodology to replace the emphasis given in the U.S. to machine processing, which many developing countries cannot afford, understand, or implement.
Kastner, Randee J.; Sicuri, Elisa; Stone, Christopher M.; Matwale, Gabriel; Onapa, Ambrose; Tediosi, Fabrizio
2017-01-01
Introduction Lymphatic filariasis (LF), a neglected tropical disease (NTD) preventable through mass drug administration (MDA), is one of six diseases deemed possibly eradicable. Previously we developed one LF elimination scenario, which assumes MDA scale-up to continue in all countries that have previously undertaken MDA. In contrast, our three previously developed eradication scenarios assume all LF endemic countries will undertake MDA at an average (eradication I), fast (eradication II), or instantaneous (eradication III) rate of scale-up. In this analysis we use a micro-costing model to project the financial and economic costs of each of these scenarios in order to provide evidence to decision makers about the investment required to eliminate and eradicate LF. Methodology/Key findings Costing was undertaken from a health system perspective, with all results expressed in 2012 US dollars (USD). A discount rate of 3% was applied to calculate the net present value of future costs. Prospective NTD budgets from LF endemic countries were reviewed to preliminarily determine activities and resources necessary to undertake a program to eliminate LF at a country level. In consultation with LF program experts, activities and resources were further reviewed and a refined list of activities and necessary resources, along with their associated quantities and costs, were determined and grouped into the following activities: advocacy and communication, capacity strengthening, coordination and strengthening partnerships, data management, ongoing surveillance, monitoring and supervision, drug delivery, and administration. The costs of mapping and undertaking transmission assessment surveys and the value of donated drugs and volunteer time were also accounted for. Using previously developed scenarios and deterministic estimates of MDA duration, the financial and economic costs of interrupting LF transmission under varying rates of MDA scale-up were then modelled using a micro-costing approach. The elimination scenario, which includes countries that previously undertook MDA, is estimated to cost 929 million USD (95% Credible Interval: 884m-972m). Proceeding to eradication is anticipated to require a higher financial investment, estimated at 1.24 billion USD (1.17bn-1.30bn) in the eradication III scenario (immediate scale-up), with eradication II (intensified scale-up) projected at 1.27 billion USD (1.21bn-1.33bn), and eradication I (slow scale-up) estimated at 1.29 billion USD (1.23bn-1.34bn). The economic costs of the eradication III scenario are estimated at approximately 7.57 billion USD (7.12bn-7.94bn), while the elimination scenario is projected to have an economic cost of 5.21 billion USD (4.91bn-5.45bn). Countries in the AFRO region will require the greatest investment to reach elimination or eradication, but also stand to gain the most in cost savings. Across all scenarios, capacity strengthening and advocacy and communication represent the greatest financial costs, whereas mapping, post-MDA surveillance, and administration comprise the least. Conclusions/Significance Though challenging to implement, our results indicate that financial and economic savings are greatest under the eradication III scenario. Thus, if eradication for LF is the objective, accelerated scale-up is projected to be the best investment. PMID:28949987
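The net-present-value convention used in the costing is standard: with a 3% discount rate, a cost C incurred t years in the future contributes C/(1.03)^t. A minimal sketch with an invented annual cost stream:

```python
def npv(cost_stream, rate=0.03):
    """Net present value of a list of annual costs, with year 0 undiscounted."""
    return sum(c / (1 + rate) ** t for t, c in enumerate(cost_stream))

# hypothetical annual programme costs (millions of USD) over a decade of MDA scale-up
annual_costs = [60, 75, 90, 100, 105, 105, 100, 95, 90, 80]
print(f"nominal total: {sum(annual_costs)} m USD")
print(f"net present value at 3%: {npv(annual_costs):.0f} m USD")
```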
Methodological Issues in Questionnaire Design.
Song, Youngshin; Son, Youn Jung; Oh, Doonam
2015-06-01
The process of designing a questionnaire is complicated. Many questionnaires on nursing phenomena have been developed and used by nursing researchers. The purpose of this paper was to discuss questionnaire design and factors that should be considered when using existing scales. Methodological issues were discussed, such as factors in the design of questions, steps in developing questionnaires, wording and formatting methods for items, and administration methods. How to use existing scales, how to facilitate cultural adaptation, and how to prevent socially desirable responding were discussed. Moreover, the triangulation method in questionnaire development was introduced. Steps were recommended for designing questions, such as appropriately operationalizing key concepts for the target population, clearly formatting response options, generating items and confirming final items through face or content validity, sufficiently piloting the questionnaire using item analysis, demonstrating reliability and validity, finalizing the scale, and training the administrator. Psychometric properties and cultural equivalence should be evaluated prior to administration when using an existing questionnaire and performing cultural adaptation. In the context of well-defined nursing phenomena, logical and systematic methods will contribute to the development of simple and precise questionnaires.
Development of Time-Series Human Settlement Mapping System Using Historical Landsat Archive
NASA Astrophysics Data System (ADS)
Miyazaki, H.; Nagai, M.; Shibasaki, R.
2016-06-01
A methodology for automated human settlement mapping is much needed to exploit historical satellite data archives for urgent issues of urban growth at the global scale, such as disaster risk management, public health, food security, and urban management. As global datasets with spatial resolutions of 10-100 m have been produced by initiatives using ASTER, Landsat, and TerraSAR-X, the next goal is the development of time-series data that can contribute to studies of urban development in its socioeconomic context, including disaster risk management, public health, transport and other development issues. We developed an automated algorithm to detect human settlement by classifying built-up and non-built-up areas in time-series Landsat images. A machine learning algorithm, Local and Global Consistency (LLGC), was applied with improvements for remote sensing data. The algorithm can use MCD12Q1, a MODIS-based global land cover map with 500-m resolution, as training data, so that no manual process is required to prepare training data. In addition, we designed the method to composite multiple LLGC results into a single output to reduce uncertainty. Each LLGC result has a confidence value ranging from 0.0 to 1.0 that represents the probability of built-up or non-built-up. The median confidence over a period around a target time was expected to be a robust indicator for identifying built-up or non-built-up areas against uncertainties in satellite data quality, such as cloud and haze contamination. Four scenes of Landsat data for each target year, 1990, 2000, 2005, and 2010, were chosen from the Landsat archive with cloud contamination of less than 20%. We developed a system with these algorithms on the Data Integration and Analysis System (DIAS) at the University of Tokyo and processed 5200 scenes of Landsat data for cities with more than one million people worldwide.
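The LLGC propagation step referred to here has a standard form (learning with local and global consistency): given an affinity matrix W, its symmetric normalisation S = D^(-1/2) W D^(-1/2) and seed labels Y, scores are propagated iteratively as F <- alpha*S*F + (1-alpha)*Y. The sketch below applies this to a toy two-class pixel-feature matrix; the RBF affinity, alpha and the features are placeholders and do not reproduce the paper's Landsat/MCD12Q1 pipeline.

```python
import numpy as np

def llgc(X, Y, alpha=0.9, gamma=1.0, n_iter=200):
    """Local and Global Consistency propagation: F <- alpha*S*F + (1-alpha)*Y."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    W = np.exp(-gamma * d2)                               # RBF affinity
    np.fill_diagonal(W, 0.0)                              # no self-affinity
    Dinv = 1.0 / np.sqrt(W.sum(1))
    S = Dinv[:, None] * W * Dinv[None, :]                 # D^-1/2 W D^-1/2
    F = Y.copy().astype(float)
    for _ in range(n_iter):
        F = alpha * S @ F + (1 - alpha) * Y
    return F                                              # per-class confidence scores

# toy data: six pixels with two features; first and last pixel labelled (built-up / non-built-up)
X = np.array([[0.9, 0.8], [0.85, 0.75], [0.8, 0.7], [0.2, 0.3], [0.15, 0.25], [0.1, 0.2]])
Y = np.zeros((6, 2))
Y[0, 0] = 1
Y[5, 1] = 1
print(llgc(X, Y).round(2))
```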
Continental hydrosystem modelling: the concept of nested stream-aquifer interfaces
NASA Astrophysics Data System (ADS)
Flipo, N.; Mouhri, A.; Labarthe, B.; Biancamaria, S.; Rivière, A.; Weill, P.
2014-08-01
Coupled hydrological-hydrogeological models, emphasising the importance of the stream-aquifer interface, are more and more used in hydrological sciences for pluri-disciplinary studies aiming at investigating environmental issues. Based on an extensive literature review, stream-aquifer interfaces are described at five different scales: local [10 cm-~10 m], intermediate [~10 m-~1 km], watershed [10 km2-~1000 km2], regional [10 000 km2-~1 M km2] and continental scales [>10 M km2]. This led us to develop the concept of nested stream-aquifer interfaces, which extends the well-known vision of nested groundwater pathways towards the surface, where the mixing of low frequency processes and high frequency processes coupled with the complexity of geomorphological features and heterogeneities creates hydrological spiralling. This conceptual framework allows the identification of a hierarchical order of the multi-scale control factors of stream-aquifer hydrological exchanges, from the larger scale to the finer scale. The hyporheic corridor, which couples the river to its 3-D hyporheic zone, is then identified as the key component for scaling hydrological processes occurring at the interface. The identification of the hyporheic corridor as the support of the hydrological processes scaling is an important step for the development of regional studies, which is one of the main concerns for water practitioners and resources managers. In a second part, the modelling of the stream-aquifer interface at various scales is investigated with the help of the conductance model. Although the usage of the temperature as a tracer of the flow is a robust method for the assessment of stream-aquifer exchanges at the local scale, there is a crucial need to develop innovative methodologies for assessing stream-aquifer exchanges at the regional scale. After formulating the conductance model at the regional and intermediate scales, we address this challenging issue with the development of an iterative modelling methodology, which ensures the consistency of stream-aquifer exchanges between the intermediate and regional scales. Finally, practical recommendations are provided for the study of the interface using the innovative methodology MIM (Measurements-Interpolation-Modelling), which is graphically developed, scaling in space the three pools of methods needed to fully understand stream-aquifer interfaces at various scales. In the MIM space, stream-aquifer interfaces that can be studied by a given approach are localised. The efficiency of the method is demonstrated with two examples. The first one proposes an upscaling framework, structured around river reaches of ~10-100 m, from the local to the watershed scale. The second example highlights the usefulness of space borne data to improve the assessment of stream-aquifer exchanges at the regional and continental scales. We conclude that further developments in modelling and field measurements have to be undertaken at the regional scale to enable a proper modelling of stream-aquifer exchanges from the local to the continental scale.
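The conductance model mentioned here is commonly formulated as a linear exchange law: the stream-aquifer flux is Q = C*(h_riv - h_aq), with the conductance C lumping riverbed geometry and permeability, for example C = K*L*W/b for a reach of length L, width W, bed thickness b and bed hydraulic conductivity K. A minimal sketch with invented reach parameters:

```python
def reach_conductance(K, L, W, b):
    """Lumped riverbed conductance C = K*L*W/b (m^2/s)."""
    return K * L * W / b

def exchange_flux(C, h_river, h_aquifer):
    """Stream-aquifer exchange Q = C*(h_riv - h_aq); positive means infiltration to the aquifer."""
    return C * (h_river - h_aquifer)

# hypothetical 100 m reach: K = 1e-5 m/s, width 8 m, bed thickness 0.5 m
C = reach_conductance(K=1e-5, L=100.0, W=8.0, b=0.5)
Q = exchange_flux(C, h_river=24.3, h_aquifer=23.9)
print(f"C = {C:.3f} m^2/s, Q = {Q:.4f} m^3/s")
```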
Translation and validation of the Breast-feeding Self-efficacy Scale into Turkish.
Eksioglu, Aysun Basgun; Ceber, Esin
2011-12-01
Recent research indicates that most mothers give up breast feeding their infants early in the postpartum period due to difficulties with breast feeding and the belief that they are inefficient at breast feeding. Using self-efficacy theory as a conceptual framework to measure breast-feeding confidence, a Turkish version of the Breast-feeding Self-Efficacy Scale (BSES) was developed and psychometrically tested among Turkish mothers. To translate the BSES into Turkish and assess its psychometric properties among breast-feeding mothers. A methodological study to assess the reliability, validity and predictive value of the BSES. Women were recruited from two mother and child health-care units in the Altındağ district in Izmir, Turkey between 2006 and 2007, and followed up two months post partum. 165 Turkish-speaking women. Following back-translation, questionnaires were completed in hospital and at home by postnatal women. The BSES was administrated at one, four and eight weeks post partum to determine the method of infant feeding. The interviews and home visits were conducted in mothers' own homes at a mutually convenient time. The psychometric assessment method used to validate the original BSES (English version) was replicated with the translated Turkish version. The well-concordance coefficient of Kendall's W scale was 0.227, p<0.01 and the test-retest reliability coefficient was 0.45. The consistency of the scale in terms of temporal process was efficient (p = 0.00). Cronbach's alpha coefficient was 0.91 and 0.92 at one and four weeks post partum, respectively, and the reliability of the scale was found to be high (0.80 ≤ α<1.00). The Turkish version of the BSES can be used to determine which mothers are at risk of giving up breast feeding early in the postpartum period, and the subjects they need to learn about breast feeding. Copyright © 2010 Elsevier Ltd. All rights reserved.
Espié, Stéphane; Boubezoul, Abderrahmane; Aupetit, Samuel; Bouaziz, Samir
2013-09-01
Instrumented vehicles are key tools for in-depth understanding of drivers' behaviours, and thus for the design of scientifically based countermeasures to reduce fatalities and injuries. The instrumentation of Powered Two-Wheelers (PTW) has been less widely implemented than that of cars, in part due to the technical challenges involved. The last decade has seen the development in Europe of several tools and methodologies to study motorcycle riders' behaviours and motorcycle dynamics for a range of situations, including crash events involving falls. Thanks to these tools, a broad-ranging research programme has been conducted, from the design and tuning of real-time fall detection to the study of riding training systems, as well as studies focusing on naturalistic riding situations such as filtering and lane splitting. The methodology designed for the in-depth study of riders' behaviours in naturalistic situations can be based upon the combination of several sources of data such as: PTW sensors, a context-based video retrieval system, the Global Positioning System (GPS) and verbal data on the riders' decision-making process. The goals of this paper are: (1) to present the methodological tools developed and used by INRETS-MSIS (now Ifsttar-TS2/Simu) in the last decade for the study of riders' behaviours in real-world environments as well as on track for situations up to falls, (2) to illustrate the kind of results that can be gained from the conducted studies, (3) to identify the advantages and limitations of the proposed methodology for conducting large-scale naturalistic riding studies, and (4) to highlight how the knowledge gained from this approach will fill many of the knowledge gaps about PTW-riders' behaviours and risk factors. Copyright © 2013 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Fite, Ronald S.
This report details the research activities conducted by Del Mar College, as a subcontractor of Project FOLLOW-UP, in the design, development, and implementation of a graduate follow-up system. The activities included questionnaire design, development of manual and computerized record-keeping systems, student-graduate identification, and…
Anatomy of a Security Operations Center
NASA Technical Reports Server (NTRS)
Wang, John
2010-01-01
Many agencies and corporations are either contemplating or in the process of building a cyber Security Operations Center (SOC). Those agencies that have established SOCs are most likely working on major revisions or enhancements to existing capabilities. As principal developers of the NASA SOC, the presenters' goals are to provide the GFIRST community with examples of some of the key building blocks of an Agency-scale cyber Security Operations Center. This presentation will include the inputs and outputs, the facilities or shell, as well as the internal components and the processes necessary to maintain the SOC's subsistence - in other words, the anatomy of a SOC. Details to be presented include the SOC architecture and its key components: Tier 1 Call Center, data entry, and incident triage; Tier 2 monitoring, incident handling and tracking; Tier 3 computer forensics, malware analysis, and reverse engineering; Incident Management System; Threat Management System; SOC Portal; Log Aggregation and Security Incident Management (SIM) systems; flow monitoring; IDS; etc. Specific processes and methodologies discussed include Incident States and associated Work Elements; the Incident Management Workflow Process; Cyber Threat Risk Assessment methodology; and Incident Taxonomy. The evolution of the Cyber Security Operations Center will be discussed, starting from reactive, moving to proactive, and finally to predictive. Finally, the resources necessary to establish an Agency-scale SOC as well as the lessons learned in the process of standing up a SOC will be presented.
Optimization of Large-Scale Daily Hydrothermal System Operations With Multiple Objectives
NASA Astrophysics Data System (ADS)
Wang, Jian; Cheng, Chuntian; Shen, Jianjian; Cao, Rui; Yeh, William W.-G.
2018-04-01
This paper proposes a practical procedure for optimizing the daily operation of a large-scale hydrothermal system. The overall procedure optimizes a monthly model over a period of 1 year and a daily model over a period of up to 1 month. The outputs from the monthly model are used as inputs and boundary conditions for the daily model. The models iterate and update when new information becomes available. The monthly hydrothermal model uses nonlinear programing (NLP) to minimize fuel costs, while maximizing hydropower production. The daily model consists of a hydro model, a thermal model, and a combined hydrothermal model. The hydro model and thermal model generate the initial feasible solutions for the hydrothermal model. The two competing objectives considered in the daily hydrothermal model are minimizing fuel costs and minimizing thermal emissions. We use the constraint method to develop the trade-off curve (Pareto front) between these two objectives. We apply the proposed methodology on the Yunnan hydrothermal system in China. The system consists of 163 individual hydropower plants with an installed capacity of 48,477 MW and 11 individual thermal plants with an installed capacity of 12,400 MW. We use historical operational records to verify the correctness of the model and to test the robustness of the methodology. The results demonstrate the practicability and validity of the proposed procedure.
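The constraint method used to trace the fuel-cost versus emissions trade-off keeps one objective and converts the other into a bound that is swept across its feasible range, solving one single-objective problem per bound. The sketch below does this for a two-plant toy dispatch problem; the quadratic fuel-cost and linear emission curves are invented and bear no relation to the Yunnan system data.

```python
import numpy as np
from scipy.optimize import minimize

demand = 800.0                                                        # MW served by two thermal plants (toy)
cost = lambda p: 0.004 * p[0]**2 + 18 * p[0] + 0.007 * p[1]**2 + 14 * p[1]  # fuel cost ($/h)
emis = lambda p: 0.65 * p[0] + 0.95 * p[1]                            # emissions (kg/h)
bounds = [(100, 600), (100, 600)]
balance = {"type": "eq", "fun": lambda p: p[0] + p[1] - demand}       # load balance

pareto = []
for eps in np.linspace(590, 660, 5):                                  # sweep the emission cap (kg/h)
    cap = {"type": "ineq", "fun": lambda p, e=eps: e - emis(p)}       # emis(p) <= eps
    res = minimize(cost, x0=[400, 400], bounds=bounds,
                   constraints=[balance, cap], method="SLSQP")
    pareto.append((emis(res.x), cost(res.x)))

for e, c in pareto:
    print(f"emissions {e:7.1f} kg/h  ->  minimum fuel cost {c:9.1f} $/h")
```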
Scaling a Single Attribute: A Methodological Study of Conservation
ERIC Educational Resources Information Center
Hofmann, Richard J.; Trepanier, Mary
1975-01-01
This study was designed to assess the acquisition of conservation of number on equal addition tasks through scalogram analysis to determine if this analysis defines a scale or continuum. Ten block tasks administered to 85 kindergarten children validated Piaget's theory that cognitive development is sequential and continuous. (Author/ED)
Students' Self-Evaluation and Reflection (Part 1): "Measurement"
ERIC Educational Resources Information Center
Cambra-Fierro, Jesus; Cambra-Berdun, Jesus
2007-01-01
Purpose: The objective of the paper is the development and validation of scales to assess reflective learning. Design/methodology/approach: The research is based on a literature review plus in-classroom experience. For the scale validation process, exploratory and confirmatory analyses were conducted, following proposals made by Anderson and…
van der Kamp, Jonathan; Bachmann, Till M
2015-03-03
"Getting the prices right" through internalizing external costs is a guiding principle of environmental policy making, one recent example being the EU Clean Air Policy Package released at the end of 2013. It is supported by impact assessments, including monetary valuation of environmental and health damages. For over 20 years, related methodologies have been developed in Europe in the Externalities of Energy (ExternE) project series and follow-up activities. In this study, we aim at analyzing the main methodological developments over time from the 1990s until today with a focus on classical air pollution-induced human health damage costs. An up-to-date assessment including the latest European recommendations is also applied. Using a case from the energy sector, we identify major influencing parameters: differences in exposure modeling and related data lead to variations in damage costs of up to 21%; concerning risk assessment and monetary valuation, differences in assessing long-term exposure mortality risks together with assumptions on particle toxicity explain most of the observed changes in damage costs. These still debated influencing parameters deserve particular attention when damage costs are used to support environmental policy making.
Data-Driven Simulation-Enhanced Optimization of People-Based Print Production Service
NASA Astrophysics Data System (ADS)
Rai, Sudhendu
This paper describes a systematic, six-step, data-driven, simulation-based methodology for optimizing large-scale, distributed, people-based service systems that exhibit high variety and variability. The methodology is exemplified through its application within the printing services industry, where it has been successfully deployed by Xerox Corporation across small, mid-sized and large print shops, generating over $250 million in profits across the customer value chain. Each step of the methodology is described in detail: co-development and testing of innovative concepts in partnership with customers; development of software and hardware tools to implement those concepts; establishment of work processes and practices for customer engagement and service implementation; creation of training and infrastructure for large-scale deployment; integration of the innovative offering within the framework of existing corporate offerings; and, lastly, deployment and monitoring of financial and operational metrics to estimate return on investment and support continual renewal of the offering.
Marchant, Tanya; Bryce, Jennifer; Victora, Cesar; Moran, Allisyn C; Claeson, Mariam; Requejo, Jennifer; Amouzou, Agbessi; Walker, Neff; Boerma, Ties; Grove, John
2016-06-01
An urgent priority in maternal, newborn and child health is to accelerate the scale-up of cost-effective essential interventions, especially during labor, the immediate postnatal period and for the treatment of serious infectious diseases and acute malnutrition. Tracking intervention coverage is a key activity to support scale-up and in this paper we examine priorities in coverage measurement, distinguishing between essential interventions that can be measured now and those that require methodological development. We conceptualized a typology of indicators related to intervention coverage that distinguishes access to care from receipt of an intervention by the population in need. We then built on documented evidence on coverage measurement to determine the status of indicators for essential interventions and to identify areas for development. Contact indicators from pregnancy to childhood were identified as current indicators for immediate use, but indicators reflecting the quality of care provided during these contacts need development. At each contact point, some essential interventions can be measured now, but the need for development of indicators predominates around interventions at the time of birth and interventions to treat infections. Addressing this need requires improvements in routine facility based data capture, methods for linking provider and community-based data, and improved guidance for effective coverage measurement that reflects the provision of high-quality care. Coverage indicators for some essential interventions can be measured accurately through household surveys and be used to track progress in maternal, newborn and child health. Other essential interventions currently rely on contact indicators as proxies for coverage but urgent attention is needed to identify new measurement approaches that directly and reliably measure their effective coverage.
Building laboratory capacity to support HIV care in Nigeria: Harvard/APIN PEPFAR, 2004–2012
Hamel, Donald J.; Sankalé, Jean-Louis; Samuels, Jay Osi; Sarr, Abdoulaye D.; Chaplin, Beth; Ofuche, Eke; Meloni, Seema T.; Okonkwo, Prosper; Kanki, Phyllis J.
2015-01-01
Introduction From 2004–2012, the Harvard/AIDS Prevention Initiative in Nigeria, funded through the US President’s Emergency Plan for AIDS Relief programme, scaled up HIV care and treatment services in Nigeria. We describe the methodologies and collaborative processes developed to improve laboratory capacity significantly in a resource-limited setting. These methods were implemented at 35 clinic and laboratory locations. Methods Systems were established and modified to optimise numerous laboratory processes. These included strategies for clinic selection and management, equipment and reagent procurement, supply chains, laboratory renovations, equipment maintenance, electronic data management, quality development programmes and trainings. Results Over the eight-year programme, laboratories supported 160 000 patients receiving HIV care in Nigeria, delivering over 2.5 million test results, including regular viral load quantitation. External quality assurance systems were established for CD4+ cell count enumeration, blood chemistries and viral load monitoring. Laboratory equipment platforms were improved and standardised and use of point-of-care analysers was expanded. Laboratory training workshops supported laboratories toward increasing staff skills and improving overall quality. Participation in a World Health Organisation-led African laboratory quality improvement system resulted in significant gains in quality measures at five laboratories. Conclusions Targeted implementation of laboratory development processes, during simultaneous scale-up of HIV treatment programmes in a resource-limited setting, can elicit meaningful gains in laboratory quality and capacity. Systems to improve the physical laboratory environment, develop laboratory staff, create improvements to reduce costs and increase quality are available for future health and laboratory strengthening programmes. We hope that the strategies employed may inform and encourage the development of other laboratories in resource-limited settings. PMID:26900573
Space Launch System Base Heating Test: Experimental Operations & Results
NASA Technical Reports Server (NTRS)
Dufrene, Aaron; Mehta, Manish; MacLean, Matthew; Seaford, Mark; Holden, Michael
2016-01-01
NASA's Space Launch System (SLS) uses four clustered liquid rocket engines along with two solid rocket boosters. The interaction between all six rocket exhaust plumes will produce a complex and severe thermal environment in the base of the vehicle. This work focuses on a recent 2% scale, hot-fire SLS base heating test. These base heating tests are short-duration tests executed with chamber pressures near the full-scale values with gaseous hydrogen/oxygen engines and RSRMV analogous solid propellant motors. The LENS II shock tunnel/Ludwieg tube tunnel was used at or near flight duplicated conditions up to Mach 5. Model development was based on the Space Shuttle base heating tests with several improvements including doubling of the maximum chamber pressures and duplication of freestream conditions. Test methodology and conditions are presented, and base heating results from 76 runs are reported in non-dimensional form. Regions of high heating are identified and comparisons of various configuration and conditions are highlighted. Base pressure and radiometer results are also reported.
Accounting for the cost of scaling-up health interventions.
Johns, Benjamin; Baltussen, Rob
2004-11-01
Recent studies such as the Commission on Macroeconomics and Health have highlighted the need for expanding the coverage of services for HIV/AIDS, malaria, tuberculosis, immunisations and other diseases. In order for policy makers to plan for these changes, they need to analyse the change in costs when interventions are 'scaled up' to cover greater percentages of the population. Previous studies suggest that applying current unit costs to an entire population can misconstrue the true costs of an intervention. This study presents the methodology used in WHO-CHOICE's generalised cost-effectiveness analysis, which includes non-linear cost functions for health centres, transportation and supervision costs, as well as the presence of fixed costs of establishing a health infrastructure. Results show changing marginal costs, as predicted by economic theory. 2004 John Wiley & Sons, Ltd.
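A minimal illustration of why constant unit costs mislead at scale: fixed programme costs and capacity-driven facility costs, combined with rising per-person delivery costs as harder-to-reach groups are covered, make average and marginal costs non-linear in coverage. The functional form and numbers below are invented for illustration and are not the WHO-CHOICE cost functions.

```python
import math

def total_cost(coverage, population=1_000_000):
    """Toy scale-up cost function: fixed programme cost, stepwise facility costs,
    and per-person delivery costs that rise as harder-to-reach groups are covered."""
    people = coverage * population
    fixed = 2_000_000.0                                   # programme set-up and supervision
    facilities = math.ceil(people / 50_000) * 150_000.0   # one health centre per 50,000 covered
    delivery = people * (4.0 + 6.0 * coverage ** 2)       # unit delivery cost grows with coverage
    return fixed + facilities + delivery

for c in (0.2, 0.5, 0.8, 0.95):
    avg = total_cost(c) / (c * 1_000_000)
    print(f"coverage {c:.0%}: average cost per person covered = {avg:5.2f} USD")
```

Under these toy assumptions, the average cost per person first falls as fixed costs are spread over more people and then rises again as marginal delivery costs climb near full coverage.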
The assessment of creativity in creativity/psychopathology research - a systematic review.
Thys, E; Sabbe, B; De Hert, M
2014-01-01
The possible link between creativity and psychopathology has been a long-standing focus of research up to the present day. However, this research is hampered by methodological problems, especially the definition and assessment of creativity. This makes interpretation and comparison of studies difficult and possibly accounts for the contradictory results of this research. In this systematic review of the literature, research articles in the field of creativity and psychopathology were searched for creativity assessment tools. The tools used in the collected articles are presented and discussed. The results indicate that a multitude of creativity assessment tools were used, that many studies only used one tool to assess creativity and that most of these tools were only used in a limited number of studies. A few assessment tools stand out through more frequent use, also outside psychopathological research, and more solid psychometric properties. Most scales used to evaluate creativity have poor psychometric properties. The scattered methodology for assessing creativity compromises the generalizability and validity of this research. The field should creatively develop new validated instruments.
Full-scale testing and progressive damage modeling of sandwich composite aircraft fuselage structure
NASA Astrophysics Data System (ADS)
Leone, Frank A., Jr.
A comprehensive experimental and computational investigation was conducted to characterize the fracture behavior and structural response of large sandwich composite aircraft fuselage panels containing artificial damage in the form of holes and notches. Full-scale tests were conducted where panels were subjected to quasi-static combined pressure, hoop, and axial loading up to failure. The panels were constructed using plain-weave carbon/epoxy prepreg face sheets and a Nomex honeycomb core. Panel deformation and notch tip damage development were monitored during the tests using several techniques, including optical observations, strain gages, digital image correlation (DIC), acoustic emission (AE), and frequency response (FR). Additional pretest and posttest inspections were performed via thermography, computer-aided tap tests, ultrasound, x-radiography, and scanning electron microscopy. The framework to simulate damage progression and to predict residual strength through use of the finite element (FE) method was developed. The DIC provided local and full-field strain fields corresponding to changes in the state-of-damage and identified the strain components driving damage progression. AE was monitored during loading of all panels and data analysis methodologies were developed to enable real-time determination of damage initiation, progression, and severity in large composite structures. The FR technique has been developed, evaluating its potential as a real-time nondestructive inspection technique applicable to large composite structures. Due to the large disparity in scale between the fuselage panels and the artificial damage, a global/local analysis was performed. The global FE models fully represented the specific geometries, composite lay-ups, and loading mechanisms of the full-scale tests. A progressive damage model was implemented in the local FE models, allowing the gradual failure of elements in the vicinity of the artificial damage. A set of modifications to the definitions of the local FE model boundary conditions is proposed and developed to address several issues related to the scalability of progressive damage modeling concepts, especially in regards to full-scale fuselage structures. Notable improvements were observed in the ability of the FE models to predict the strength of damaged composite fuselage structures. Excellent agreement has been established between the FE model predictions and the experimental results recorded by DIC, AE, FR, and visual observations.
The Contribution of Human Factors in Military System Development: Methodological Considerations
1980-07-01
Fragmentary excerpt (only section headings, reference fragments and glossary entries were recovered): evaluation techniques listed include Risk/Uncertainty Analysis, Project Scoring, Utility Scales, Relevance Tree Techniques (Reverse Factor Analysis) and Computer Simulation; references include Souder, W.E., on the effectiveness of mathematical models for R&D project selection (Management Science, April 1973) and a Souder paper on a scoring methodology (title truncated in source); glossary fragments cover proficiency test scores (written), radiation effects on aircrew performance in radiation environments, and reaction time.
Portable parallel stochastic optimization for the design of aeropropulsion components
NASA Technical Reports Server (NTRS)
Sues, Robert H.; Rhodes, G. S.
1994-01-01
This report presents the results of Phase 1 research to develop a methodology for performing large-scale Multi-disciplinary Stochastic Optimization (MSO) for the design of aerospace systems ranging from aeropropulsion components to complete aircraft configurations. The current research recognizes that such design optimization problems are computationally expensive, and require the use of either massively parallel or multiple-processor computers. The methodology also recognizes that many operational and performance parameters are uncertain, and that uncertainty must be considered explicitly to achieve optimum performance and cost. The objective of this Phase 1 research was to initiate the development of an MSO methodology that is portable to a wide variety of hardware platforms, while achieving efficient, large-scale parallelism when multiple processors are available. The first effort in the project was a literature review of available computer hardware, as well as a review of portable, parallel programming environments. The second effort was to implement the MSO methodology for an example problem using the portable parallel programming environment Parallel Virtual Machine (PVM). The third and final effort was to demonstrate the example on a variety of computers, including a distributed-memory multiprocessor, a distributed-memory network of workstations, and a single-processor workstation. Results indicate that the MSO methodology is well suited to large-scale aerospace design problems. Nearly perfect linear speedup was demonstrated for computation of optimization sensitivity coefficients on both a 128-node distributed-memory multiprocessor (the Intel iPSC/860) and a network of workstations (speedups of almost 19 times achieved for 20 workstations). Very high parallel efficiencies (75 percent for 31 processors and 60 percent for 50 processors) were also achieved for computation of aerodynamic influence coefficients on the Intel. Finally, the multi-level parallelization strategy that will be needed for large-scale MSO problems was demonstrated to be highly efficient. The same parallel code instructions were used on both platforms, demonstrating portability. There are many applications for which MSO can be applied, including NASA's High-Speed-Civil Transport, and advanced propulsion systems. The use of MSO will reduce design and development time and testing costs dramatically.
Wang, Xiao-Ling; Ding, Zhong-Yang; Zhao, Yan; Liu, Gao-Qiang; Zhou, Guo-Ying
2017-01-01
Triterpene acids are among the major bioactive constituents of Ganoderma lucidum (G. lucidum). However, submerged fermentation techniques for isolating triterpene acids from G. lucidum have not been optimized for commercial use, and the antitumor activity of the mycelial triterpene acids needs to be further proven. The aim of this work was to optimize the conditions for G. lucidum culture with respect to triterpene acid production, to scale up the process, and to examine the in vitro antitumor activity of mycelial triterpene acids. The key conditions (i.e., initial pH, fermentation temperature, and rotation speed) were optimized using response surface methodology, and the in vitro antitumor activity was evaluated using the MTT method. The optimum key fermentation conditions for triterpene acid production were pH 6.0; rotation speed, 161.9 rpm; and temperature, 30.1°C, resulting in a triterpene acid yield of 291.0 mg/L in the validation experiment in a 5-L stirred bioreactor; this yield represented a 70.8% increase in titer compared with the nonoptimized conditions. Furthermore, the optimized conditions were then successfully scaled up to a production scale of 200 L, and a triterpene productivity of 47.9 mg/L/day was achieved, which is, to our knowledge, the highest reported in the large-scale fermentation of G. lucidum. In addition, the mycelial triterpene acids were found to be cytotoxic to the SMMC-7721 and SW620 cell lines in vitro. Chemical analysis showed that the key active triterpene acid compounds, ganoderic acids T and Me, predominated in the extract, at 69.2 and 41.6 mg/g, respectively. Thus, this work develops a simple and feasible batch fermentation technique for the large-scale production of antitumor triterpene acids from G. lucidum.
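The response-surface step can be illustrated with a minimal sketch: fit a second-order polynomial to (pH, rotation speed, temperature) versus triterpene acid yield and locate its stationary maximum inside the experimental region. The design points and yields below are invented placeholders, not the study's measurements, and only the generic quadratic RSM model is assumed.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical design points (pH, rpm, temp C) and yields (mg/L); illustrative only.
X = np.array([
    [5.0, 140, 28], [5.0, 180, 28], [7.0, 140, 28], [7.0, 180, 28],
    [5.0, 140, 32], [5.0, 180, 32], [7.0, 140, 32], [7.0, 180, 32],
    [6.0, 160, 30], [6.0, 160, 30], [4.5, 160, 30], [7.5, 160, 30],
    [6.0, 130, 30], [6.0, 190, 30], [6.0, 160, 27], [6.0, 160, 33],
])
y = np.array([190, 205, 200, 210, 200, 220, 205, 215,
              285, 290, 170, 180, 210, 215, 200, 205], dtype=float)

def quad_features(X):
    """Full second-order model: intercept, linear, squared and interaction terms."""
    p, r, t = X[:, 0], X[:, 1], X[:, 2]
    return np.column_stack([np.ones(len(X)), p, r, t,
                            p**2, r**2, t**2, p*r, p*t, r*t])

beta, *_ = np.linalg.lstsq(quad_features(X), y, rcond=None)  # least-squares fit

def predicted_yield(x):
    return float(quad_features(np.atleast_2d(x)) @ beta)

# Search for the stationary maximum within the experimental ranges.
res = minimize(lambda x: -predicted_yield(x), x0=[6.0, 160.0, 30.0],
               bounds=[(4.5, 7.5), (130, 190), (27, 33)])
print("optimum (pH, rpm, temp):", np.round(res.x, 1),
      "predicted yield (mg/L):", round(predicted_yield(res.x), 1))
```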
ERIC Educational Resources Information Center
Wine, Jennifer; Janson, Natasha; Wheeless, Sara
2011-01-01
This report describes and evaluates the methods and procedures used in the 2004/09 Beginning Postsecondary Students Longitudinal Study (BPS:04/09). BPS:04/09 is the second and final follow-up interview for the cohort of first-time beginning postsecondary students identified in the 2004 National Postsecondary Student Aid Study. For the first time…
Flow Chemistry on Multigram Scale: Continuous Synthesis of Boronic Acids within 1 s.
Hafner, Andreas; Meisenbach, Mark; Sedelmeier, Joerg
2016-08-05
The benefits and limitations of a simple continuous flow setup for handling and performing organolithium chemistry on the multigram scale are described. The developed metalation platform embodies a valuable complement to existing methodologies, as it combines the benefits of Flash Chemistry (chemical synthesis on a time scale of <1 s) with remarkable throughput (g/min) while mitigating the risk of blockages.
Large Scale Density Estimation of Blue and Fin Whales (LSD)
2015-09-30
The goal of this research is to develop and implement a new density estimation methodology for quantifying blue and fin whale abundance from passive acoustic data recorded on sparse sensors, with the aim of being effective over large spatial scales.
Subramanian, Savitha; Naimoli, Joseph; Matsubayashi, Toru; Peters, David H
2011-12-14
There is widespread agreement on the need for scaling up in the health sector to achieve the Millennium Development Goals (MDGs). But many countries are not on track to reach the MDG targets. The dominant approach used by global health initiatives promotes uniform interventions and targets, assuming that specific technical interventions tested in one country can be replicated across countries to rapidly expand coverage. Yet countries scale up health services and progress against the MDGs at very different rates. Global health initiatives need to take advantage of what has been learned about scaling up. A systematic literature review was conducted to identify conceptual models for scaling up health in developing countries, with the articles assessed according to the practical concerns of how to scale up, including the planning, monitoring and implementation approaches. We identified six conceptual models for scaling up in health based on experience with expanding pilot projects and diffusion of innovations. They place importance on paying attention to enhancing organizational, functional, and political capabilities through experimentation and adaptation of strategies in addition to increasing the coverage and range of health services. These scaling up approaches focus on fostering sustainable institutions and the constructive engagement between end users and the provider and financing organizations. The current approaches to scaling up health services to reach the MDGs are overly simplistic and not working adequately. Rather than relying on blueprint planning and raising funds, an approach characteristic of current global health efforts, experience with alternative models suggests that more promising pathways involve "learning by doing" in ways that engage key stakeholders, use data to address constraints, and incorporate results from pilot projects. Such approaches should be applied to current strategies to achieve the MDGs.
Cömert, Musa; Zill, Jördis Maria; Christalle, Eva; Dirmaier, Jörg; Härter, Martin; Scholl, Isabelle
2016-01-01
Background Teaching and assessment of communication skills have become essential in medical education. The Objective Structured Clinical Examination (OSCE) has been found to be an appropriate means of assessing communication skills within medical education. Studies have demonstrated the importance of a valid assessment of medical students' communication skills. Yet, the validity of the performance scores depends fundamentally on the quality of the rating scales used in an OSCE. Thus, this systematic review aimed at providing an overview of existing rating scales, describing their underlying definition of communication skills, determining the methodological quality of psychometric studies and the quality of psychometric properties of the identified rating scales. Methods We conducted a systematic review to identify psychometrically tested rating scales, which have been applied in OSCE settings to assess communication skills of medical students. Our search strategy comprised three databases (EMBASE, PsycINFO, and PubMed), reference tracking and consultation of experts. We included studies that reported psychometric properties of communication skills assessment rating scales used in OSCEs by examiners only. The methodological quality of included studies was assessed using the COnsensus based Standards for the selection of health status Measurement INstruments (COSMIN) checklist. The quality of psychometric properties was evaluated using the quality criteria of Terwee and colleagues. Results Data of twelve studies reporting on eight rating scales on communication skills assessment in OSCEs were included. Five of eight rating scales were explicitly developed based on a specific definition of communication skills. The methodological quality of studies was mainly poor. The psychometric quality of the eight rating scales was mainly intermediate. Discussion Our results reveal that future psychometric evaluation studies focusing on improving the methodological quality are needed in order to yield psychometrically sound results of the OSCEs assessing communication skills. This is especially important given that most OSCE rating scales are used for summative assessment, and thus have an impact on medical students' academic success. PMID:27031506
Alcaraz, Mar; García-Gil, Alejandro; Vázquez-Suñé, Enric; Velasco, Violeta
2016-02-01
Borehole Heat Exchangers (BHEs) are increasingly being used to exploit shallow geothermal energy. This paper presents a new methodology to respond to the need for regional quantification of the geothermal potential that can be extracted by BHEs and of the associated environmental impacts. A set of analytical solutions facilitates accurate calculation of the heat exchange of BHEs with the ground and its environmental impacts. For the first time, advection and dispersion heat transport mechanisms and the temporal evolution from the start of operation of the BHE are taken into account in the regional estimation of shallow geothermal resources. This methodology is integrated in a GIS environment, which facilitates the management of input and output data at a regional scale. An example of the methodology's application is presented for Barcelona, Spain. As a result of the application, it is possible to show the strengths and improvements of this methodology in the development of potential maps of low temperature geothermal energy as well as maps of environmental impacts. The minimum and maximum energy potential values for the study site are 50 and 1800 W/m(2) for a drilled depth of 100 m, in proportion to the Darcy velocity. Regarding thermal impacts, the higher the groundwater velocity and the energy potential, the larger the thermal plume after 6 months of exploitation, with lengths ranging from 10 to 27 m. A sensitivity analysis was carried out in the calculation of the heat exchange rate and its impacts for different scenarios and for a wide range of Darcy velocities. The results of this analysis lead to the conclusion that considering dispersion effects and the temporal evolution of the exploitation prevents errors of up to a factor of 2.5 in the heat exchange rate and of up to several orders of magnitude in the predicted impacts. Copyright © 2015 Elsevier B.V. All rights reserved.
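A commonly used analytical solution for a BHE in uniform groundwater flow is the moving infinite line source; the sketch below uses its steady-state form to estimate the downstream extent of a thermal plume. The parameter values and the 0.5 K threshold are illustrative assumptions, and the paper's actual set of analytical solutions (which also covers transient behaviour and dispersion) is richer than this single formula.

```python
import numpy as np
from scipy.special import k0

# Illustrative parameters (not taken from the Barcelona case study).
q_L     = 30.0      # heat exchange rate per unit BHE length, W/m
lam     = 2.5       # bulk thermal conductivity of the aquifer, W/(m K)
rho_c_m = 2.8e6     # volumetric heat capacity of the saturated medium, J/(m3 K)
rho_c_w = 4.18e6    # volumetric heat capacity of water, J/(m3 K)
v_darcy = 1e-6      # Darcy velocity, m/s

a   = lam / rho_c_m                   # thermal diffusivity of the medium, m2/s
v_t = v_darcy * rho_c_w / rho_c_m     # effective heat transport velocity, m/s

def delta_T(x, y):
    """Steady-state moving infinite line source: temperature change at (x, y),
    with x measured along the groundwater flow direction from the BHE axis."""
    r = np.hypot(x, y)
    return q_L / (2.0 * np.pi * lam) * np.exp(v_t * x / (2.0 * a)) * k0(v_t * r / (2.0 * a))

# Downstream distance at which the disturbance drops below an assumed 0.5 K threshold.
xs = np.linspace(1.0, 60.0, 600)
plume = xs[delta_T(xs, 0.0) >= 0.5]
print("plume length at the 0.5 K threshold: %.1f m" % (plume[-1] if plume.size else 0.0))
```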
Lexchin, J; Holbrook, A
1994-01-01
OBJECTIVE: To evaluate the methodologic quality and relevance of references in pharmaceutical advertisements in the Canadian Medical Association Journal (CMAJ). DESIGN: Analytic study. DATA SOURCE: All 114 references cited in the first 22 distinct pharmaceutical advertisements in volume 146 of CMAJ. MAIN OUTCOME MEASURES: Mean methodologic quality score (modified from the 6-point scale used to assess articles in the American College of Physicians' Journal Club) and mean relevance score (based on a new 5-point scale) for all references in each advertisement. MAIN RESULTS: Twenty of the 22 companies responded, sending 78 (90%) of the 87 references requested. The mean methodologic quality score was 58% (95% confidence limits [CL] 51% and 65%) and the mean relevance score 76% (95% CL 72% and 80%). The two mean scores were statistically lower than the acceptable score of 80% (p < 0.05), and the methodologic quality score was outside the preset clinically significant difference of 15%. The poor rating for methodologic quality was primarily because of the citation of references to low-quality review articles and "other" sources (i.e., other than reports of clinical trials). Half of the advertisements had a methodologic quality score of less than 65%, but only five had a relevance score of less than 65%. CONCLUSIONS: Although the relevance of most of the references was within minimal acceptable limits, the methodologic quality was often unacceptable. Because advertisements are an important part of pharmaceutical marketing and education, we suggest that companies develop written standards for their advertisements and monitor their advertisements for adherence to these standards. We also suggest that the Pharmaceutical Advertising Advisory Board develop more stringent guidelines for advertising and that it enforce these guidelines in a consistent, rigorous fashion. PMID:8004560
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karasaki, K.; Galloway, D.
1991-06-01
The planned high-level nuclear waste repository at Yucca Mountain, Nevada, would exist in unsaturated, fractured welded tuff. One possible contaminant pathway to the accessible environment is transport by groundwater infiltrating to the water table and flowing through the saturated zone. Therefore, an effort to characterize the hydrology of the saturated zone is being undertaken in parallel with that of the unsaturated zone. As a part of the saturated zone investigation, three wells - UE-25c#1, UE-25c#2, and UE-25c#3 (hereafter called the c-holes) - were drilled to study hydraulic and transport properties of rock formations underlying the planned waste repository. The location of the c-holes is such that the formations penetrated in the unsaturated zone occur at similar depths and with similar thicknesses as at the planned repository site. In characterizing a highly heterogeneous flow system, several issues emerge. (1) The characterization strategy should allow for the virtual impossibility of enumerating and characterizing all heterogeneities. (2) The methodology to characterize the heterogeneous flow system at the scale of the well tests needs to be established. (3) Tools need to be developed for scaling up the information obtained at the well-test scale to the larger scale of the site. In the present paper, the characterization strategy and the methods under development are discussed with the focus on the design and analysis of the field experiments at the c-holes.
Quantitative studies of bird movement: A methodological review
Nichols, J.D.; Kaiser, A.
1999-01-01
The past several years have seen the development of a number of statistical models and methods for drawing inferences about bird movement using data from marked individuals. It can be difficult to keep up with this rapid development of new methods, so our purpose here is to categorize and review methods for drawing inferences about avian movement. We also outline recommendations about future work, dealing both with methodological developments and with studies directed at hypotheses about bird movement of interest from conservation, management, or ecological perspectives.
NASA Astrophysics Data System (ADS)
LaFrance, Monique; King, John W.; Oakley, Bryan A.; Pratt, Sheldon
2014-07-01
Recent interest in offshore renewable energy within the United States has amplified the need for marine spatial planning to direct management strategies and address competing user demands. To assist this effort in Rhode Island, benthic habitat classification maps were developed for two sites in offshore waters being considered for wind turbine installation. Maps characterizing and representing the distribution and extent of benthic habitats are valuable tools for improving understanding of ecosystem patterns and processes, and promoting scientifically-sound management decisions. This project presented the opportunity to conduct a comparison of the methodologies and resulting map outputs of two classification approaches, “top-down” and “bottom-up” in the two study areas. This comparison was undertaken to improve understanding of mapping methodologies and their applicability, including the bottom-up approach in offshore environments where data density tends to be lower, as well as to provide case studies for scientists and managers to consider for their own areas of interest. Such case studies can offer guidance for future work for assessing methodologies and translating them to other areas. The traditional top-down mapping approach identifies biological community patterns based on communities occurring within geologically defined habitat map units, under the concept that geologic environments contain distinct biological assemblages. Alternatively, the bottom-up approach aims to establish habitat map units centered on biological similarity and then uses statistics to identify relationships with associated environmental parameters and determine habitat boundaries. When applied to the two study areas, both mapping approaches produced habitat classes with distinct macrofaunal assemblages and each established statistically strong and significant biotic-abiotic relationships with geologic features, sediment characteristics, water depth, and/or habitat heterogeneity over various spatial scales. The approaches were also able to integrate various data at differing spatial resolutions. The classification outputs exhibited similar results, including the number of habitat classes generated, the number of species defining the classes, the level of distinction of the biological communities, and dominance by tube-building amphipods. These results indicate that both approaches are able to discern a comparable degree of habitat variability and produce cohesive macrofaunal assemblages. The mapping approaches identify broadly similar benthic habitats at the two study sites and methods were able to distinguish the differing levels of heterogeneity between them. The top-down approach to habitat classification was faster and simpler to accomplish with the data available in this study when compared to the bottom-up approach. Additionally, the top-down approach generated full-coverage habitat classes that are clearly delineated and can easily be interpreted by the map user, which is desirable from a management perspective for providing a more complete assessment of the areas of interest. However, a higher level of biological variability was noted in some of the habitat classes created, indicating that the biological communities present in this area are influenced by factors not captured in the broad-scale geological habitat units used in this approach. 
The bottom-up approach was valuable in its ability to more clearly define macrofaunal assemblages among habitats, discern finer-scale habitat characteristics, and directly assess the degree of macrofaunal assemblage variability captured by the environmental parameters. From a user perspective, the map is more complex, which may be perceived as a limitation, though it likely reflects natural gradations in habitat structure and presents a more ecologically realistic portrayal of the study areas. Though more comprehensive, the bottom-up approach in this study was limited by its reliance on full-coverage data to create full-coverage habitat classes. Such classes could only be developed when sediment data were excluded, since this point-sample dataset could not be interpolated due to the high spatial heterogeneity of the study areas. Given a higher density of bottom samples, this issue could be rectified. While the top-down approach was more appropriate for this study, both approaches were found to be suitable for mapping and classifying benthic habitats. In the United States, objectives for mapping and classification for renewable energy development have not been well established. Therefore, at this time, the best-suited approach primarily depends on mapping objectives, resource availability, data quality and coverage, and geographical location, as these factors impact the types of data included, the analyses and modeling that can be performed, and the biotic-abiotic relationships identified.
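The bottom-up step of grouping stations by faunal similarity before relating groups to environmental layers can be sketched as follows. Bray-Curtis dissimilarity with group-average hierarchical clustering is a common choice for this kind of community analysis; whether it matches the exact procedure used in the study is an assumption, and the abundance matrix here is synthetic.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Synthetic station-by-species abundance matrix (20 stations, 8 taxa).
abund = rng.poisson(lam=rng.uniform(0.5, 8.0, size=(20, 8)))

# Step 1: biological dissimilarity between stations (Bray-Curtis).
d = pdist(abund, metric="braycurtis")

# Step 2: group-average hierarchical clustering, cut into candidate habitat classes.
tree = linkage(d, method="average")
classes = fcluster(tree, t=4, criterion="maxclust")
print("station habitat classes:", classes)

# Step 3 (outline): relate the classes to mapped environmental layers
# (sediment type, depth, heterogeneity) to delineate habitat boundaries,
# e.g. with an ANOVA/BIO-ENV style analysis or a classification model.
```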
New well pattern optimization methodology in mature low-permeability anisotropic reservoirs
NASA Astrophysics Data System (ADS)
Qin, Jiazheng; Liu, Yuetian; Feng, Yueli; Ding, Yao; Liu, Liu; He, Youwei
2018-02-01
In China, many well patterns were designed before the principal permeability direction of low-permeability anisotropic reservoirs was known. After several years of production, it often turns out that the well line direction is not parallel to the principal permeability direction. However, traditional well location optimization methods (in terms of objective functions such as net present value and/or ultimate recovery) are inapplicable, since wells are not free to move around in a mature oilfield. Thus, the well pattern optimization (WPO) of mature low-permeability anisotropic reservoirs is a significant but challenging task, since the original well pattern (WP) will be distorted and reconstructed due to permeability anisotropy. In this paper, we investigate the destruction and reconstruction of the WP when the principal permeability direction and the well line direction are not parallel. A new methodology was developed to quantitatively optimize the well locations of a mature large-scale WP through a WPO algorithm on the basis of coordinate transformation (i.e. rotating and stretching). For a mature oilfield, the large-scale WP is settled, so it is not economically viable to carry out further infill drilling. This paper circumvents this difficulty by combining the WPO algorithm with well status (open or shut-in) and schedule adjustment. Finally, this methodology is applied to an example. Cumulative oil production rates of the optimized WP are higher, and water-cut is lower, which highlights the potential of the WPO methodology for application in mature large-scale field development projects.
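The coordinate transformation at the heart of such methods can be illustrated with a short sketch: rotate well coordinates so the axes align with the principal permeability directions, then stretch the low-permeability axis by the square root of the permeability ratio so the transformed medium behaves isotropically. This is the standard anisotropy transform; the angle, permeabilities, and well locations below are placeholders, and the paper's specific rotating-and-stretching algorithm may differ in detail.

```python
import numpy as np

def to_equivalent_isotropic(xy, theta_deg, k_max, k_min):
    """Map well coordinates into an equivalent isotropic plane.

    xy        : (n, 2) array of well locations
    theta_deg : angle of the maximum-permeability direction from the x-axis
    k_max, k_min : principal permeabilities
    """
    th = np.radians(theta_deg)
    rot = np.array([[np.cos(th), np.sin(th)],       # rotate so x' follows k_max
                    [-np.sin(th), np.cos(th)]])
    xy_rot = xy @ rot.T
    stretch = np.diag([1.0, np.sqrt(k_max / k_min)])  # stretch the across-trend axis
    return xy_rot @ stretch.T

# Hypothetical five-spot-like pattern misaligned with the permeability trend.
wells = np.array([[0, 0], [300, 0], [0, 300], [300, 300], [150, 150]], dtype=float)
print(to_equivalent_isotropic(wells, theta_deg=30.0, k_max=50.0, k_min=10.0))
```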
Student Life Balance: Myth or Reality?
ERIC Educational Resources Information Center
Doble, Niharika; Supriya, M. V.
2011-01-01
Purpose: Student life stress, student family conflict and student life balance are issues that are scarcely researched. This paper aims to develop a scale for assessing the concept of student life balance. Design/methodology/approach: The study evaluated a 54-item scale for assessing the construct. The data are obtained from 612 Indian students.…
Evaluation of Scale Reliability with Binary Measures Using Latent Variable Modeling
ERIC Educational Resources Information Center
Raykov, Tenko; Dimitrov, Dimiter M.; Asparouhov, Tihomir
2010-01-01
A method for interval estimation of scale reliability with discrete data is outlined. The approach is applicable with multi-item instruments consisting of binary measures, and is developed within the latent variable modeling methodology. The procedure is useful for evaluation of consistency of single measures and of sum scores from item sets…
On the Cross-Country Comparability of Indicators of Socioeconomic Resources in PISA
ERIC Educational Resources Information Center
Pokropek, Artur; Borgonovi, Francesca; McCormick, Carina
2017-01-01
Large-scale international assessments rely on indicators of the resources that students report having in their homes to capture the financial capital of their families. The scaling methodology currently used to develop the Programme for International Student Assessment (PISA) background indices is designed to maximize within-country comparability…
ERIC Educational Resources Information Center
Park, Joonwook; Desarbo, Wayne S.; Liechty, John
2008-01-01
Multidimensional scaling (MDS) models for the analysis of dominance data have been developed in the psychometric and classification literature to simultaneously capture subjects' "preference heterogeneity" and the underlying dimensional structure for a set of designated stimuli in a parsimonious manner. There are two major types of latent utility…
Evaluation of Weighted Scale Reliability and Criterion Validity: A Latent Variable Modeling Approach
ERIC Educational Resources Information Center
Raykov, Tenko
2007-01-01
A method is outlined for evaluating the reliability and criterion validity of weighted scales based on sets of unidimensional measures. The approach is developed within the framework of latent variable modeling methodology and is useful for point and interval estimation of these measurement quality coefficients in counseling and education…
NASA Astrophysics Data System (ADS)
Ozevin, Didem; Fazel, Hossein; Cox, Justin; Hardman, William; Kessler, Seth S.; Timmons, Alan
2014-04-01
Gearbox components of aerospace structures are typically made of brittle materials with high fracture toughness, but susceptible to fatigue failure due to continuous cyclic loading. Structural Health Monitoring (SHM) methods are used to monitor the crack growth in gearbox components. Damage detection methodologies developed in laboratory-scale experiments may not represent the actual gearbox structural configuration, and are usually not applicable to real applications, as the vibration and wave properties depend on the material, structural layers and thicknesses. Also, the sensor types and locations are key factors for the frequency content of ultrasonic waves, which are essential features for pattern recognition algorithm development in noisy environments. Therefore, a deterministic damage detection methodology that considers all the variables influencing the waveform signature should be considered in the preliminary computation before any experimental test matrix. In order to achieve this goal, we developed two-dimensional finite element models of a gearbox cross section (front view) and shaft section. The cross section model consists of steel revolving teeth, a thin layer of oil, and a retention plate. An ultrasonic wave with frequency content up to 1 MHz is generated, and waveform histories along the gearbox are recorded. The received waveforms under pristine and cracked conditions are compared in order to analyze the crack influence on the wave propagation in the gearbox, which can be utilized by both active and passive SHM methods.
Massetti, Greta M; Simon, Thomas R; Smith, Deborah Gorman
2016-10-01
Drawing on research that has identified specific predictors and trajectories of risk for violence and related negative outcomes, a multitude of small- and large-scale preventive interventions for specific risk behaviors have been developed, implemented, and evaluated. One of the principal challenges of these approaches is that a number of separate problem-specific programs targeting different risk areas have emerged. However, as many negative health behaviors such as substance abuse and violence share a multitude of risk factors, many programs target identical risk factors. There are opportunities to understand whether evidence-based programs can be leveraged for potential effects across a spectrum of outcomes and over time. Some recent work has documented longitudinal effects of evidence-based interventions on generalized outcomes. This work has potential for advancing our understanding of the effectiveness of promising and evidence-based prevention strategies. However, conducting longitudinal follow-up of established interventions presents a number of methodological and design challenges. To answer some of these questions, the Centers for Disease Control and Prevention convened a panel of multidisciplinary experts to discuss opportunities to take advantage of evaluations of early prevention programs and evaluating multiple long-term outcomes. This special section of the journal Prevention Science includes a series of papers that begin to address the relevant considerations for conducting longitudinal follow-up evaluation research. This collection of papers is intended to inform our understanding of the challenges and strategies for conducting longitudinal follow-up evaluation research that could be used to drive future research endeavors.
NASA Astrophysics Data System (ADS)
Forte, J.; Brilha, J.; Pereira, D.; Nolasco, M.
2012-04-01
Although geodiversity is considered the setting for biodiversity, there is still a huge gap in the social recognition of these two concepts. The concept of geodiversity, less developed, is now making its own way as a robust and fundamental idea concerning the abiotic component of nature. From a conservationist point of view, the lack of a broader knowledge concerning the type and spatial variation of geodiversity, as well as its relationship with biodiversity, makes the protection and management of natural or semi-natural areas incomplete. There is a growing need to understand the patterns of geodiversity in different landscapes and to translate this knowledge into territorial management from a practical and effective point of view. This kind of management can also represent an important tool for the development of sustainable tourism, particularly geotourism, which can bring benefits not only for the environment, but also for social and economic purposes. The quantification of geodiversity is an important step in all this process, but still few researchers are investing in the development of a proper methodology. The assessment methodologies published so far are mainly focused on the evaluation of geomorphological elements, sometimes complemented with information about lithology, soils, hydrology, morphometric variables, climatic surfaces and geosites. This results in very dissimilar areas at very different spatial scales, showing the complexity of the task and the need for further research. The current work aims to develop an effective methodology for assessing as many elements of geodiversity as possible (rocks, minerals, fossils, landforms, soils), based on GIS routines. The main determinant factor for the quantitative assessment is scale, but other factors are also very important, such as the existence of suitable spatial data with a sufficient degree of detail. The aim is to establish proper procedures for assessing geodiversity at different scales and for producing maps with the spatial representation of the geodiversity index, which could be an invaluable contribution to land-use management.
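As an illustration of how such a GIS routine might tally geodiversity elements per assessment unit, the sketch below overlays categorical raster layers (lithology, landforms, soils) on a coarse grid and sums the number of distinct classes in each cell. It is a generic richness-type index under assumed inputs, not the specific procedure developed in the study; the cell size parameter makes the role of scale explicit.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic categorical layers on a 100 x 100 raster (class codes per pixel).
lithology = rng.integers(1, 6, size=(100, 100))
landforms = rng.integers(1, 8, size=(100, 100))
soils     = rng.integers(1, 5, size=(100, 100))
layers = [lithology, landforms, soils]

cell = 20  # assessment-unit size in pixels; scale is the key methodological choice

def geodiversity_index(layers, cell):
    """Sum of the number of distinct classes of each layer within each grid cell."""
    ny, nx = layers[0].shape
    index = np.zeros((ny // cell, nx // cell), dtype=int)
    for i in range(ny // cell):
        for j in range(nx // cell):
            win = (slice(i * cell, (i + 1) * cell), slice(j * cell, (j + 1) * cell))
            index[i, j] = sum(len(np.unique(layer[win])) for layer in layers)
    return index

print(geodiversity_index(layers, cell))
```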
Research with Children: Methodological Issues and Innovative Techniques
ERIC Educational Resources Information Center
Fargas-Malet, Montserrat; McSherry, Dominic; Larkin, Emma; Robinson, Clive
2010-01-01
In the past few decades, a growing body of literature examining children's perspectives on their own lives has developed within a variety of disciplines, such as sociology, psychology, anthropology and geography. This article provides a brief up-to-date examination of methodological and ethical issues that researchers may need to consider when…
Approaches for advancing scientific understanding of macrosystems
Levy, Ofir; Ball, Becky A.; Bond-Lamberty, Ben; Cheruvelil, Kendra S.; Finley, Andrew O.; Lottig, Noah R.; Surangi W. Punyasena,; Xiao, Jingfeng; Zhou, Jizhong; Buckley, Lauren B.; Filstrup, Christopher T.; Keitt, Tim H.; Kellner, James R.; Knapp, Alan K.; Richardson, Andrew D.; Tcheng, David; Toomey, Michael; Vargas, Rodrigo; Voordeckers, James W.; Wagner, Tyler; Williams, John W.
2014-01-01
The emergence of macrosystems ecology (MSE), which focuses on regional- to continental-scale ecological patterns and processes, builds upon a history of long-term and broad-scale studies in ecology. Scientists face the difficulty of integrating the many elements that make up macrosystems, which consist of hierarchical processes at interacting spatial and temporal scales. Researchers must also identify the most relevant scales and variables to be considered, the required data resources, and the appropriate study design to provide the proper inferences. The large volumes of multi-thematic data often associated with macrosystem studies typically require validation, standardization, and assimilation. Finally, analytical approaches need to describe how cross-scale and hierarchical dynamics and interactions relate to macroscale phenomena. Here, we elaborate on some key methodological challenges of MSE research and discuss existing and novel approaches to meet them.
NASA Astrophysics Data System (ADS)
Agarwal, Smriti; Bisht, Amit Singh; Singh, Dharmendra; Pathak, Nagendra Prasad
2014-12-01
Millimetre wave (MMW) imaging is gaining tremendous interest among researchers and has potential applications in security checks, standoff personal screening, automotive collision avoidance, and more. Current state-of-the-art imaging techniques, viz. microwave and X-ray imaging, suffer from lower resolution and harmful ionizing radiation, respectively. In contrast, MMW imaging operates at lower power and is non-ionizing, and hence medically safe. Despite these favourable attributes, MMW imaging encounters various challenges: it is still a relatively unexplored area and lacks a suitable imaging methodology for extracting complete target information. Keeping these challenges in view, an MMW active imaging radar system at 60 GHz was designed for standoff imaging applications. A C-scan (horizontal and vertical scanning) methodology was developed that provides a cross-range resolution of 8.59 mm. The paper further details a suitable target identification and classification methodology. For identification of regular-shape targets, a mean-standard deviation based segmentation technique was formulated and further validated using a different target shape. For classification, a probability density function based target material discrimination methodology was proposed and further validated on a different dataset. Lastly, a novel artificial neural network based scale- and rotation-invariant image reconstruction methodology has been proposed to counter distortions in the image caused by noise, rotation or scale variations. The designed neural network, once trained with sample images, automatically takes care of these deformations and successfully reconstructs the corrected image for the test targets. Techniques developed in this paper are tested and validated using four different regular shapes, viz. rectangle, square, triangle and circle.
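The mean-standard deviation segmentation idea can be sketched in a few lines: pixels whose intensity exceeds the image mean by more than a chosen multiple of the standard deviation are flagged as target. The synthetic image and the threshold factor k are illustrative assumptions, not the paper's data or tuned parameter.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic C-scan intensity image: background noise plus a brighter rectangular target.
img = rng.normal(0.2, 0.05, size=(64, 64))
img[20:40, 25:45] += 0.6

def mean_std_segment(image, k=2.0):
    """Flag pixels more than k standard deviations above the image mean."""
    mu, sigma = image.mean(), image.std()
    return image > mu + k * sigma

mask = mean_std_segment(img, k=2.0)
print("target pixels found:", int(mask.sum()), "of", mask.size)
```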
Yeager, David S.; Romero, Carissa; Paunesku, Dave; Hulleman, Christopher S.; Schneider, Barbara; Hinojosa, Cintia; Lee, Hae Yeon; O’Brien, Joseph; Flint, Kate; Roberts, Alice; Trott, Jill; Greene, Daniel; Walton, Gregory M.; Dweck, Carol S.
2016-01-01
There are many promising psychological interventions on the horizon, but there is no clear methodology for preparing them to be scaled up. Drawing on design thinking, the present research formalizes a methodology for redesigning and tailoring initial interventions. We test the methodology using the case of fixed versus growth mindsets during the transition to high school. Qualitative inquiry and rapid, iterative, randomized “A/B” experiments were conducted with ~3,000 participants to inform intervention revisions for this population. Next, two experimental evaluations showed that the revised growth mindset intervention was an improvement over previous versions in terms of short-term proxy outcomes (Study 1, N=7,501), and it improved 9th grade core-course GPA and reduced D/F GPAs for lower achieving students when delivered via the Internet under routine conditions with ~95% of students at 10 schools (Study 2, N=3,676). Although the intervention could still be improved even further, the current research provides a model for how to improve and scale interventions that begin to address pressing educational problems. It also provides insight into how to teach a growth mindset more effectively. PMID:27524832
Cavitation Inside High-Pressure Optically Transparent Fuel Injector Nozzles
NASA Astrophysics Data System (ADS)
Falgout, Z.; Linne, M.
2015-12-01
Nozzle-orifice flow and cavitation have an important effect on primary breakup of sprays. For this reason, a number of studies in recent years have used injectors with optically transparent nozzles so that orifice flow cavitation can be examined directly. Many of these studies use injection pressures scaled down from realistic injection pressures used in modern fuel injectors, and so the geometry must be scaled up so that the Reynolds number can be matched with the industrial applications of interest. A relatively small number of studies have shown results at or near the injection pressures used in real systems. Unfortunately, neither the specifics of the design of the optical nozzle nor the design methodology used is explained in detail in these papers. Here, a methodology demonstrating how to prevent failure of a finished design made from commonly used optically transparent materials will be explained in detail, and a description of a new design for transparent nozzles which minimizes size and cost will be shown. The design methodology combines Finite Element Analysis with relevant materials science to evaluate the potential for failure of the finished assembly. Finally, test results imaging a cavitating flow at elevated pressures are presented.
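The scale-up logic mentioned above can be made concrete with a short calculation: for a pressure-driven orifice flow the Bernoulli velocity scales as sqrt(2 dP / rho), so matching the Reynolds number of a full-pressure injector at a reduced test pressure requires enlarging the orifice diameter in proportion to sqrt(dP_real / dP_lab). The fluid properties and pressures below are illustrative assumptions, not values from the study.

```python
import math

rho = 830.0     # fuel density, kg/m3 (assumed diesel-like surrogate)
mu  = 2.6e-3    # dynamic viscosity, Pa s
d_real = 150e-6          # real orifice diameter, m
dp_real = 2000e5         # real injection pressure drop, Pa (2000 bar)
dp_lab  = 300e5          # reduced laboratory pressure drop, Pa (300 bar)

def bernoulli_velocity(dp):
    return math.sqrt(2.0 * dp / rho)

def reynolds(dp, d):
    return rho * bernoulli_velocity(dp) * d / mu

# Diameter needed at lab pressure to match the real-scale Reynolds number:
d_lab = d_real * math.sqrt(dp_real / dp_lab)   # since Re ~ sqrt(dp) * d
print("Re (real):   %.0f" % reynolds(dp_real, d_real))
print("Re (scaled): %.0f  with d = %.0f micron" % (reynolds(dp_lab, d_lab), d_lab * 1e6))
```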
Scaling Relations and Self-Similarity of 3-Dimensional Reynolds-Averaged Navier-Stokes Equations.
Ercan, Ali; Kavvas, M Levent
2017-07-25
Scaling conditions to achieve self-similar solutions of the 3-Dimensional (3D) Reynolds-Averaged Navier-Stokes Equations, as an initial and boundary value problem, are obtained by utilizing the Lie group of point scaling transformations. By means of an open-source Navier-Stokes solver and the derived self-similarity conditions, we demonstrated self-similarity within the time variation of flow dynamics for a rigid-lid cavity problem under both up-scaled and down-scaled domains. The strength of the proposed approach lies in its ability to account for the underlying flow dynamics not only through the governing equations under consideration but also through the initial and boundary conditions, hence allowing perfect self-similarity to be obtained at different time and space scales. The proposed methodology can be a valuable tool for obtaining self-similar flow dynamics at a preferred level of detail, represented by initial and boundary value problems under specific assumptions.
Pressure Decay Testing Methodology for Quantifying Leak Rates of Full-Scale Docking System Seals
NASA Technical Reports Server (NTRS)
Dunlap, Patrick H., Jr.; Daniels, Christopher C.; Wasowski, Janice L.; Garafolo, Nicholas G.; Penney, Nicholas; Steinetz, Bruce M.
2010-01-01
NASA is developing a new docking system to support future space exploration missions to low-Earth orbit and the Moon. This system, called the Low Impact Docking System, is a mechanism designed to connect the Orion Crew Exploration Vehicle to the International Space Station, the lunar lander (Altair), and other future Constellation Project vehicles. NASA Glenn Research Center is playing a key role in developing the main interface seal for this docking system. This seal will be relatively large with an outside diameter in the range of 54 to 58 in. (137 to 147 cm). As part of this effort, a new test apparatus has been designed, fabricated, and installed to measure leak rates of candidate full-scale seals under simulated thermal, vacuum, and engagement conditions. Using this test apparatus, a pressure decay testing and data processing methodology has been developed to quantify full-scale seal leak rates. Tests performed on untreated 54 in. diameter seals at room temperature in a fully compressed state resulted in leak rates lower than the requirement of less than 0.0025 lbm of air per day (0.0011 kg/day).
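The pressure decay measurement itself reduces to an ideal-gas mass balance on the sealed test volume: the trapped air mass is m = P V / (R_air T), so the average leak rate follows from the measured pressure drop over the test period. The volume, temperature, and decay values below are illustrative assumptions, not the actual test article's.

```python
R_AIR = 287.05        # specific gas constant of air, J/(kg K)
KG_TO_LBM = 2.20462

# Illustrative test conditions (assumed, not the actual hardware values).
volume = 0.05         # trapped test volume, m3
temp = 293.0          # gas temperature, K
p_start = 101325.0    # Pa at the start of the decay period
p_end = 101300.0      # Pa at the end of the decay period
elapsed_hours = 24.0

def leak_rate_lbm_per_day(p0, p1, hours, V, T):
    """Average air leak rate from a pressure decay, assuming ideal gas and constant T."""
    dm = (p0 - p1) * V / (R_AIR * T)            # kg of air lost over the period
    return dm * KG_TO_LBM * (24.0 / hours)      # lbm of air per day

rate = leak_rate_lbm_per_day(p_start, p_end, elapsed_hours, volume, temp)
print("leak rate: %.5f lbm air/day (requirement: < 0.0025)" % rate)
```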
Nanowire nanocomputer as a finite-state machine.
Yao, Jun; Yan, Hao; Das, Shamik; Klemic, James F; Ellenbogen, James C; Lieber, Charles M
2014-02-18
Implementation of complex computer circuits assembled from the bottom up and integrated on the nanometer scale has long been a goal of electronics research. It requires a design and fabrication strategy that can address individual nanometer-scale electronic devices, while enabling large-scale assembly of those devices into highly organized, integrated computational circuits. We describe how such a strategy has led to the design, construction, and demonstration of a nanoelectronic finite-state machine. The system was fabricated using a design-oriented approach enabled by a deterministic, bottom-up assembly process that does not require individual nanowire registration. This methodology allowed construction of the nanoelectronic finite-state machine through modular design using a multitile architecture. Each tile/module consists of two interconnected crossbar nanowire arrays, with each cross-point consisting of a programmable nanowire transistor node. The nanoelectronic finite-state machine integrates 180 programmable nanowire transistor nodes in three tiles or six total crossbar arrays, and incorporates both sequential and arithmetic logic, with extensive intertile and intratile communication that exhibits rigorous input/output matching. Our system realizes the complete 2-bit logic flow and clocked control over state registration that are required for a finite-state machine or computer. The programmable multitile circuit was also reprogrammed to a functionally distinct 2-bit full adder with 32-set matched and complete logic output. These steps forward and the ability of our unique design-oriented deterministic methodology to yield more extensive multitile systems suggest that proposed general-purpose nanocomputers can be realized in the near future.
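The "32-set matched and complete logic output" reflects the fact that a 2-bit full adder has five input bits (two 2-bit operands plus a carry-in), i.e. 32 input combinations. A short sketch of that logic function, purely to illustrate what the reprogrammed circuit realizes, not how the nanowire hardware implements it:

```python
from itertools import product

def two_bit_full_adder(a, b, cin):
    """Add two 2-bit numbers plus a carry-in; return the 2-bit sum and carry-out."""
    total = a + b + cin
    return total & 0b11, (total >> 2) & 0b1

# Enumerate all input combinations (a: 0-3, b: 0-3, cin: 0-1) -> 4 * 4 * 2 = 32 rows.
rows = [(a, b, cin, *two_bit_full_adder(a, b, cin))
        for a, b, cin in product(range(4), range(4), range(2))]
print(len(rows), "input combinations")                      # -> 32
print("a=3, b=3, cin=1 ->", two_bit_full_adder(3, 3, 1))    # (3, 1): sum=3, carry=1
```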
A simple landslide susceptibility analysis for hazard and risk assessment in developing countries
NASA Astrophysics Data System (ADS)
Guinau, M.; Vilaplana, J. M.
2003-04-01
In recent years, a number of techniques and methodologies have been developed for mitigating natural disasters. The complexity of these methodologies and the scarcity of material and data series justify the need for simple methodologies to obtain the information necessary for minimising the effects of catastrophic natural phenomena. Work with polygonal maps in a GIS allowed us to develop a simple methodology, applied in an area of 473 km2 in the Departamento de Chinandega (NW Nicaragua). This area was severely affected by a large number of landslides (mainly debris flows), triggered by the Hurricane Mitch rainfalls in October 1998. With the aid of aerial photograph interpretation at 1:40,000 scale, enlarged to 1:20,000, and detailed field work, a landslide map at 1:10,000 scale was constructed. The failure zones of landslides were digitized in order to obtain a failure zone digital map. A terrain unit digital map, in which a series of physical-environmental terrain factors are represented, was also used. Dividing the studied area into two zones (A and B) with homogeneous physical and environmental characteristics allows us to develop the proposed methodology and to validate it. In zone A, the failure zone digital map is superimposed onto the terrain unit digital map to establish the relationship between the different terrain factors and the failure zones. The numerical expression of this relationship enables us to classify the terrain by its landslide susceptibility. In zone B, this numerical relationship was employed to obtain a landslide susceptibility map, obviating the need for a failure zone map. The validity of the methodology can be tested in this area by using the degree of superposition of the susceptibility map and the failure zone map. The implementation of the methodology in tropical countries with physical and environmental characteristics similar to those of the study area allows us to carry out a landslide susceptibility analysis in areas where landslide records do not exist. This analysis is essential to landslide hazard and risk assessment, which is necessary to determine the actions for mitigating landslide effects, e.g. land planning, emergency aid actions, etc.
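One common way to express the relationship between failure zones and terrain-factor classes numerically is the frequency ratio: the share of failure area in a class divided by the share of total area in that class, which can then be applied to an area without landslide records. The abstract does not spell out its specific numerical relationship, so the sketch below is a generic illustration with made-up areas and only one terrain factor.

```python
# Hypothetical areas (km2) per slope class in calibration zone A; illustrative only.
class_area   = {"0-15 deg": 200.0, "15-30 deg": 180.0, ">30 deg": 93.0}
failure_area = {"0-15 deg": 0.5,   "15-30 deg": 4.0,   ">30 deg": 6.5}

total_area = sum(class_area.values())
total_fail = sum(failure_area.values())

# Frequency ratio per class: (failure share) / (area share); >1 means over-represented.
freq_ratio = {c: (failure_area[c] / total_fail) / (class_area[c] / total_area)
              for c in class_area}
print("frequency ratios:", {c: round(v, 2) for c, v in freq_ratio.items()})

# Applied to zone B: each terrain unit is scored by the ratios of its factor classes
# (only slope is shown; lithology, land cover, etc. would be added the same way).
unit_slope_class = {"unit_1": "0-15 deg", "unit_2": ">30 deg", "unit_3": "15-30 deg"}
susceptibility = {u: freq_ratio[c] for u, c in unit_slope_class.items()}
print("susceptibility scores:", {u: round(v, 2) for u, v in susceptibility.items()})
```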
Weighted Ensemble Simulation: Review of Methodology, Applications, and Software.
Zuckerman, Daniel M; Chong, Lillian T
2017-05-22
The weighted ensemble (WE) methodology orchestrates quasi-independent parallel simulations run with intermittent communication that can enhance sampling of rare events such as protein conformational changes, folding, and binding. The WE strategy can achieve superlinear scaling: the unbiased estimation of key observables such as rate constants and equilibrium state populations to greater precision than would be possible with ordinary parallel simulation. WE software can be used to control any dynamics engine, such as standard molecular dynamics and cell-modeling packages. This article reviews the theoretical basis of WE and goes on to describe successful applications to a number of complex biological processes: protein conformational transitions, (un)binding, and assembly processes, as well as cell-scale processes in systems biology. We furthermore discuss the challenges that need to be overcome in the next phase of WE methodological development. Overall, the combined advances in WE methodology and software have enabled the simulation of long-timescale processes that would otherwise not be practical on typical computing resources using standard simulation.
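The core WE bookkeeping (splitting and merging weighted trajectories within bins so that each bin carries a fixed number of walkers while total probability is conserved) can be sketched as below. This is a deliberately simplified resampling variant for illustration only; production WE codes use more careful split/merge rules, and the bin definition and dynamics here are toy placeholders.

```python
import random

def resample_bin(walkers, target):
    """walkers: list of (weight, state). Return `target` walkers whose weights
    sum to the same bin probability, via weighted sampling with replacement."""
    total = sum(w for w, _ in walkers)
    if total == 0.0:
        return []
    states = [s for _, s in walkers]
    probs = [w / total for w, _ in walkers]
    chosen = random.choices(states, weights=probs, k=target)
    return [(total / target, s) for s in chosen]   # equal weights, probability conserved

def we_iteration(walkers, bin_of, target_per_bin):
    """One split/merge step: group walkers by bin, then resample each occupied bin."""
    bins = {}
    for w, s in walkers:
        bins.setdefault(bin_of(s), []).append((w, s))
    out = []
    for members in bins.values():
        out.extend(resample_bin(members, target_per_bin))
    return out

# Toy example: 1-D coordinate, bins of width 1.0, 4 walkers kept per occupied bin.
random.seed(0)
walkers = [(0.25, 0.1), (0.25, 0.4), (0.25, 1.2), (0.25, 2.7)]
walkers = we_iteration(walkers, bin_of=lambda x: int(x), target_per_bin=4)
print(len(walkers), "walkers, total weight =", round(sum(w for w, _ in walkers), 6))
```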
NASA Technical Reports Server (NTRS)
Huston, R. J. (Compiler)
1982-01-01
The establishment of a realistic plan for NASA and the U.S. helicopter industry to develop a design-for-noise methodology, including plans for the identification and development of promising noise reduction technology, was discussed. Topics included: noise reduction techniques, scaling laws, empirical noise prediction, psychoacoustics, and methods of developing and validating noise prediction methods.
Development and Initial Results of a Longitudinal Secondary Follow-Up Study.
ERIC Educational Resources Information Center
Levin, Benjamin
1984-01-01
Reviews the literature and difficulties of school follow-up studies. Describes the purpose, design, and methodology of the Peel Secondary Follow-up study. Shows how results from the first round of the study raise important issues about students' expectations and how they are or are not borne out. (SB)
Butt, Michelle L; Pinelli, Janet; Boyle, Michael H; Thomas, Helen; Hunsberger, Mabel; Saigal, Saroj; Lee, David S; Fanning, Jamie K; Austin, Patricia
2009-02-01
The goal of this study was to develop and subsequently evaluate the psychometric properties of a new discriminative instrument to measure parental satisfaction with the quality of care provided in neonatal follow-up (NFU) programs. The methodological framework for developing and evaluating measurement scales described by Streiner and Norman (Health Measurement Scales: A Practical Guide to Their Development and Use. 3rd ed. New York: Oxford University Press; 2003) was used for the study. Informing the phases of the research was a sample of 24 health care professionals and 381 parents who use NFU services. A comprehensive list of items representing the construct, parental satisfaction with quality of care, was generated from published reliable and valid instruments, research studies, focus groups with health care experts, and focus groups with parents. Using a clinimetric approach, the 62 items generated were reduced to 39 items based on parents' ratings of importance and refinement of the items by the research team. After content validation and pretesting, the instrument was tested with parents and underwent item-analysis. The resulting 16-item instrument was composed of 2 subscales, Process and Outcomes. Evaluation of the instrument's psychometric properties indicated adequate test-retest reliability (intraclass correlation coefficient = 0.72) and internal consistency (Process subscale, alpha = 0.77; Outcomes subscale, alpha = 0.90; overall instrument, alpha = 0.90), as well as good content and construct validity. A confirmatory factor analysis supported the multidimensionality of the construct. This new instrument provides clinicians and policy-makers with a tool to assess parental satisfaction with the quality of care in NFU, so areas of dissatisfaction can be identified and changes implemented to optimize service provision.
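For readers unfamiliar with the internal-consistency statistic reported above, Cronbach's alpha for a k-item subscale is k/(k-1) * (1 - sum of item variances / variance of the total score). The sketch below computes it on a synthetic response matrix; the data are invented and are not from the study's parent sample.

```python
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of item scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

rng = np.random.default_rng(3)
true_satisfaction = rng.normal(0, 1, size=200)    # latent construct (synthetic)
# Eight correlated 5-point items driven by the latent score plus noise.
items = np.clip(np.round(3 + true_satisfaction[:, None] + rng.normal(0, 0.8, (200, 8))), 1, 5)
print("Cronbach's alpha: %.2f" % cronbach_alpha(items))
```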
NASA Astrophysics Data System (ADS)
Jing, B. Y.; Wu, L.; Mao, H. J.; Gong, S. L.; He, J. J.; Zou, C.; Song, G. H.; Li, X. Y.; Wu, Z.
2015-10-01
As vehicle ownership and frequency of use increase, vehicle emissions have become an important source of air pollution in Chinese cities. An accurate emission inventory for on-road vehicles is necessary for numerical air quality simulation and the assessment of implementation strategies. This paper presents a bottom-up methodology based on local emission factors, complemented with the widely used emission factors of the Computer Programme to Calculate Emissions from Road Transport (COPERT) model, and near real time (NRT) traffic data on road segments to develop a high temporal-spatial resolution vehicle emission inventory (HTSVE) for the urban Beijing area. To simulate real-world vehicle emissions accurately, each road has been divided into segments according to the driving cycle (traffic speed) on that segment. The results show that the vehicle emissions of NOx, CO, HC and PM were 10.54 × 10^4, 42.51 × 10^4, 2.13 × 10^4 and 0.41 × 10^4 Mg, respectively. The vehicle emissions and fuel consumption estimated by the model were then compared with the China Vehicle Emission Control Annual Report and with fuel sales. The grid-based emissions were also compared with the vehicular emission inventory developed by the macro-scale approach. This comparison indicates that the bottom-up approach better estimates the levels and spatial distribution of vehicle emissions than the macro-scale method, although it relies on more detailed input information. Additionally, an on-road vehicle emission inventory model and control effect assessment system for Beijing was established based on this study in a companion paper (He et al., 2015).
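The bottom-up accounting itself is a sum over road segments of a speed-dependent emission factor times traffic volume times segment length. A minimal sketch with invented segments and a placeholder emission-factor curve (standing in for the COPERT-style curves, not their actual coefficients) follows:

```python
def ef_nox_g_per_km(speed_kmh):
    """Placeholder speed-dependent NOx emission factor (g per vehicle-km).
    Stands in for a COPERT-style curve; not the actual coefficients."""
    return 1.2 - 0.012 * speed_kmh + 0.00012 * speed_kmh**2

# (segment length km, hourly traffic volume veh/h, mean travel speed km/h) - illustrative.
segments = [
    (1.8, 2400, 22.0),   # congested arterial
    (3.5, 1500, 45.0),   # urban main road
    (6.2,  900, 70.0),   # ring expressway
]

hourly_nox_kg = sum(length * volume * ef_nox_g_per_km(speed)
                    for length, volume, speed in segments) / 1000.0
print("NOx for this hour and these segments: %.1f kg" % hourly_nox_kg)
```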
The Relevancy of Large-Scale, Quantitative Methodologies in Middle Grades Education Research
ERIC Educational Resources Information Center
Mertens, Steven B.
2006-01-01
This article examines the relevancy of large-scale, quantitative methodologies in middle grades education research. Based on recommendations from national advocacy organizations, the need for more large-scale, quantitative research, combined with the application of more rigorous methodologies, is presented. Subsequent sections describe and discuss…
Zhang, Yong-Feng; Chiang, Hsiao-Dong
2017-09-01
A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.
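For orientation, the sketch below is a plain global-best particle swarm optimizer on a standard benchmark function; it illustrates only the PSO building block, not the consensus-based variant or the Trust-Tech and local-search stages the paper combines it with. All parameter values are generic defaults.

```python
import numpy as np

def pso(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best PSO on box constraints; bounds is a list of (lo, hi) per dimension."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds, dtype=float).T
    x = rng.uniform(lo, hi, (n_particles, lo.size))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(f, 1, x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

rastrigin = lambda z: 10 * z.size + np.sum(z ** 2 - 10 * np.cos(2 * np.pi * z))
print(pso(rastrigin, [(-5.12, 5.12)] * 5))
```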
Implications of construction method and spatial scale on measures of the built environment.
Strominger, Julie; Anthopolos, Rebecca; Miranda, Marie Lynn
2016-04-28
Research surrounding the built environment (BE) and health has resulted in inconsistent findings. Experts have identified the need to examine methodological choices, such as development and testing of BE indices at varying spatial scales. We sought to examine the impact of construction method and spatial scale on seven measures of the BE using data collected at two time points. The Children's Environmental Health Initiative conducted parcel-level assessments of 57 BE variables in Durham, NC (parcel N = 30,319). Based on a priori defined variable groupings, we constructed seven mutually exclusive BE domains (housing damage, property disorder, territoriality, vacancy, public nuisances, crime, and tenancy). Domain-based indices were developed according to four different index construction methods that differentially account for number of parcels and parcel area. Indices were constructed at the census block level and two alternative spatial scales that better depict the larger neighborhood context experienced by local residents: the primary adjacency community and secondary adjacency community. Spearman's rank correlation was used to assess if indices and relationships among indices were preserved across methods. Territoriality, public nuisances, and tenancy were weakly to moderately preserved across methods at the block level while all other indices were well preserved. Except for the relationships between public nuisances and crime or tenancy, and crime and housing damage or territoriality, relationships among indices were poorly preserved across methods. The number of indices affected by construction method increased as spatial scale increased, while the impact of construction method on relationships among indices varied according to spatial scale. We found that the impact of construction method on BE measures was index and spatial scale specific. Operationalizing and developing BE measures using alternative methods at varying spatial scales before connecting to health outcomes allows researchers to better understand how methodological decisions may affect associations between health outcomes and BE measures. To ensure that associations between the BE and health outcomes are not artifacts of methodological decisions, researchers would be well-advised to conduct sensitivity analysis using different construction methods. This approach may lead to more robust results regarding the BE and health outcomes.
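The preservation check the abstract describes can be illustrated with Spearman's rank correlation between block-level indices built by two of the construction methods (normalizing a raw domain score by parcel count versus by area). The block data below are synthetic and the index construction is deliberately simplified.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
n_blocks = 500
block_damage = rng.gamma(2.0, 1.0, n_blocks)          # hypothetical raw housing-damage score per block
parcels = rng.integers(20, 200, n_blocks)
area_km2 = rng.uniform(0.05, 0.5, n_blocks)

# two construction methods: normalise the domain score by parcel count vs. by parcel area
index_per_parcel = block_damage / parcels
index_per_area = block_damage / area_km2

rho, p = spearmanr(index_per_parcel, index_per_area)
print(f"rank correlation between construction methods: {rho:.2f} (p = {p:.3g})")
```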
Beillas, Philippe; Berthet, Fabien
2017-05-29
Human body models have the potential to better describe the human anatomy and variability than dummies. However, data sets available to verify the human response to impact are typically limited in numbers, and they are not size or gender specific. The objective of this study was to investigate the use of model morphing methodologies within that context. In this study, a simple human model scaling methodology was developed to morph two detailed human models (Global Human Body Model Consortium models 50th male, M50, and 5th female, F05) to the dimensions of post mortem human surrogates (PMHS) used in published literature. The methodology was then successfully applied to 52 PMHS tested in 14 impact conditions loading the abdomen. The corresponding 104 simulations were compared to the responses of the PMHS and to the responses of the baseline models without scaling (28 simulations). The responses were analysed using the CORA method and peak values. The results suggest that model scaling leads to an improvement of the predicted force and deflection but has more marginal effects on the predicted abdominal compressions. M50 and F05 models scaled to the same PMHS were also found to have similar external responses, but large differences were found between the two sets of models for the strain energy densities in the liver and the spleen for mid-abdomen impact simulations. These differences, which were attributed to the anatomical differences in the abdomen of the baseline models, highlight the importance of the selection of the impact condition for simulation studies, especially if the organ location is not known in the test. While the methodology could be further improved, it shows the feasibility of using model scaling methodologies to compare human models of different sizes and to evaluate scaling approaches within the context of human model validation.
Intelligent systems engineering methodology
NASA Technical Reports Server (NTRS)
Fouse, Scott
1990-01-01
An added challenge for the designers of large scale systems such as Space Station Freedom is the appropriate incorporation of intelligent system technology (artificial intelligence, expert systems, knowledge-based systems, etc.) into their requirements and design. This presentation will describe a view of systems engineering which successfully addresses several aspects of this complex problem: design of large scale systems, design with requirements that are so complex they only completely unfold during the development of a baseline system and even then continue to evolve throughout the system's life cycle, design that involves the incorporation of new technologies, and design and development that takes place with many players in a distributed manner yet can be easily integrated to meet a single view of the requirements. The first generation of this methodology was developed and evolved jointly by ISX and the Lockheed Aeronautical Systems Company over the past five years on the Defense Advanced Research Projects Agency/Air Force Pilot's Associate Program, one of the largest, most complex, and most successful intelligent systems constructed to date. As the methodology has evolved it has also been applied successfully to a number of other projects. Some of the lessons learned from this experience may be applicable to Freedom.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kojima, S.; Yokosawa, M.; Matsuyama, M.
To study the practical application of a tritium separation process based on Self-Developing Gas Chromatography (SDGC) with a Pd-Pt alloy, intermediate scale-up experiments (22 mm ID x 2 m length column) and the development of a computational simulation method have been conducted. In addition, intermediate-scale production of Pd-Pt powder has been developed for the scale-up experiments. The following results were obtained: (1) a 50-fold scale-up from 3 mm to 22 mm has no significant impact on the SDGC process; (2) the Pd-Pt alloy powder is applicable to a large-size SDGC process; and (3) the simulation enables preparation of a conceptual design of an SDGC process for tritium separation.
Dell’Acqua, F.; Gamba, P.; Jaiswal, K.
2012-01-01
This paper discusses spatial aspects of the global exposure dataset and mapping needs for earthquake risk assessment. We discuss this in the context of development of a Global Exposure Database for the Global Earthquake Model (GED4GEM), which requires compilation of a multi-scale inventory of assets at risk, for example, buildings, populations, and economic exposure. After defining the relevant spatial and geographic scales of interest, different procedures are proposed to disaggregate coarse-resolution data, to map them, and if necessary to infer missing data by using proxies. We discuss the advantages and limitations of these methodologies and detail the potentials of utilizing remote-sensing data. The latter is used especially to homogenize an existing coarser dataset and, where possible, replace it with detailed information extracted from remote sensing using the built-up indicators for different environments. Present research shows that the spatial aspects of earthquake risk computation are tightly connected with the availability of datasets of the resolution necessary for producing sufficiently detailed exposure. The global exposure database designed by the GED4GEM project is able to manage datasets and queries of multiple spatial scales.
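One common way to disaggregate coarse exposure data with a remote-sensing proxy, as discussed above, is to split an administrative-unit total over fine grid cells in proportion to their built-up fraction. The proportional-allocation rule and the numbers below are illustrative, not the GED4GEM procedure.

```python
import numpy as np

def disaggregate(total, builtup_fraction):
    """Split a coarse-unit total over fine cells proportionally to a built-up proxy."""
    w = np.asarray(builtup_fraction, dtype=float)
    if w.sum() == 0:                     # no proxy signal: fall back to a uniform split
        w = np.ones_like(w)
    return total * w / w.sum()

admin_population = 120_000                               # coarse census figure (hypothetical)
builtup = np.array([0.05, 0.40, 0.30, 0.0, 0.25])        # built-up fraction per grid cell
cells = disaggregate(admin_population, builtup)
print(cells, cells.sum())
```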
Supersampling and Network Reconstruction of Urban Mobility.
Sagarra, Oleguer; Szell, Michael; Santi, Paolo; Díaz-Guilera, Albert; Ratti, Carlo
2015-01-01
Understanding human mobility is of vital importance for urban planning, epidemiology, and many other fields that draw policies from the activities of humans in space. Despite the recent availability of large-scale data sets of GPS traces or mobile phone records capturing human mobility, typically only a subsample of the population of interest is represented, giving a possibly incomplete picture of the entire system under study. Methods to reliably extract mobility information from such reduced data and to assess their sampling biases are lacking. To that end, we analyzed a data set of millions of taxi movements in New York City. We first show that, once they are appropriately transformed, mobility patterns are highly stable over long time scales. Based on this observation, we develop a supersampling methodology to reliably extrapolate mobility records from a reduced sample based on an entropy maximization procedure, and we propose a number of network-based metrics to assess the accuracy of the predicted vehicle flows. Our approach provides a well founded way to exploit temporal patterns to save effort in recording mobility data, and opens the possibility to scale up data from limited records when information on the full system is required.
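The supersampling step above relies on entropy maximization. A standard instance of that idea, shown here only as a hedged illustration rather than the authors' exact algorithm, is estimating an origin-destination matrix that matches known zone totals, whose maximum-entropy solution can be obtained by iterative proportional fitting. All trip counts below are made up.

```python
import numpy as np

def ipf(seed, row_totals, col_totals, iters=100, tol=1e-9):
    """Iterative proportional fitting: max-entropy OD matrix matching the given marginals."""
    m = np.asarray(seed, dtype=float).copy()
    for _ in range(iters):
        m *= (row_totals / m.sum(axis=1))[:, None]
        m *= (col_totals / m.sum(axis=0))[None, :]
        if np.allclose(m.sum(axis=1), row_totals, atol=tol):
            break
    return m

sample_od = np.array([[12.,  3.,  5.],     # trips observed in a small subsample of vehicles
                      [ 4., 20.,  6.],
                      [ 7.,  2., 15.]])
row_totals = np.array([250., 300., 260.])  # full-system departures per zone (hypothetical)
col_totals = np.array([270., 280., 260.])  # full-system arrivals per zone (hypothetical)
full_od = ipf(sample_od, row_totals, col_totals)
print(full_od.round(1))
```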
ERIC Educational Resources Information Center
van Fleet, Justin W.
2012-01-01
Scaling up good corporate social investment practices in developing countries is crucial to realizing the "Education for All" and "Millennium Development Goals". Yet very few corporate social investments have the right mix of vision, financing, cross-sector engagement and leadership to come to scale. Globally, 67 million…
1993/03 Baccalaureate and Beyond Longitudinal Study (B&B:93/03). Methodology Report. NCES 2006-166
ERIC Educational Resources Information Center
Wine, Jennifer S.; Cominole, Melissa B.; Wheeless, Sara; Dudley, Kristin; Franklin, Jeff
2005-01-01
This report describes the procedures and results of the full-scale implementation of the B&B:93/03 study. Students who earned a bachelor's degree in 1992-93 were first interviewed in 1993 and then subsequently in 1994 and 1997. This is the final follow-up interview of the B&B:93 cohort, 10 years following completion of the bachelor's…
Introduction to a special issue on concept mapping.
Trochim, William M; McLinden, Daniel
2017-02-01
Concept mapping was developed in the 1980s as a unique integration of qualitative (group process, brainstorming, unstructured sorting, interpretation) and quantitative (multidimensional scaling, hierarchical cluster analysis) methods designed to enable a group of people to articulate and depict graphically a coherent conceptual framework or model of any topic or issue of interest. This introduction provides the basic definition and description of the methodology for the newcomer and describes the steps typically followed in its most standard canonical form (preparation, generation, structuring, representation, interpretation and utilization). It also introduces this special issue which reviews the history of the methodology, describes its use in a variety of contexts, shows the latest ways it can be integrated with other methodologies, considers methodological advances and developments, and sketches a vision of the future of the method's evolution. Copyright © 2016 Elsevier Ltd. All rights reserved.
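The quantitative core named in this abstract, multidimensional scaling of a co-sorting matrix followed by hierarchical cluster analysis, can be sketched as follows. The sorting data are synthetic and tiny; real studies typically involve dozens of statements and sorters.

```python
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster

# sorts[p][g] = set of statement ids that participant p placed in pile g (synthetic example)
sorts = [
    [{0, 1, 2}, {3, 4}, {5, 6, 7}],
    [{0, 2}, {1, 3, 4}, {5, 6, 7}],
    [{0, 1}, {2, 3, 4, 5}, {6, 7}],
]
n = 8
co_sort = np.zeros((n, n))
for piles in sorts:
    for pile in piles:
        for i in pile:
            for j in pile:
                co_sort[i, j] += 1
dissimilarity = len(sorts) - co_sort       # statements sorted together less often sit farther apart

coords = MDS(n_components=2, dissimilarity="precomputed", random_state=0).fit_transform(dissimilarity)
clusters = fcluster(linkage(coords, method="ward"), t=3, criterion="maxclust")
print(coords.round(2))
print(clusters)
```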
Evaluation of Lithofacies Up-Scaling Methods for Probabilistic Prediction of Carbon Dioxide Behavior
NASA Astrophysics Data System (ADS)
Park, J. Y.; Lee, S.; Lee, Y. I.; Kihm, J. H.; Kim, J. M.
2017-12-01
The behavior of carbon dioxide injected into target reservoir (storage) formations is highly dependent on heterogeneities of geologic lithofacies and properties. These heterogeneous lithofacies and properties are fundamentally probabilistic in character. Thus, their probabilistic evaluation has to be implemented properly when predicting the behavior of injected carbon dioxide in heterogeneous storage formations. In this study, a series of three-dimensional geologic modeling is performed first using SKUA-GOCAD (ASGA and Paradigm) to establish lithofacies models of the Janggi Conglomerate in the Janggi Basin, Korea, within a modeling domain. The Janggi Conglomerate is composed of mudstone, sandstone, and conglomerate, and it has been identified as a potential reservoir rock (clastic saline formation) for geologic carbon dioxide storage. Its lithofacies information is obtained from four boreholes and used in the lithofacies modeling. Three different up-scaling methods (i.e., nearest to cell center, largest proportion, and random) are applied, and lithofacies modeling is performed 100 times for each up-scaling method. The lithofacies models are then compared and analyzed against the borehole data to evaluate the relative suitability of the three up-scaling methods. Finally, the lithofacies models are converted into coarser lithofacies models within the same modeling domain with larger grid blocks using the three up-scaling methods, and a series of multiphase thermo-hydrological numerical simulation is performed using TOUGH2-MP (Zhang et al., 2008) to probabilistically predict the behavior of injected carbon dioxide. The coarser lithofacies models are also compared and analyzed against the borehole data and the finer lithofacies models to evaluate the relative suitability of the three up-scaling methods. Three-dimensional geologic modeling, up-scaling, and multiphase thermo-hydrological numerical simulation, presented in this study as linked methodologies, can be utilized as a practical probabilistic evaluation tool to predict the behavior of injected carbon dioxide and even to analyze its leakage risk. This work was supported by the Korea CCS 2020 Project of the Korea Carbon Capture and Sequestration R&D Center (KCRC) funded by the National Research Foundation (NRF), Ministry of Science and ICT (MSIT), Korea.
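The three up-scaling rules named in the abstract can be illustrated on a single column of fine cells mapped onto coarse blocks, as in the hedged sketch below. The facies sequence, facies proportions, and coarsening ratio are invented for illustration only.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(7)
fine = rng.choice(["mudstone", "sandstone", "conglomerate"], size=30, p=[0.5, 0.3, 0.2])
ratio = 5                                   # fine cells per coarse grid block

def upscale(fine_cells, ratio, method):
    blocks = []
    for start in range(0, len(fine_cells), ratio):
        group = list(fine_cells[start:start + ratio])
        if method == "largest_proportion":          # most frequent facies in the block
            blocks.append(Counter(group).most_common(1)[0][0])
        elif method == "nearest_to_center":          # facies of the cell nearest the block center
            blocks.append(group[len(group) // 2])
        elif method == "random":                     # facies of a randomly drawn cell
            blocks.append(rng.choice(group))
    return blocks

for method in ("largest_proportion", "nearest_to_center", "random"):
    print(method, upscale(fine, ratio, method))
```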
U.S. Solar Photovoltaic System Cost Benchmark: Q1 2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu, Ran; Feldman, David; Margolis, Robert
This report benchmarks U.S. solar photovoltaic (PV) system installed costs as of the first quarter of 2017 (Q1 2017). We use a bottom-up methodology, accounting for all system and project-development costs incurred during the installation, to model the costs for residential, commercial, and utility-scale systems. In general, we attempt to model the typical installation techniques and business operations from an installed-cost perspective. Costs are represented from the perspective of the developer/installer; thus, all hardware costs represent the price at which components are purchased by the developer/installer, not accounting for preexisting supply agreements or other contracts. Importantly, the benchmark also represents the sales price paid to the installer; therefore, it includes profit in the cost of the hardware, along with the profit the installer/developer receives, as a separate cost category. However, it does not include any additional net profit, such as a developer fee or price gross-up, which is common in the marketplace. We adopt this approach owing to the wide variation in developer profits in all three sectors, where project pricing is highly dependent on region and project specifics such as local retail electricity rate structures, local rebate and incentive structures, competitive environment, and overall project or deal structures. Finally, our benchmarks are national averages weighted by state installed capacities.
Martinez, Lauren C; Gatto, Nicole M; Spruijt-Metz, Donna; Davis, Jaimie N
2015-05-01
Objective: The LA Sprouts 12-week nutrition, cooking and gardening intervention targets obesity reduction in Latino children. While other gardening and nutrition programs are shown to improve dietary intake, LA Sprouts is unique in that it utilized a curriculum demonstrated to decrease obesity. This methodology paper outlines the design and processes of the LA Sprouts study, and discusses key strategies employed to foster successful implementation of the program. Setting: After-school program in four Los Angeles elementary schools. Subjects: 3rd-5th grade students. Design: Randomized controlled trial. Gardens were built on two of four school campuses, and the 90-minute weekly lessons focused on strategies to increase fruit and vegetable consumption, gardening at school and home, and cooking healthy meals/snacks. Data collection was conducted pre- and post-intervention and included basic clinical and anthropometric measures, dietary intake and psychosocial constructs measured by questionnaire, and an optional fasting blood draw. Results: Baseline data were collected from 364 children, and 320 (88%) completed follow-up. No participants withdrew from the program (data were missing for other reasons). Intervention students attended 9.7 ± 2.3 lessons. Fasting blood samples were collected on 169 children at baseline, and 113 (67%) at follow-up. Questionnaire scales had good internal consistency (IC) and intra-rater reliability (IRR; in child scales: 88% items with IC > 0.7 and 70% items with IRR > 0.50; in parent scales: 75% items with IC > 0.7). Conclusions: The intervention was successfully implemented in the schools and the scales appear appropriate to evaluate psychosocial constructs relevant to a gardening intervention. Copyright © 2015 Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Shafii, Mahyar; Basu, Nandita; Schiff, Sherry; Van Cappellen, Philippe
2017-04-01
The dramatic increase in nitrogen circulating in the biosphere due to anthropogenic activities has impaired water quality in groundwater and surface water, causing eutrophication in coastal regions. Understanding the fate and transport of nitrogen from the landscape to coastal areas requires exploring the drivers of nitrogen processes in both time and space, as well as identifying the appropriate flow pathways. Conceptual models can be used as diagnostic tools to provide insights into such controls. However, diagnostic evaluation of coupled hydrological-biogeochemical models is challenging. This research proposes a top-down methodology utilizing hydrochemical signatures to develop conceptual models for simulating the integrated streamflow and nitrate responses while taking into account the dominant controls on nitrate variability (e.g., climate, soil water content). Our main objective is to seek the appropriate model complexity that sufficiently reproduces multiple hydrological and nitrate signatures. Having developed a suitable conceptual model for a given watershed, we employ it in sensitivity studies to demonstrate the dominant process controls that contribute to the nitrate response at the scales of interest. We apply the proposed approach to nitrate simulation in a range of small to large sub-watersheds in the Grand River Watershed (GRW) in Ontario. Such a multi-basin modeling experiment enables us to address process scaling and to investigate the consequences of lumping processes for the models' predictive capability. The proposed methodology can be applied to the development of large-scale models that support decision-making on nutrient management at the regional scale.
EVALUATING THE WATER QUALITY EFFECTIVENESS OF WATERSHED-SCALE SOURCE WATER PROTECTION PROGRAMS
The US EPA Office of Research and Development, the Ohio River Valley Water Sanitation Commission (ORSANCO) and the Upper Big Walnut Creek Quality Partnership created a collaborative team of eleven agencies and universities to develop a methodology for evaluating the effectiveness...
Intercellular Genomics of Subsurface Microbial Colonies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ortoleva, Peter; Tuncay, Kagan; Gannon, Dennis
2007-02-14
This report summarizes progress in the second year of this project. The objective is to develop methods and software to predict the spatial configuration, properties and temporal evolution of microbial colonies in the subsurface. To accomplish this, we integrate models of intracellular processes, cell-host medium exchange and reaction-transport dynamics on the colony scale. At the conclusion of the project, we aim to have the foundations of a predictive mathematical model and software that captures the three scales of these systems – the intracellular, pore, and colony-wide spatial scales. In the second year of the project, we refined our transcriptional regulatory network discovery (TRND) approach that utilizes gene expression data along with phylogenic similarity and gene ontology analyses and applied it successfully to E. coli, human B cells, and Geobacter sulfurreducens. We have developed a new Web interface, GeoGen, which is tailored to the reconstruction of microbial TRNs and solely focuses on Geobacter as one of DOE's high priority microbes. Our developments are designed such that the frameworks for the TRND and GeoGen can readily be used for other microbes of interest to the DOE. In the context of modeling a single bacterium, we are actively pursuing both steady-state and kinetic approaches. The steady-state approach is based on a flux balance that uses maximizing biomass growth rate as its objective, subject to various biochemical constraints, to obtain the optimal values of reaction rates and uptake/release of metabolites. For the kinetic approach, we use Karyote, a rigorous cell model developed by us for an earlier DOE grant and the DARPA BioSPICE Project. We are also investigating the interplay between bacterial colonies and the environment at both pore and macroscopic scales. The pore-scale models use detailed representations of realistic porous media accounting for the distribution of grain size, whereas the macroscopic models employ Darcy-type flow equations and up-scaled advective-diffusive transport equations for chemical species. We are rigorously testing the relationship between these two scales by evaluating macroscopic parameters using the volume-averaging methodology applied to pore-scale model results.
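The steady-state approach described above, maximizing biomass growth rate subject to flux-balance constraints, is in essence flux balance analysis, which reduces to a linear program. The sketch below uses a toy two-metabolite, four-reaction network invented for illustration, not the project's actual stoichiometry.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (rows: metabolites A, B; columns: reactions R1..R4)
# R1: uptake -> A, R2: A -> B, R3: B -> biomass, R4: A secreted
S = np.array([[1, -1,  0, -1],
              [0,  1, -1,  0]])
bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]   # uptake capped at 10 flux units
c = np.array([0, 0, -1, 0])                            # linprog minimises, so maximise R3 via -1

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("growth flux:", res.x[2], "| full flux vector:", res.x.round(2))
```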
A Comfortability Level Scale for Performance of Cardiopulmonary Resuscitation.
ERIC Educational Resources Information Center
Otten, Robert Drew
1984-01-01
This article discusses the development of an instrument to appraise the comfortability level of college students in performing cardiopulmonary resuscitation. Methodology and findings of data collection are given. (Author/DF)
Inclusion of Community in Self Scale: A Single-Item Pictorial Measure of Community Connectedness
ERIC Educational Resources Information Center
Mashek, Debra; Cannaday, Lisa W.; Tangney, June P.
2007-01-01
We developed a single-item pictorial measure of community connectedness, building on the theoretical and methodological traditions of the self-expansion model (Aron & Aron, 1986). The Inclusion of Community in the Self (ICS) Scale demonstrated excellent test-retest reliability, convergent validity, and discriminant validity in a sample of 190…
ERIC Educational Resources Information Center
Edwards, Lisa M.; Pedrotti, Jennifer Teramoto
2008-01-01
This study describes a comprehensive content and methodological review of articles about multiracial issues in 6 journals related to counseling up to the year 2006. The authors summarize findings about the 18 articles that emerged from this review of the "Journal of Counseling Psychology," "Journal of Counseling & Development," "The Counseling…
Educating for a Change. An ANC Skillshop in Popular Education. Workshop Manual.
ERIC Educational Resources Information Center
Doris Marshall Inst. for Education and Action, Toronto (Ontario).
This manual provides materials for a 6-day workshop to develop skills in democratic learning and teaching practices. Goals of the workshop are as follows: (1) train facilitators to use the methodology; (2) introduce people in the African National Congress (ANC) to the potential of popular education methodology; (3) determine follow-up action to…
Development of an Optimization Methodology for the Aluminum Alloy Wheel Casting Process
NASA Astrophysics Data System (ADS)
Duan, Jianglan; Reilly, Carl; Maijer, Daan M.; Cockcroft, Steve L.; Phillion, Andre B.
2015-08-01
An optimization methodology has been developed for the aluminum alloy wheel casting process. The methodology is focused on improving the timing of cooling processes in a die to achieve improved casting quality. This methodology utilizes (1) a casting process model, which was developed within the commercial finite element package ABAQUS™ (a trademark of Dassault Systèmes); (2) a Python-based results extraction procedure; and (3) a numerical optimization module from the open-source Python library SciPy. To achieve optimal casting quality, a set of constraints has been defined to ensure directional solidification, and an objective function, based on the solidification cooling rates, has been defined to either maximize, or target a specific, cooling rate. The methodology has been applied to a series of casting and die geometries with different cooling system configurations, including a 2-D axisymmetric wheel and die assembly generated from a full-scale prototype wheel. The results show that, with properly defined constraint and objective functions, solidification conditions can be improved and optimal cooling conditions can be achieved, leading to process productivity and product quality improvements.
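The loop structure this abstract outlines, an optimizer proposing cooling-channel timings, a process model returning solidification times and cooling rates, and constraints enforcing directional solidification, can be sketched with scipy.optimize. The process_model function below is a made-up algebraic stand-in, not the ABAQUS casting model, and all numbers are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

def process_model(switch_times):
    """Stand-in for the casting simulation: returns solidification times at three
    locations (rim, spoke, hub) and a mean cooling rate as functions of two
    cooling-channel switch-on times. Purely illustrative algebra."""
    t1, t2 = switch_times
    solid_times = np.array([20.0 + 0.5 * t1, 35.0 + 0.3 * t1 + 0.4 * t2, 55.0 + 0.8 * t2])
    cooling_rate = 12.0 / (1.0 + 0.05 * t1 + 0.03 * t2)
    return solid_times, cooling_rate

def objective(x):                 # maximise cooling rate -> minimise its negative
    return -process_model(x)[1]

def directional(x):               # rim must freeze before spoke, spoke before hub
    s = process_model(x)[0]
    return np.array([s[1] - s[0], s[2] - s[1]])   # both quantities must stay >= 0

res = minimize(objective, x0=[30.0, 60.0],
               bounds=[(0, 120), (0, 120)],
               constraints=[{"type": "ineq", "fun": directional}],
               method="SLSQP")
print("optimal switch times:", res.x.round(2), "| cooling rate:", round(-res.fun, 2))
```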
Agile methodology selection criteria: IT start-up case study
NASA Astrophysics Data System (ADS)
Micic, Lj
2017-05-01
Project management in modern IT companies is often based on agile methodologies, which have several advantages over traditional methodologies such as waterfall. Because clients sometimes change a project during development, it is crucial for an IT company to choose carefully which methodology it is going to implement and whether it will be based mostly on one methodology or on a combination of several. Among the many methodologies in common use, Scrum, Kanban and XP (extreme programming) are the most widespread. Sometimes companies rely mostly on the tools and procedures of one, but quite often they use a combination of several. Since these methodologies are only frameworks, they allow companies to adapt them to their specific projects and constraints. Agile methodologies are still in limited use in Bosnia, but more and more IT companies are adopting them, both because they are standard practice for clients abroad and because they are often the only way to deliver a quality product on time. However, it remains challenging to decide which methodology, or combination of methodologies, a company should implement and how to connect it to its own projects, organizational framework and HR management. This paper presents a case study of a local IT start-up and delivers a solution based on the theoretical framework and the practical limitations of the case company.
NASA Technical Reports Server (NTRS)
Vangenderen, J. L. (Principal Investigator); Lock, B. F.
1976-01-01
The author has identified the following significant results. It was found that color composite transparencies and monocular magnification provided the best base for land use interpretation. New methods for determining optimum sample sizes and analyzing interpretation accuracy levels were developed. All stages of the methodology were assessed, in the operational sense, during the production of a 1:250,000 rural land use map of Murcia Province, Southeast Spain.
Determination of real-time predictors of the wind turbine wake meandering
NASA Astrophysics Data System (ADS)
Muller, Yann-Aël; Aubrun, Sandrine; Masson, Christian
2015-03-01
The present work proposes an experimental methodology to characterize the unsteady properties of a wind turbine wake, called meandering, and particularly its ability to follow the large-scale motions induced by large turbulent eddies contained in the approach flow. The measurements were made in an atmospheric boundary layer wind tunnel. The wind turbine model is based on the actuator disc concept. One part of the work has been dedicated to the development of a methodology for horizontal wake tracking by means of a transverse hot wire rake, whose dynamic response is adequate for spectral analysis. Spectral coherence analysis shows that the horizontal position of the wake correlates well with the upstream transverse velocity, especially for wavelengths larger than three times the diameter of the disc, but less so for smaller scales. Therefore, it is concluded that the wake is actually a rather passive tracer of the large surrounding turbulent structures. The influence of the rotor size and downstream distance on the wake meandering is studied. The fluctuations of the lateral force and the yawing torque affecting the wind turbine model are also measured and correlated with the wake meandering. Two approach flow configurations are then tested: an undisturbed incoming flow (modelled atmospheric boundary layer) and a disturbed incoming flow, with a wind turbine model located upstream. Results showed that the meandering process is amplified by the presence of the upstream wake. It is shown that the coherence between the lateral force fluctuations and the horizontal wake position is significant up to length scales larger than twice the wind turbine model diameter. This leads to the conclusion that the lateral force is a better candidate than the upstream transverse velocity to predict in real time the meandering process, for either undisturbed (wake free) or disturbed incoming atmospheric flows.
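The spectral coherence analysis mentioned above quantifies, frequency by frequency, how well the wake position tracks the upstream transverse velocity. The sketch below computes coherence with scipy.signal on synthetic signals built so that only the low-frequency (large-scale) content is shared; the sampling rate and frequencies are hypothetical.

```python
import numpy as np
from scipy.signal import coherence

fs = 1000.0                                   # sampling frequency, Hz (hypothetical)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(3)

# synthetic large-scale transverse gusts plus uncorrelated small-scale noise
large_scale = np.sin(2 * np.pi * 0.5 * t) + 0.5 * np.sin(2 * np.pi * 1.5 * t)
upstream_v = large_scale + 0.3 * rng.standard_normal(t.size)
wake_position = 0.8 * large_scale + 0.6 * rng.standard_normal(t.size)   # wake follows only the large scales

f, coh = coherence(upstream_v, wake_position, fs=fs, nperseg=4096)
for freq in (0.5, 1.5, 50.0):
    i = np.argmin(np.abs(f - freq))
    print(f"coherence at {freq:5.1f} Hz: {coh[i]:.2f}")
```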
Adaptive surrogate model based multiobjective optimization for coastal aquifer management
NASA Astrophysics Data System (ADS)
Song, Jian; Yang, Yun; Wu, Jianfeng; Wu, Jichun; Sun, Xiaomin; Lin, Jin
2018-06-01
In this study, a novel surrogate model assisted multiobjective memetic algorithm (SMOMA) is developed for optimal pumping strategies in large-scale coastal groundwater problems. The proposed SMOMA integrates an efficient data-driven surrogate model with an improved non-dominated sorting genetic algorithm-II (NSGAII) that employs a local search operator to accelerate its convergence in optimization. The surrogate model, based on a Kernel Extreme Learning Machine (KELM), is developed and evaluated as an approximate simulator that reproduces the patterns of regional groundwater flow and salinity levels in coastal aquifers at a greatly reduced computational cost. The KELM model is adaptively trained during the evolutionary search to satisfy the desired surrogate fidelity, so that it inhibits the accumulation of forecasting error and converges correctly to the true Pareto-optimal front. The proposed methodology is then applied to large-scale coastal aquifer management in Baldwin County, Alabama. The objectives of minimizing the saltwater mass increase and maximizing the total pumping rate in the coastal aquifers are considered. The optimal solutions achieved by the proposed adaptive surrogate model are compared against those obtained from a one-shot surrogate model and from the original simulation model. The adaptive surrogate model not only improves the prediction accuracy of the Pareto-optimal solutions compared with the one-shot surrogate model, but also maintains a quality of Pareto-optimal solutions equivalent to that of NSGAII coupled with the original simulation model, while retaining the advantage of surrogate models in reducing the computational burden, with time savings of up to 94%. This study shows that the proposed methodology is a computationally efficient and promising tool for multiobjective optimization of coastal aquifer management.
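The paper's surrogate is a kernel extreme learning machine; the sketch below shows only the simpler, basic ELM idea (random hidden layer plus a least-squares readout) trained as a cheap stand-in for the groundwater simulator. The "simulator" response and pumping designs are synthetic and purely illustrative.

```python
import numpy as np

class ELMRegressor:
    """Basic extreme learning machine: random hidden layer, least-squares output weights."""
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# toy stand-in for the simulator: saltwater-mass response to two pumping rates
rng = np.random.default_rng(1)
pumping = rng.uniform(0, 1, size=(200, 2))
salt_mass = 3 * pumping[:, 0] ** 2 + pumping[:, 1] + 0.05 * rng.standard_normal(200)

surrogate = ELMRegressor(n_hidden=80).fit(pumping[:150], salt_mass[:150])
err = np.abs(surrogate.predict(pumping[150:]) - salt_mass[150:]).mean()
print(f"mean absolute surrogate error on held-out designs: {err:.3f}")
```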
Joseph, Adrian; Goldrick, Stephen; Mollet, Michael; Turner, Richard; Bender, Jean; Gruber, David; Farid, Suzanne S; Titchener-Hooker, Nigel
2017-05-01
Continuous disk-stack centrifugation is typically used for the removal of cells and cellular debris from mammalian cell culture broths at manufacturing-scale. The use of scale-down methods to characterise disk-stack centrifugation performance enables substantial reductions in material requirements and allows a much wider design space to be tested than is currently possible at pilot-scale. The process of scaling down centrifugation has historically been challenging due to the difficulties in mimicking the Energy Dissipation Rates (EDRs) in typical machines. This paper describes an alternative and easy-to-assemble automated capillary-based methodology to generate levels of EDRs consistent with those found in a continuous disk-stack centrifuge. Variations in EDR were achieved through changes in capillary internal diameter and the flow rate of operation through the capillary. The EDRs found to match the levels of shear in the feed zone of a pilot-scale centrifuge using the experimental method developed in this paper (2.4×10^5 W/kg) are consistent with those obtained through previously published computational fluid dynamic (CFD) studies (2.0×10^5 W/kg). Furthermore, this methodology can be incorporated into existing scale-down methods to model the process performance of continuous disk-stack centrifuges. This was demonstrated through the characterisation of culture hold time, culture temperature and EDRs on centrate quality. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
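A first-order way to see how capillary bore and flow rate set the EDR, as described above, is to estimate the mean dissipation as pumping power lost per unit mass of fluid held in the tube, with the pressure drop taken from textbook laminar (Hagen-Poiseuille) or Blasius correlations. This is a hedged back-of-envelope sketch, not the paper's measured or CFD-derived values; all dimensions and flow rates are illustrative.

```python
import numpy as np

def mean_edr(flow_rate_m3s, diameter_m, length_m, rho=1000.0, mu=1.0e-3):
    """Mean energy dissipation rate (W/kg) in a capillary: dissipated pumping power
    divided by the mass of fluid in the tube. Textbook friction correlations only."""
    area = np.pi * diameter_m ** 2 / 4
    v = flow_rate_m3s / area
    re = rho * v * diameter_m / mu
    if re < 2300:                                   # laminar: Hagen-Poiseuille pressure drop
        dp = 32 * mu * length_m * v / diameter_m ** 2
    else:                                           # turbulent: Blasius friction factor
        f = 0.316 * re ** -0.25
        dp = f * (length_m / diameter_m) * rho * v ** 2 / 2
    volume = area * length_m
    return dp * flow_rate_m3s / (rho * volume)

# illustrative sweep over capillary bore and flow rate (values are not from the study)
for d_mm, q_ml_min in [(0.5, 30), (0.25, 30), (0.25, 60)]:
    edr = mean_edr(q_ml_min * 1e-6 / 60, d_mm * 1e-3, length_m=0.05)
    print(f"D = {d_mm} mm, Q = {q_ml_min} mL/min -> EDR ~ {edr:.2e} W/kg")
```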
Urban gully erosion and the SDGs: a case study from the Koboko rural town of Uganda
NASA Astrophysics Data System (ADS)
Zolezzi, Guido; Bezzi, Marco
2017-04-01
Urban gully erosion in developing regions has been addressed by the scientific community only recently, while it has been given much less attention in past decades. Nonetheless, recent examples show how relevant urban gully erosion in African towns of different sizes can be in terms of several Sustainable Development Goals, like goals 3 (good health and well-being), 6 (clean water and sanitation) and 11 (sustainable cities and communities). The present work illustrates an example of gully erosion in the rapidly growing rural town of Koboko in NW Uganda, close to the borders with the Democratic Republic of the Congo and South Sudan. The research aims are (i) to develop a simple, low-cost methodology to quantify gully properties in data-scarce and resource-limited contexts, (ii) to quantify the main properties of and processes related to the urban gullies in the Koboko case study and (iii) to quantify the potential risk associated with urban gully erosion at the country scale in relation to the rapid growth of urban centers in a sub-Saharan African country. The methodology integrates collection of existing hydrological and land use data, rapid topographic surveys and related data processing, basic hydrological and hydro-morphological modeling, and interviews with local inhabitants and stakeholders. Results indicate that Koboko may not represent an isolated hotspot of extensive urban gully development among rapidly growing small towns in Uganda, and, consequently, in countries with similar sustainable and human development challenges. Koboko, established two decades ago as a temporary war refugee camp, has progressively become a permanent urban settlement. The urban center is located on the top of an elongated hill and many of its recent neighbourhoods are expanding along the hill sides, where the local slope may reach considerable values, up to 10%. In the last ten years several gully systems with local depths of up to 8 to 10 meters have been rapidly evolving, especially following the construction of new roads and in the absence of a structured urban drainage plan. The deeper gullies are presently located in densely populated areas and present a variety of risks for people's livelihoods, including personal safety, the risk of accidents for small vehicles (especially during night time), and the sanitation risk related to untreated domestic wastewater and uncontrolled garbage disposal into the deepest parts of the gullies. The methodology is easily repeatable and has the potential to quantify the fundamental properties of gully systems in contexts with scarce local hydrological, soil and geomorphological data and where the agencies responsible for urban planning and environmental protection are constrained by severe limitations in financial and human resources. For each gully system, it allows quantification of total eroded volumes, the length of the unstable gully reaches, the time scale of development, the drainage area and the peak formative streamflow, and it also provides process-based insight into the causes of gully development. The related knowledge base can be used to develop guidelines for urban growth aimed at minimizing the risk of gully erosion and related societal impacts.
Crack Growth Simulation and Residual Strength Prediction in Airplane Fuselages
NASA Technical Reports Server (NTRS)
Chen, Chuin-Shan; Wawrzynek, Paul A.; Ingraffea, Anthony R.
1999-01-01
This is the final report for the NASA-funded project entitled "Crack Growth Prediction Methodology for Multi-Site Damage." The primary objective of the project was to create a capability to simulate curvilinear fatigue crack growth and ductile tearing in aircraft fuselages subjected to widespread fatigue damage. The second objective was to validate the capability by way of comparisons to experimental results. Both objectives have been achieved and the results are detailed herein. In the first part of the report, the crack tip opening angle (CTOA) fracture criterion, obtained and correlated from coupon tests to predict fracture behavior and residual strength of built-up aircraft fuselages, is discussed. Geometrically nonlinear, elastic-plastic, thin shell finite element analyses are used to simulate stable crack growth and to predict residual strength. Both measured and predicted results of laboratory flat panel tests and full-scale fuselage panel tests show substantial reduction of residual strength due to the occurrence of multi-site damage (MSD). Detailed comparisons of the stable crack growth history and residual strength between the predicted and experimental results are used to assess the validity of the analysis methodology. In the second part of the report, issues related to crack trajectory prediction in thin shells are discussed, along with an evolving methodology that uses the crack turning phenomenon to improve the structural integrity of aircraft structures. A directional criterion is developed based on the maximum tangential stress theory, but taking into account the effect of T-stress and fracture toughness orthotropy. Possible extensions of the current crack growth directional criterion to handle geometrically and materially nonlinear problems are discussed. The path independent contour integral method for T-stress evaluation is derived and its accuracy is assessed using a p- and hp-version adaptive finite element method. Curvilinear crack growth is simulated in coupon tests and in full-scale fuselage panel tests. Both T-stress and fracture toughness orthotropy are found to be essential to predict the observed crack paths. The analysis methodology and software program (FRANC3D/STAGS) developed herein allow engineers to maintain aging aircraft economically while ensuring continuous airworthiness. Consequently, it will improve the technology to support the safe operation of the current aircraft fleet as well as the design of more damage-tolerant aircraft for the next generation fleet.
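For reference, the classical maximum tangential stress criterion that the report's directional criterion builds on (before the T-stress and toughness-orthotropy corrections it adds) predicts the kink angle from the local mode-I and mode-II stress intensity factors through K_I*sin(theta) + K_II*(3*cos(theta) - 1) = 0. The sketch below solves that relation numerically; the mode-mix ratios are illustrative and the T-stress term is deliberately omitted.

```python
import numpy as np
from scipy.optimize import brentq

def kink_angle(k1, k2):
    """Crack kink angle (radians) from the classical maximum tangential stress criterion.
    The angle has the opposite sign to K_II; pure mode I gives zero."""
    if k2 == 0:
        return 0.0
    g = lambda th: k1 * np.sin(th) + k2 * (3 * np.cos(th) - 1)
    # the admissible root lies between 0 and roughly 70.5 degrees in magnitude
    lo, hi = (-1.3, -1e-9) if k2 > 0 else (1e-9, 1.3)
    return brentq(g, lo, hi)

for mode_mix in (0.0, 0.2, 1.0):            # K_II / K_I ratios, illustrative
    theta = kink_angle(1.0, mode_mix)
    print(f"K_II/K_I = {mode_mix:.1f}  ->  kink angle = {np.degrees(theta):6.1f} deg")
```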
GREENSCOPE Technical User’s Guide
GREENSCOPE’s methodology has been developed and its software tool designed such that it can be applied to an entire process, to a piece of equipment or process unit, or at the investigatory bench scale.
Richter, Linda M; Daelmans, Bernadette; Lombardi, Joan; Heymann, Jody; Boo, Florencia Lopez; Behrman, Jere R; Lu, Chunling; Lucas, Jane E; Perez-Escamilla, Rafael; Dua, Tarun; Bhutta, Zulfiqar A; Stenberg, Karin; Gertler, Paul; Darmstadt, Gary L
2018-01-01
Building on long-term benefits of early intervention (Paper 2 of this Series) and increasing commitment to early childhood development (Paper 1 of this Series), scaled up support for the youngest children is essential to improving health, human capital, and wellbeing across the life course. In this third paper, new analyses show that the burden of poor development is higher than estimated, taking into account additional risk factors. National programmes are needed. Greater political prioritisation is core to scale-up, as are policies that afford families time and financial resources to provide nurturing care for young children. Effective and feasible programmes to support early child development are now available. All sectors, particularly education, and social and child protection, must play a role to meet the holistic needs of young children. However, health provides a critical starting point for scaling up, given its reach to pregnant women, families, and young children. Starting at conception, interventions to promote nurturing care can feasibly build on existing health and nutrition services at limited additional cost. Failure to scale up has severe personal and social consequences. Children at elevated risk for compromised development due to stunting and poverty are likely to forgo about a quarter of average adult income per year, and the cost of inaction to gross domestic product can be double what some countries currently spend on health. Services and interventions to support early childhood development are essential to realising the vision of the Sustainable Development Goals. PMID:27717610
An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application
ERIC Educational Resources Information Center
Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth
2016-01-01
Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
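The core AHP computation the abstract refers to derives criteria weights as the principal eigenvector of a pairwise comparison matrix and checks judgment consistency. The three inspection criteria and the Saaty-scale comparison values below are hypothetical, invented only to make the sketch runnable.

```python
import numpy as np

# Pairwise comparisons (Saaty 1-9 scale) among three hypothetical inspection criteria:
# teaching quality, leadership, student outcomes. A[i, j] = importance of i relative to j.
A = np.array([[1.0, 3.0, 0.5],
              [1/3, 1.0, 0.25],
              [2.0, 4.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                      # priority weights sum to 1

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)          # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]           # Saaty's random index
print("criteria weights:", weights.round(3), "| consistency ratio:", round(ci / ri, 3))
```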
Hypersonic Inflatable Aerodynamic Decelerator (HIAD) Technology Development Overview
NASA Technical Reports Server (NTRS)
Hughes, Stephen J.; Cheatwood, F. McNeil; Calomino, Anthony M.; Wright, Henry S.
2013-01-01
The successful flight of the Inflatable Reentry Vehicle Experiment (IRVE)-3 has further demonstrated the potential value of Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology. This technology development effort is funded by NASA's Space Technology Mission Directorate (STMD) Game Changing Development Program (GCDP). This paper provides an overview of a multi-year HIAD technology development effort, detailing the projects completed to date and the additional testing planned for the future. The effort was divided into three areas: Flexible Systems Development (FSD), Mission Advanced Entry Concepts (AEC), and Flight Validation. FSD consists of a Flexible Thermal Protection Systems (FTPS) element, which is investigating high temperature materials, coatings, and additives for use in the bladder, insulator, and heat shield layers; and an Inflatable Structures (IS) element which includes manufacture and testing (laboratory and wind tunnel) of inflatable structures and their associated structural elements. AEC consists of the Mission Applications element developing concepts (including payload interfaces) for missions at multiple destinations for the purpose of demonstrating the benefits and need for the HIAD technology as well as the Next Generation Subsystems element. Ground test development has been pursued in parallel with the Flight Validation IRVE-3 flight test. A larger scale (6m diameter) HIAD inflatable structure was constructed and aerodynamically tested in the National Full-scale Aerodynamics Complex (NFAC) 40ft by 80ft test section along with a duplicate of the IRVE-3 3m article. Both the 6m and 3m articles were tested with instrumented aerodynamic covers which incorporated an array of pressure taps to capture the surface pressure distribution and validate Computational Fluid Dynamics (CFD) model predictions. The 3m article also had a duplicate IRVE-3 Thermal Protection System (TPS) to test in addition to testing with the Aerocover configuration. Both the Aerocovers and the TPS were populated with high contrast targets so that photogrammetric solutions of the loaded surface could be created. These solutions both refined the aerodynamic shape for CFD modeling and provided a deformed shape to validate structural Finite Element Analysis (FEA) models. Extensive aerothermal testing has been performed on the TPS candidates. This testing has been conducted in several facilities across the country. The majority of the testing has been conducted in the Boeing Large Core Arc Tunnel (LCAT). HIAD is continuing to mature testing methodology in this facility and is developing new test sample fixtures and control methodologies to improve understanding and quality of the environments to which the samples are subjected. Additional testing has been and continues to be performed in the NASA LaRC 8ft High Temperature Tunnel, where samples up to 2ft by 2ft are being tested over representative underlying structures incorporating construction features such as sewn seams and through-thickness quilting. With the successful completion of the IRVE-3 flight demonstration, mission planning efforts are ramping up on the development of the HIAD Earth Atmospheric Reentry Test (HEART), which will demonstrate a relevant scale vehicle in relevant environments via a large-scale aeroshell (approximately 8.5m) entering at orbital velocity (approximately 7km/sec) with an entry mass on the order of 4MT.
Also, the Build to Print (BTP) hardware built as a risk mitigation for the IRVE-3 project to have a "spare" ready to go in the event of a launch vehicle delivery failure is now available for an additional sub-orbital flight experiment. Mission planning is underway to define a mission that can utilize this existing hardware and help the HIAD project further mature this technology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Chenlin; Liang, Ling; Sun, Ning; ...
2017-01-05
The study presents the successful scale-up demonstration of acid-assisted IL deconstruction on feedstock blends of municipal solid wastes and agricultural residues (corn stover) by 30-fold, relative to the bench scale (6L vs 0.2L), at 10% solid loading. By integrating IL pretreatment and acid hydrolysis with subsequent centrifugation and extraction, the sugar and lignin products can be further recovered efficiently. This scale-up development at the Advanced Biofuels/Bioproducts Process Demonstration Unit (ABPDU) will leverage the opportunity and synergistic efforts towards developing a cost-effective IL-based deconstruction technology by drastically eliminating enzyme use, reducing water usage, and simplifying the downstream sugar/lignin recovery and IL recycling. Results indicate that MSW blends are a viable and valuable resource to consider when assessing biomass availability and affordability for lignocellulosic biorefineries. This scale-up evaluation demonstrates that the acid-assisted IL deconstruction technology can be effectively scaled up to larger operations, and the current study established the baseline scaling parameters for this process.
Evaluating the uncertainty of predicting future climate time series at the hourly time scale
NASA Astrophysics Data System (ADS)
Caporali, E.; Fatichi, S.; Ivanov, V. Y.
2011-12-01
A stochastic downscaling methodology is developed to generate hourly, point-scale time series for several meteorological variables, such as precipitation, cloud cover, shortwave radiation, air temperature, relative humidity, wind speed, and atmospheric pressure. The methodology uses multi-model General Circulation Model (GCM) realizations and an hourly weather generator, AWE-GEN. Probabilistic descriptions of factors of change (a measure of climate change with respect to historic conditions) are computed for several climate statistics and different aggregation times using a Bayesian approach that weights the individual GCM contributions. The Monte Carlo method is applied to sample the factors of change from their respective distributions, thereby permitting the generation of time series in an ensemble fashion that reflects the uncertainty of future climate projections as well as the uncertainty of the downscaling procedure. Applications of the methodology, and probabilistic expressions of confidence in reproducing future climates, are discussed for the location of Firenze (Italy) for the periods 2000-2009, 2046-2065 and 2081-2100, using 1962-1992 as the baseline. The climate predictions for 2000-2009 are tested against observations, permitting an assessment of the reliability and uncertainties of the methodology in reproducing statistics of meteorological variables at different time scales.
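The factor-of-change step described above can be illustrated as follows: sample a multiplicative change factor (for precipitation) or an additive shift (for temperature) from its distribution, apply it to the observed baseline statistic, and hand the perturbed statistics to the weather generator, one realization at a time. The distribution parameters and baseline values below are placeholders, not results from the study.

```python
import numpy as np

rng = np.random.default_rng(42)
n_realizations = 1000

# Hypothetical factor-of-change distributions for a future period (illustrative only):
precip_factor = rng.normal(loc=0.92, scale=0.08, size=n_realizations)   # multiplicative
temp_delta = rng.normal(loc=2.1, scale=0.6, size=n_realizations)        # additive, deg C

obs_precip_mean = 65.0     # mm/month, observed baseline statistic (hypothetical)
obs_temp_mean = 14.3       # deg C, observed baseline statistic (hypothetical)

future_precip = obs_precip_mean * precip_factor   # perturbed statistics handed to the
future_temp = obs_temp_mean + temp_delta          # weather generator, one per realization

lo, hi = np.percentile(future_precip, [5, 95])
print(f"future monthly precipitation mean: {future_precip.mean():.1f} mm (90% band {lo:.1f}-{hi:.1f})")
print(f"future mean temperature: {future_temp.mean():.1f} degC")
```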
Scaling up in international health: what are the key issues?
Mangham, Lindsay J; Hanson, Kara
2010-03-01
The term 'scaling up' is now widely used in the international health literature, though it lacks an agreed definition. We review what is meant by scaling up in the context of changes in international health and development over the last decade. We argue that the notion of scaling up is primarily used to describe the ambition or process of expanding the coverage of health interventions, though the term has also referred to increasing the financial, human and capital resources required to expand coverage. We discuss four pertinent issues in scaling up the coverage of health interventions: the costs of scaling up coverage; constraints to scaling up; equity and quality concerns; and key service delivery issues when scaling up. We then review recent progress in scaling up the coverage of health interventions. This includes a considerable increase in the volume of aid, accompanied by numerous new health initiatives and financing mechanisms. There have also been improvements in health outcomes and some examples of successful large-scale programmes. Finally, we reflect on the importance of obtaining a better understanding of how to deliver priority health interventions at scale, the current emphasis on health system strengthening and the challenges of sustaining scaling up in the prevailing global economic environment.
A Hybrid Coarse-graining Approach for Lipid Bilayers at Large Length and Time Scales
Ayton, Gary S.; Voth, Gregory A.
2009-01-01
A hybrid analytic-systematic (HAS) coarse-grained (CG) lipid model is developed and employed in a large-scale simulation of a liposome. The methodology is termed hybrid analytic-systematic as one component of the interaction between CG sites is variationally determined from the multiscale coarse-graining (MS-CG) methodology, while the remaining component utilizes an analytic potential. The systematic component models the in-plane center of mass interaction of the lipids as determined from an atomistic-level MD simulation of a bilayer. The analytic component is based on the well known Gay-Berne ellipsoid of revolution liquid crystal model, and is designed to model the highly anisotropic interactions at a highly coarse-grained level. The HAS CG approach is the first step in an "aggressive" CG methodology designed to model multi-component biological membranes at very large length and timescales. PMID:19281167
Biculturalism through Experiential Language Learning.
ERIC Educational Resources Information Center
Brennan, Pamela; Donoghue, Anna Acitelli
This paper describes the English as a Second Language Program developed for educationally disadvantaged Mexican-American adults as part of the educational offerings of Project Step-Up, an OEO-funded demonstration program in San Diego. Project Step-Up features a multifold methodological approach incorporating techniques from (1) life skills…
A stakeholder-driven agenda for advancing the science and practice of scale-up and spread in health.
Norton, Wynne E; McCannon, C Joseph; Schall, Marie W; Mittman, Brian S
2012-12-06
Although significant advances have been made in implementation science, comparatively less attention has been paid to broader scale-up and spread of effective health programs at the regional, national, or international level. To address this gap in research, practice and policy attention, representatives from key stakeholder groups launched an initiative to identify gaps and stimulate additional interest and activity in scale-up and spread of effective health programs. We describe the background and motivation for this initiative and the content, process, and outcomes of two main phases comprising the core of the initiative: a state-of-the-art conference to develop recommendations for advancing scale-up and spread and a follow-up activity to operationalize and prioritize the recommendations. The conference was held in Washington, D.C. during July 2010 and attended by 100 representatives from research, practice, policy, public health, healthcare, and international health communities; the follow-up activity was conducted remotely the following year. Conference attendees identified and prioritized five recommendations (and corresponding sub-recommendations) for advancing scale-up and spread in health: increase awareness, facilitate information exchange, develop new methods, apply new approaches for evaluation, and expand capacity. In the follow-up activity, 'develop new methods' was rated as the most important recommendation and 'expand capacity' as the least important, although differences were relatively minor. Based on the results of these efforts, we discuss priority activities that are needed to advance research, practice and policy to accelerate the scale-up and spread of effective health programs.
Quality by design: scale-up of freeze-drying cycles in pharmaceutical industry.
Pisano, Roberto; Fissore, Davide; Barresi, Antonello A; Rastelli, Massimo
2013-09-01
This paper shows the application of mathematical modeling to scale-up a cycle developed with lab-scale equipment on two different production units. The above method is based on a simplified model of the process parameterized with experimentally determined heat and mass transfer coefficients. In this study, the overall heat transfer coefficient between product and shelf was determined by using the gravimetric procedure, while the dried product resistance to vapor flow was determined through the pressure rise test technique. Once model parameters were determined, the freeze-drying cycle of a parenteral product was developed via dynamic design space for a lab-scale unit. Then, mathematical modeling was used to scale-up the above cycle in the production equipment. In this way, appropriate values were determined for processing conditions, which allow the replication, in the industrial unit, of the product dynamics observed in the small scale freeze-dryer. This study also showed how inter-vial variability, as well as model parameter uncertainty, can be taken into account during scale-up calculations.
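As a rough illustration of the simplified heat- and mass-transfer model that such scale-up calculations rest on, the sketch below solves the quasi-steady balance between heat supplied through the vial (characterized by a gravimetrically determined coefficient Kv) and heat consumed by sublimation through the dried layer (characterized by a resistance Rp from a pressure rise test). All parameter values and the ice vapor-pressure correlation are illustrative assumptions, not the paper's measured data.

```python
import numpy as np
from scipy.optimize import brentq

def p_ice(T):
    """Vapor pressure of ice (Pa), Clausius-Clapeyron approximation."""
    return 611.0 * np.exp(-6141.6 * (1.0 / T - 1.0 / 273.15))

# Illustrative parameters (not the paper's values):
Kv = 15.0        # vial heat transfer coefficient, W m^-2 K^-1 (gravimetric test)
Rp = 4.0e5       # dried-layer resistance, Pa m^2 s kg^-1 (pressure rise test)
dHs = 2.84e6     # heat of sublimation, J kg^-1
T_shelf = 263.0  # shelf temperature, K
Pc = 10.0        # chamber pressure, Pa

# Quasi-steady coupling: heat delivered through the vial bottom equals the
# heat consumed by sublimation through the dried layer.
balance = lambda T: Kv * (T_shelf - T) - dHs * (p_ice(T) - Pc) / Rp
T_product = brentq(balance, 230.0, 262.0)
flux = (p_ice(T_product) - Pc) / Rp          # sublimation flux, kg m^-2 s^-1

print(f"product temperature ~ {T_product - 273.15:.1f} C")
print(f"sublimation flux    ~ {flux * 3600:.3f} kg m^-2 h^-1")
```

Replicating the small-scale product dynamics in the production unit then amounts to choosing shelf temperature and chamber pressure so that the same product temperature and flux are reproduced with the production equipment's own Kv and Rp.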
Adaptive multiresolution modeling of groundwater flow in heterogeneous porous media
NASA Astrophysics Data System (ADS)
Malenica, Luka; Gotovac, Hrvoje; Srzic, Veljko; Andric, Ivo
2016-04-01
The proposed methodology was originally developed by our scientific team in Split, who designed a multiresolution approach for analyzing flow and transport processes in highly heterogeneous porous media. The main properties of the adaptive Fup multi-resolution approach are: 1) the computational capabilities of Fup basis functions with compact support, capable of resolving all spatial and temporal scales; 2) multi-resolution representation of heterogeneity as well as of all other input and output variables; 3) an accurate, adaptive and efficient solution strategy; and 4) semi-analytical properties which increase our understanding of usually complex flow and transport processes in porous media. The main computational idea behind this approach is to separately find the minimum number of basis functions and resolution levels necessary to describe each flow and transport variable with the desired accuracy on a particular adaptive grid. Therefore, each variable is analyzed separately, and the adaptive and multi-scale nature of the methodology enables not only computational efficiency and accuracy, but also a description of subsurface processes closely related to their physical interpretation. The methodology inherently supports a mesh-free procedure, avoiding classical numerical integration, and yields continuous velocity and flux fields, which is vitally important for flow and transport simulations. In this paper, we show recent improvements within the proposed methodology. Since the "state of the art" multiresolution approach usually uses the method of lines and only a spatial adaptive procedure, the temporal approximation has rarely been treated as multiscale. Therefore, a novel adaptive implicit Fup integration scheme is developed, resolving all time scales within each global time step. This means that the algorithm uses smaller time steps only in lines where solution changes are intensive. The use of Fup basis functions enables continuous time approximation, simple interpolation calculations across different temporal lines and local time stepping control. A critical aspect of time integration accuracy is the construction of the spatial stencil needed for accurate calculation of spatial derivatives. Since the common approach applied to wavelets and splines uses a finite difference operator, we develop here a collocation operator that includes solution values and the differential operator. In this way, the new improved algorithm is adaptive in space and time, enabling accurate solutions of groundwater flow problems, especially in highly heterogeneous porous media with large lnK variances and different correlation length scales. In addition, differences between the collocation and finite volume approaches are discussed. Finally, results show the application of the methodology to groundwater flow problems in highly heterogeneous confined and unconfined aquifers.
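The adaptive idea of keeping only the basis functions needed for a prescribed accuracy can be illustrated with a generic hierarchical (dyadic) refinement. The sketch below is not the Fup basis or the collocation operator of the paper, only a one-dimensional analogue that keeps grid points where the hierarchical surplus exceeds a tolerance.

```python
import numpy as np

def hierarchical_refine(f, a, b, eps, max_level=12):
    """Adaptively select grid points where the hierarchical surplus
    (difference between f and linear interpolation of the coarser neighbours)
    exceeds eps.  Generic dyadic-refinement analogue of a multiresolution
    adaptive grid; not the Fup basis used in the paper."""
    points = {a: f(a), b: f(b)}
    intervals = [(a, b)]
    for _ in range(max_level):
        new_intervals = []
        for (lo, hi) in intervals:
            mid = 0.5 * (lo + hi)
            surplus = f(mid) - 0.5 * (points[lo] + points[hi])
            if abs(surplus) > eps:          # keep the new point, refine further
                points[mid] = f(mid)
                new_intervals += [(lo, mid), (mid, hi)]
        if not new_intervals:
            break
        intervals = new_intervals
    return dict(sorted(points.items()))

# Sharp front typical of heterogeneous-media solutions: refinement clusters there.
head = lambda x: np.tanh(40.0 * (x - 0.3))
grid = hierarchical_refine(head, 0.0, 1.0, eps=1e-3)
print(f"{len(grid)} adaptive points instead of {2**12 + 1} uniform points")
```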
School Processes That Can Drive Scaling-Up of an Innovation or Contribute to Its Abandonment
ERIC Educational Resources Information Center
Newman, Denis; Zacamy, Jenna; Lazarev, Valeriy; Lin, Li
2017-01-01
This five-year study focused on school processes that promoted the scaling-up of a high school academic literacy framework, Reading Apprenticeship, developed by WestEd's Strategic Literacy Initiative (SLI). Implementing an innovative strategy for scaling-up involving school-based cross-disciplinary teacher teams, SLI brought the framework to 274…
Estimate of the Potential Costs and Effectiveness of Scaling Up CRESST Assessment Software.
ERIC Educational Resources Information Center
Chung, Gregory K. W. K.; Herl, Howard E.; Klein, Davina C. D.; O'Neil, Harold F., Jr.; Schacter, John
This report examines issues in the scale-up of assessment software from the Center for Research on Evaluation, Standards, and Student Testing (CRESST). "Scale-up" is used in a metaphorical sense, meaning adding new assessment tools to CRESST's assessment software. During the past several years, CRESST has been developing and evaluating a…
Methodological Issues in Clinical Drug Development for Essential Tremor
Carranza, Michael A.; Snyder, Madeline R.; Elble, Rodger J.; Boutzoukas, Angelique E.; Zesiewicz, Theresa A.
2012-01-01
Essential tremor (ET) is one of the most common tremor disorders in the world. Despite this, only two medications have received Level A recommendations from the American Academy of Neurology to treat it (primidone and propranolol). Even though these medications provide relief to a large group of ET patients, up to 50% of patients are non-responders. Additional medications to treat ET are needed. This review discusses some of the methodological issues that should be addressed for quality clinical drug development in ET. PMID:23440401
Multiscale Analysis of Delamination of Carbon Fiber-Epoxy Laminates with Carbon Nanotubes
NASA Technical Reports Server (NTRS)
Riddick, Jaret C.; Frankland, SJV; Gates, TS
2006-01-01
A multi-scale analysis is presented to parametrically describe the Mode I delamination of a carbon fiber/epoxy laminate. In the midplane of the laminate, carbon nanotubes are included for the purposes of selectively enhancing the fracture toughness of the laminate. To analyze the carbon fiber/epoxy/carbon nanotube laminate, the multi-scale methodology presented here links a series of parameterizations taken at various length scales ranging from the atomistic through the micromechanical to the structural level. At the atomistic scale, molecular dynamics simulations are performed in conjunction with an equivalent continuum approach to develop constitutive properties for representative volume elements of the molecular structure of components of the laminate. The molecular-level constitutive results are then used in the Mori-Tanaka micromechanics to develop bulk properties for the epoxy-carbon nanotube matrix system. In order to demonstrate a possible application of this multi-scale methodology, a double cantilever beam specimen is modeled. An existing analysis is employed which uses discrete springs to model the fiber bridging effect during delamination propagation. In the absence of empirical data or a damage mechanics model describing the effect of CNTs on fracture toughness, several traction laws are postulated, linking CNT volume fraction to fiber bridging in a DCB specimen. Results from this demonstration are presented in terms of DCB specimen load-displacement responses.
Magneto-optical characterization of colloidal dispersions. Application to nickel nanoparticles.
Pascu, Oana; Caicedo, José Manuel; Fontcuberta, Josep; Herranz, Gervasi; Roig, Anna
2010-08-03
We report here on a fast magneto-optical characterization method for colloidal liquid dispersions of magnetic nanoparticles. We have applied our methodology to Ni nanoparticles with sizes equal to or below 15 nm synthesized by a ligand-stabilized solution-phase synthesis. We have measured the magnetic circular dichroism (MCD) of colloidal dispersions and found that we can probe the intrinsic magnetic properties within a wide concentration range, from 10^(-5) up to 10^(-2) M, with sensitivity to concentrations below 1 microg/mL of magnetic Ni particles. We found that the measured MCD signal scales with the concentration, thus providing a means of determining the concentration of highly diluted dispersions. The methodology presented here exhibits large flexibility and versatility and might be suitable either for studying fundamental problems related to the properties of nanosized particles, including surface-related effects that are highly relevant for magnetic colloids in biomedical applications, or for in situ testing and integration in production lines.
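Because the MCD amplitude is reported to scale linearly with concentration, a calibration line fitted to standards can be inverted to estimate the concentration of a highly diluted dispersion. The sketch below uses invented calibration values solely to illustrate this step; it is not the paper's data.

```python
import numpy as np

# Hypothetical calibration: MCD amplitude (a.u.) measured for Ni colloid
# standards of known concentration (M).  Values are illustrative; they only
# encode the reported linear scaling of MCD signal with concentration.
conc_std = np.array([1e-5, 5e-5, 1e-4, 5e-4, 1e-3, 5e-3, 1e-2])
mcd_std  = np.array([0.021, 0.098, 0.205, 1.02, 1.98, 10.1, 19.8])

slope, intercept = np.polyfit(conc_std, mcd_std, deg=1)

def concentration_from_mcd(signal):
    """Invert the linear calibration to estimate an unknown concentration."""
    return (signal - intercept) / slope

unknown_signal = 0.40
print(f"estimated concentration: {concentration_from_mcd(unknown_signal):.2e} M")
```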
Foebel, Andrea D; van Hout, Hein P; van der Roest, Henriëtte G; Topinkova, Eva; Garms-Homolova, Vjenka; Frijters, Dinnus; Finne-Soveri, Harriet; Jónsson, Pálmi V; Hirdes, John P; Bernabei, Roberto; Onder, Graziano
2015-11-14
Evaluating the quality of care provided to older individuals is a key step to ensure that needs are being met and to target interventions to improve care. To this aim, interRAI's second-generation home care quality indicators (HCQIs) were developed in 2013. This study assesses the quality of home care services in six European countries using these HCQIs as well as the two derived summary scales. Data for this study were derived from the Aged in Home Care (AdHOC) study - a cohort study that examined different models of community care in European countries. The current study selected a sub-sample of the AdHOC cohort from six countries whose follow-up data were complete (Czech Republic, Denmark, Finland, Germany, Italy and the Netherlands). Data were collected from the interRAI Home Care instrument (RAI-HC) between 2000 and 2002. The 23 HCQIs of interest were determined according to previously established methodology, including risk adjustment. Two summary measures, the Clinical Balance Scale and Independence Quality Scale were also determined using established methodology. A total of 1,354 individuals from the AdHOC study were included in these analyses. Of the 23 HCQIs that were measured, the highest proportion of individuals experienced declines in Instrumental Activities of Daily Living (IADLs) (48.4 %). Of the clinical quality indicators, mood decline was the most prevalent (30.0 %), while no flu vaccination and being alone and distressed were the most prevalent procedural and social quality indicators, respectively (33.4 and 12.8 %). Scores on the two summary scales varied by country, but were concentrated around the median mark. The interRAI HCQIs can be used to determine the quality of home care services in Europe and identify areas for improvement. Our results suggest functional declines may prove the most beneficial targets for interventions.
Aptitude Requirements Based on Task Difficulty: Methodology for Evaluation.
1982-01-01
developing a bank of scientific data concerning the various kinds of work performed in the Air Force. As a result, most Air Force Specialties (AFSs) can... benchmark scale. In order to clarify a common understanding of the method or of the benchmark scale. 3.3 Materials: the materials provided...
D.J. Hayes; W.B. Cohen
2006-01-01
This article describes the development of a methodology for scaling observations of changes in tropical forest cover to large areas at high temporal frequency from coarse-resolution satellite imagery. The approach for estimating proportional forest cover change as a continuous variable is based on a regression model that relates multispectral, multitemporal Moderate...
ERIC Educational Resources Information Center
Dogan, Shannon J.; Sitnick, Stephanie L.; Onati, Lenna L.
2012-01-01
Extension professionals often work with diverse clientele; however, most assessment tools have been developed and validated with English-speaking samples. There is little research and practical guidance on the cultural adaptation and translation of rating scales. The purpose of this article is to summarize the methodological work in this area as…
ERIC Educational Resources Information Center
Camparo, James; Camparo, Lorinda B.
2013-01-01
Though ubiquitous, Likert scaling's traditional mode of analysis is often unable to uncover all of the valid information in a data set. Here, the authors discuss a solution to this problem based on methodology developed by quantum physicists: the state multipole method. The authors demonstrate the relative ease and value of this method by…
Investigating transport pathways in the ocean
NASA Astrophysics Data System (ADS)
Griffa, Annalisa; Haza, Angelique; Özgökmen, Tamay M.; Molcard, Anne; Taillandier, Vincent; Schroeder, Katrin; Chang, Yeon; Poulain, P.-M.
2013-01-01
The ocean is a very complex medium with scales of motion that range from thousands of kilometers to the dissipation scales. Transport by ocean currents plays an important role in many practical applications ranging from climatic problems to coastal management and accident mitigation at sea. Understanding transport is challenging because of the chaotic nature of particle motion. In the last decade, new methods have been put forth to improve our understanding of transport. Powerful tools are provided by dynamical system theory, which allows the identification of the barriers to transport and their time variability for a given flow. A shortcoming of this approach, though, is that it is based on the assumption that the velocity field is known with good accuracy, which is not always the case in practical applications. Improving model performance in terms of transport can be addressed using another important methodology that has been recently developed, namely the assimilation of Lagrangian data provided by floating buoys. The two methodologies are technically different but in many ways complementary. In this paper, we review examples of applications of both methodologies performed by the authors in the last few years, considering flows at different scales and in various ocean basins. The results are among the very first examples of applications of the methodologies to the real ocean, including testing with Lagrangian in situ data. The results are discussed in the general framework of the extended fields related to these methodologies, pointing out open questions and potential for improvements, with an outlook toward future strategies.
A systematic review of the effects of upper body warm-up on performance and injury.
McCrary, J Matt; Ackermann, Bronwen J; Halaki, Mark
2015-07-01
This systematic review was conducted to identify the impact of upper body warm-up on performance and injury prevention outcomes. Web of Science, MEDLINE, SPORTDiscus, PsycINFO and Cochrane databases were searched using terms related to upper extremity warm-up. Inclusion criteria were English language randomised controlled trials from peer-reviewed journals in which investigation of upper body warm-up on performance and injury prevention outcomes was a primary aim. Included studies were assessed for methodological quality using the PEDro scale. A wide variety of warm-up modes and outcomes precluded meta-analysis except for one group of studies. The majority of warm-ups were assessed as having 'positive', 'neutral', 'negative' or 'specific' effects on outcomes. Thirty-one studies met the inclusion criteria with 21 rated as having 'good' methodological quality. The studies investigated a total of 25 warm-up modes and 43 outcome factors that could be grouped into eight mode and performance outcome categories. No studies of upper body warm-up effects on injury prevention were discovered. Strong research-based evidence was found for the following: high-load dynamic warm-ups enhance power and strength performance; warm-up swings with a standard weight baseball bat are most effective for enhancing bat speed; short-duration static stretching warm-up has no effect on power outcomes; and passive heating/cooling is a largely ineffective warm-up mode. A clear knowledge gap in upper body warm-up literature is the lack of investigation of injury prevention outcomes. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Pillai, Goonaseelan Colin; Mentré, France; Steimer, Jean-Louis
2005-04-01
Few scientific contributions have made significant impact unless there was a champion who had the vision to see the potential for its use in seemingly disparate areas-and who then drove active implementation. In this paper, we present a historical summary of the development of non-linear mixed effects (NLME) modeling up to the more recent extensions of this statistical methodology. The paper places strong emphasis on the pivotal role played by Lewis B. Sheiner (1940-2004), who used this statistical methodology to elucidate solutions to real problems identified in clinical practice and in medical research and on how he drove implementation of the proposed solutions. A succinct overview of the evolution of the NLME modeling methodology is presented as well as ideas on how its expansion helped to provide guidance for a more scientific view of (model-based) drug development that reduces empiricism in favor of critical quantitative thinking and decision making.
Development of a New Measurement Tool for Individualism and Collectivism
ERIC Educational Resources Information Center
Shulruf, Boaz; Hattie, John; Dixon, Robyn
2007-01-01
A new measurement tool for individualism and collectivism has been developed to address critical methodological issues in this field of social psychology. This new measure, the Auckland Individualism and Collectivism Scale (AICS), defines three dimensions of individualism: (a) responsibility (acknowledging one's responsibility for one's actions),…
Training Comprehensiveness: Construct Development and Relation with Role Behaviour
ERIC Educational Resources Information Center
Srivastava, Anugamini Priya; Dhar, Rajib Lochan
2015-01-01
Purpose: This study aims to develop the scale for perception of training comprehensiveness and attempts to examine the influence of perception of training comprehensiveness on role behaviour: teachers' efficacy as a mediator and job autonomy as a moderator. Design/methodology/approach: Through the steps for a generation, refinement, purification…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goswami, D.Y.
1997-05-01
Scientific research on photocatalytic oxidation of hazardous chemicals has been conducted extensively over the last three decades. Use of solar radiation in photocatalytic detoxification and disinfection has only been explored in the last decade. Developments of engineering scale systems, design methodologies, and commercial and industrial applications have occurred even more recently. A number of reactor concepts and designs including concentrating and nonconcentrating types and methods of catalyst deployment have been developed. Some commercial and industrial field tests of solar detoxification systems have been conducted. This paper reviews the engineering developments of the solar photocatalytic detoxification and disinfection processes, including system design methodologies.
Recent lab-on-chip developments for novel drug discovery.
Khalid, Nauman; Kobayashi, Isao; Nakajima, Mitsutoshi
2017-07-01
Microelectromechanical systems (MEMS) and micro total analysis systems (μTAS) revolutionized the biochemical and electronic industries, and this miniaturization process became a key driver for many markets. Now, it is a driving force for innovations in life sciences, diagnostics, analytical sciences, and chemistry, which are called 'lab-on-a-chip, (LOC)' devices. The use of these devices allows the development of fast, portable, and easy-to-use systems with a high level of functional integration for applications such as point-of-care diagnostics, forensics, the analysis of biomolecules, environmental or food analysis, and drug development. In this review, we report on the latest developments in fabrication methods and production methodologies to tailor LOC devices. A brief overview of scale-up strategies is also presented together with their potential applications in drug delivery and discovery. The impact of LOC devices on drug development and discovery has been extensively reviewed in the past. The current research focuses on fast and accurate detection of genomics, cell mutations and analysis, drug delivery, and discovery. The current research also differentiates the LOC devices into new terminology of microengineering, like organ-on-a-chip, stem cells-on-a-chip, human-on-a-chip, and body-on-a-chip. Key challenges will be the transfer of fabricated LOC devices from lab-scale to industrial large-scale production. Moreover, extensive toxicological studies are needed to justify the use of microfabricated drug delivery vehicles in biological systems. It will also be challenging to transfer the in vitro findings to suitable and promising in vivo models. WIREs Syst Biol Med 2017, 9:e1381. doi: 10.1002/wsbm.1381 For further resources related to this article, please visit the WIREs website. © 2017 Wiley Periodicals, Inc.
Household Energy Consumption Segmentation Using Hourly Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kwac, J; Flora, J; Rajagopal, R
2014-01-01
The increasing US deployment of residential advanced metering infrastructure (AMI) has made hourly energy consumption data widely available. Using CA smart meter data, we investigate a household electricity segmentation methodology that uses an encoding system with a pre-processed load shape dictionary. Structured approaches using features derived from the encoded data drive five sample program- and policy-relevant energy lifestyle segmentation strategies. We also ensure that the methodologies developed scale to large data sets.
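A hedged sketch of one way such an encoding-plus-segmentation pipeline can be assembled: normalize daily profiles, cluster them into a load-shape dictionary, encode each household by its dictionary usage, then segment on those features. The data are synthetic and the clustering choices are assumptions, not the paper's exact encoding system.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical hourly load data: (households x days x 24 hours).  Real AMI
# data would be read from the utility's metering database.
loads = rng.gamma(shape=2.0, scale=0.5, size=(200, 90, 24))

# 1) Normalize each daily profile to a unit-sum "load shape" (total use removed).
daily = loads.reshape(-1, 24)
shapes = daily / daily.sum(axis=1, keepdims=True)

# 2) Build a load-shape dictionary by clustering all daily shapes.
dictionary = KMeans(n_clusters=16, n_init=10, random_state=0).fit(shapes)

# 3) Encode each household as the distribution of dictionary codes it uses.
codes = dictionary.predict(shapes).reshape(200, 90)
encoding = np.stack([np.bincount(c, minlength=16) / 90 for c in codes])

# 4) Segment households on the encoded features (lifestyle segmentation).
segments = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(encoding)
print(np.bincount(segments))
```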
Leavesley, G.; Hay, L.
1998-01-01
Coupled atmospheric and hydrological models provide an opportunity for the improved management of water resources in headwater basins. Issues currently limiting full implementation of coupled-model methodologies include (a) the degree of uncertainty in the accuracy of precipitation and other meteorological variables simulated by atmospheric models, and (b) the problem of discordant scales between atmospheric and hydrological models. Alternative methodologies being developed to address these issues are reviewed.
Pathways To Scaling-Up in Community Based Rehabilitation Agencies.
ERIC Educational Resources Information Center
Boyce, W.; Johnston, C.; Thomas, M.; Enns, H.; Naidu, D. M.; Tjandrakusuma, H.
1997-01-01
Scaling-up (the expansion or development of organizational activities of nongovernmental agencies to achieve greater impact) in community-based rehabilitation is described by using case study materials from industrialized and less-developed countries (India, Canada, and Indonesia) and focusing on differences in structural characteristics of…
Smith, Jennifer A; Anderson, Sarah-Jane; Harris, Kate L; McGillen, Jessica B; Lee, Edward; Garnett, Geoff P; Hallett, Timothy B
2016-07-01
Many ways of preventing HIV infection have been proposed and more are being developed. We sought to construct a strategic approach to HIV prevention that would use limited resources to achieve the greatest possible prevention impact through the use of interventions available today and in the coming years. We developed a deterministic compartmental model of heterosexual HIV transmission in South Africa and formed assumptions about the costs and effects of a range of interventions, encompassing the further scale-up of existing interventions (promoting condom use, male circumcision, early antiretroviral therapy [ART] initiation for all [including increased HIV testing and counselling activities], and oral pre-exposure prophylaxis [PrEP]), the introduction of new interventions in the medium term (offering intravaginal rings, long-acting injectable antiretroviral drugs) and long term (vaccine, broadly neutralising antibodies [bNAbs]). We examined how available resources could be allocated across these interventions to achieve maximum impact, and assessed how this would be affected by the failure of the interventions to be developed or scaled up. If all interventions are available, the optimum mix would place great emphasis on the following: scale-up of male circumcision and early ART initiation with outreach testing, as these are available immediately and assumed to be low cost and highly efficacious; intravaginal rings targeted to sex workers; and vaccines, as these can achieve a large effect if scaled up even if imperfectly efficacious. The optimum mix would rely less on longer term developments, such as long-acting antiretroviral drugs and bNAbs, unless the costs of these were reduced. However, if it proved impossible to scale up existing interventions to the extent assumed, emphasis on oral PrEP, intravaginal rings, and long-acting antiretroviral drugs would increase. The long-term effect on the epidemic is most affected by scale-up of existing interventions and the successful development of a vaccine. With current information, a strategic approach in which limited resources are used to maximise prevention impact would focus on strengthening the scale-up of existing interventions, while pursuing a workable vaccine and developing other approaches that can be used if further scale-up of existing interventions is limited. Bill & Melinda Gates Foundation. Copyright © 2016 Smith et al. Open Access article distributed under the terms of CC BY. Published by Elsevier Ltd. All rights reserved.
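As a toy illustration of allocating a fixed prevention budget across interventions, the sketch below greedily funds options in order of cost per infection averted, up to assumed coverage caps. This is a static simplification with invented numbers; the study itself couples allocation to a dynamic compartmental transmission model.

```python
# Illustrative static allocation of a prevention budget across interventions,
# ranked by incremental cost per infection averted.  All numbers below are
# invented placeholders, not the study's estimates.
interventions = [
    # (name, cost per person reached, infections averted per person reached,
    #  maximum people reachable)
    ("male circumcision",      90.0, 0.030, 2_000_000),
    ("early ART + testing",   350.0, 0.080, 3_000_000),
    ("condom promotion",       15.0, 0.004, 5_000_000),
    ("oral PrEP",             200.0, 0.020, 1_000_000),
    ("vaginal ring (FSW)",    120.0, 0.025,   200_000),
]

def allocate(budget):
    remaining = budget
    plan, averted = [], 0.0
    # Greedy: fund the cheapest infections averted first.
    for name, cost, effect, cap in sorted(interventions, key=lambda x: x[1] / x[2]):
        people = min(cap, remaining / cost)
        remaining -= people * cost
        averted += people * effect
        plan.append((name, round(people)))
        if remaining <= 0:
            break
    return plan, averted

plan, averted = allocate(budget=500_000_000)
print(plan)
print(f"infections averted (illustrative): {averted:,.0f}")
```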
NASA Astrophysics Data System (ADS)
Vijaykumar, Adithya; Ouldridge, Thomas E.; ten Wolde, Pieter Rein; Bolhuis, Peter G.
2017-03-01
The modeling of complex reaction-diffusion processes in, for instance, cellular biochemical networks or self-assembling soft matter can be tremendously sped up by employing a multiscale algorithm which combines the mesoscopic Green's Function Reaction Dynamics (GFRD) method with explicit stochastic Brownian, Langevin, or deterministic molecular dynamics to treat reactants at the microscopic scale [A. Vijaykumar, P. G. Bolhuis, and P. R. ten Wolde, J. Chem. Phys. 143, 214102 (2015)]. Here we extend this multiscale MD-GFRD approach to include the orientational dynamics that is crucial to describe the anisotropic interactions often prevalent in biomolecular systems. We present the novel algorithm focusing on Brownian dynamics only, although the methodology is generic. We illustrate the novel algorithm using a simple patchy particle model. After validation of the algorithm, we discuss its performance. The rotational Brownian dynamics MD-GFRD multiscale method will open up the possibility for large scale simulations of protein signalling networks.
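The microscopic propagator that such a scheme hands off to can be sketched as an overdamped translational plus rotational Brownian update of a patchy particle. The rotation-vector scheme and parameter values below are generic assumptions, not the paper's implementation, and the GFRD layer that decides when these microscopic steps are needed is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)

def rotate(v, axis, angle):
    """Rodrigues rotation of vector v about a unit axis."""
    axis = axis / np.linalg.norm(axis)
    return (v * np.cos(angle)
            + np.cross(axis, v) * np.sin(angle)
            + axis * np.dot(axis, v) * (1.0 - np.cos(angle)))

def brownian_step(pos, patch, D_t, D_r, dt):
    """One overdamped translational + rotational Brownian update of a patchy
    particle.  Only the innermost propagator is shown; the multiscale scheme
    in the paper switches between this and GFRD domains."""
    pos = pos + np.sqrt(2.0 * D_t * dt) * rng.standard_normal(3)
    dphi = np.sqrt(2.0 * D_r * dt) * rng.standard_normal(3)   # rotation vector
    angle = np.linalg.norm(dphi)
    if angle > 0.0:
        patch = rotate(patch, dphi / angle, angle)
    return pos, patch / np.linalg.norm(patch)

pos, patch = np.zeros(3), np.array([0.0, 0.0, 1.0])
for _ in range(1000):
    pos, patch = brownian_step(pos, patch, D_t=1.0, D_r=3.0, dt=1e-4)
print(pos, patch)
```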
Modeling Near-Crack-Tip Plasticity from Nano- to Micro-Scales
NASA Technical Reports Server (NTRS)
Glaessgen, Edward H.; Saether, Erik; Hochhalter, Jake D.; Yamakov, Vesselin I.
2010-01-01
Several efforts that are aimed at understanding the plastic deformation mechanisms related to crack propagation at the nano-, meso- and micro-length scales, including atomistic simulation, discrete dislocation plasticity, strain gradient plasticity and crystal plasticity, are discussed. The paper focuses on discussion of newly developed methodologies and their application to understanding damage processes in aluminum and its alloys. Examination of plastic mechanisms as a function of increasing length scale illustrates the increasingly complex phenomena governing plasticity.
Scale in Education Research: Towards a Multi-Scale Methodology
ERIC Educational Resources Information Center
Noyes, Andrew
2013-01-01
This article explores some theoretical and methodological problems concerned with scale in education research through a critique of a recent mixed-method project. The project was framed by scale metaphors drawn from the physical and earth sciences and I consider how recent thinking around scale, for example, in ecosystems and human geography might…
Esteves, F; Gaspar, J; de Sousa, B; Antunes, F; Mansinho, K; Matos, O
2012-06-01
Specific single-nucleotide polymorphisms (SNPs) are recognized as important DNA sequence variations influencing the pathogenesis of Pneumocystis jirovecii and the clinical outcome of Pneumocystis pneumonia, which is a major worldwide cause of illness among immunocompromised patients. Genotyping platforms for pooled DNA samples are promising methodologies for genetic characterization of infectious organisms. We have developed a new typing strategy for P. jirovecii, which consisted of DNA pools prepared according to clinical data (HIV diagnosis, microscopic and molecular detection of P. jirovecii, parasite burden, clinical diagnosis and follow-up of infection) from individual samples using quantitative real-time PCR followed by multiplex-PCR/single base extension (MPCR/SBE). The frequencies of multiple P. jirovecii SNPs (DHFR312, mt85, SOD215 and SOD110) encoded at three distinct loci, the dihydrofolate reductase (DHFR), the mitochondrial large-subunit rRNA (mtLSU rRNA) and the superoxide dismutase (SOD) loci, were estimated in seven DNA pooled samples, representing a total of 100 individual samples. The studied SNPs were confirmed to be associated with distinct clinical parameters of infection such as parasite burden and follow-up. The MPCR/SBE-DNA pooling methodology, described in the present study, was demonstrated to be a useful high-throughput procedure for large-scale P. jirovecii SNPs screening and a powerful tool for evaluation of clinically relevant SNPs potentially related to parasite burden, clinical diagnosis and follow-up of P. jirovecii infection. In further studies, the candidate SNPs mt85, SOD215 and SOD110 may be used as molecular markers in association with MPCR/SBE-DNA pooling to generate useful information for understanding the patterns and causes of Pneumocystis pneumonia. © 2012 The Authors. Clinical Microbiology and Infection © 2012 European Society of Clinical Microbiology and Infectious Diseases.
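A minimal sketch of how an allele frequency might be estimated in a DNA pool from the two single-base-extension signal channels, with a channel-bias factor calibrated on known 50:50 mixtures. The function name, bias factor and signal values are hypothetical, not the study's measurements.

```python
# Illustrative estimate of a SNP allele frequency in a pooled DNA sample from
# two single-base-extension signal intensities.  k is a channel-bias
# correction factor obtained from samples of known 50:50 composition.
def pooled_allele_frequency(signal_variant, signal_wildtype, k=1.0):
    return signal_variant / (signal_variant + k * signal_wildtype)

pools = {  # pool name -> (variant signal, wild-type signal), invented values
    "high_burden": (5400.0, 2100.0),
    "low_burden":  (1300.0, 6200.0),
}
for name, (sv, sw) in pools.items():
    print(name, f"f(variant) ~ {pooled_allele_frequency(sv, sw, k=0.92):.2f}")
```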
Gilbert-López, Bienvenida; García-Reyes, Juan F; Lozano, Ana; Fernández-Alba, Amadeo R; Molina-Díaz, Antonio
2010-09-24
In this work we have evaluated the performance of two sample preparation methodologies for the large-scale multiresidue analysis of pesticides in olives using liquid chromatography-electrospray tandem mass spectrometry (LC-MS/MS). The tested sample treatment methodologies were: (1) liquid-liquid partitioning with acetonitrile followed by dispersive solid-phase extraction clean-up using GCB, PSA and C18 sorbents (QuEChERS method - modified for fatty vegetables) and (2) matrix solid-phase dispersion (MSPD) using aminopropyl as sorbent material and a final clean-up performed in the elution step using Florisil. An LC-MS/MS method covering 104 multiclass pesticides was developed to examine the performance of these two protocols. The separation of the compounds from the olive extracts was achieved using a short C18 column (50 mm x 4.6 mm i.d.) with 1.8 microm particle size. The identification and confirmation of the compounds was based on retention time matching along with the presence (and ratio) of two typical MRM transitions. Limits of detection obtained were lower than 10 microg kg(-1) for 89% of analytes using both sample treatment protocols. Recovery studies performed on olive samples spiked at two concentration levels (10 and 100 microg kg(-1)) yielded average recoveries in the range 70-120% for most analytes when the QuEChERS procedure was employed. When MSPD was the choice for sample extraction, recoveries obtained were in the range 50-70% for most of the target compounds. The proposed methods were successfully applied to the analysis of real olive samples, revealing the presence of some of the target species in the microg kg(-1) range. Besides the evaluation of the sample preparation approaches, we also discuss the use of advanced software features associated with MRM method development that overcome several limitations and drawbacks associated with MS/MS methods (time segment boundaries, tedious method development/manual scheduling and acquisition limitations). This software feature, recently offered by different vendors, is based on an algorithm that associates retention time data with each individual MS/MS transition, so that the number of simultaneously traced transitions throughout the entire chromatographic run (dwell times and sensitivity) is maximized. Copyright 2010 Elsevier B.V. All rights reserved.
Multi-scaling allometric analysis for urban and regional development
NASA Astrophysics Data System (ADS)
Chen, Yanguang
2017-01-01
The concept of allometric growth is based on scaling relations, and it has been applied to urban and regional analysis for a long time. However, most allometric analyses have been devoted to the single proportional relation between two elements of a geographical system. Few studies have focused on the allometric scaling of multiple elements. In this paper, a process of multiscaling allometric analysis is developed for studies on the spatio-temporal evolution of complex systems. By means of linear algebra, general system theory, and by analogy with the analytical hierarchy process, the concepts of allometric growth can be integrated with ideas from fractal dimension. Thus a new methodology of geo-spatial analysis and the related theoretical models emerge. Based on least squares regression and matrix operations, a simple algorithm is proposed to solve the multiscaling allometric equation. Applying the analytical method of multielement allometry to Chinese cities and regions yields satisfactory results. A conclusion is reached that the multiscaling allometric analysis can be employed to make a comprehensive evaluation of the relative levels of urban and regional development and to explain spatial heterogeneity. The notion of multiscaling allometry may enrich the current theory and methodology of spatial analyses of urban and regional evolution.
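The elementary building block of the multiscaling analysis, a log-log least-squares fit of each element against a reference measure to estimate its scaling exponent, can be sketched as follows on synthetic city data; the full matrix formulation linking all elements together is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic example: one reference measure (e.g. urban population) and several
# other elements (e.g. built-up area, GDP) for a set of cities.  Exponents and
# prefactors are invented; the code only illustrates the log-log regression step.
population = rng.lognormal(mean=12, sigma=1.0, size=60)
elements = {
    "area": 0.02 * population**0.85 * rng.lognormal(0, 0.1, 60),
    "gdp":  3.00 * population**1.15 * rng.lognormal(0, 0.1, 60),
}

for name, y in elements.items():
    # Allometric relation y = a * x^b becomes linear in log-log space.
    b, log_a = np.polyfit(np.log(population), np.log(y), deg=1)
    print(f"{name}: scaling exponent b = {b:.3f}, prefactor a = {np.exp(log_a):.3g}")
```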
Li, Fenfang; Li, Qiao; Wu, Shuanggen; Tan, Zhijian
2017-02-15
Salting-out extraction (SOE) based on a low-molecular-weight organic solvent and an inorganic salt was considered a good substitute for the conventional polymer aqueous two-phase extraction (ATPE) used for the extraction of some bioactive compounds from natural plant resources. In this study, ethanol/ammonium sulfate was screened as the optimal SOE system for the extraction and preliminary purification of allicin from garlic. Response surface methodology (RSM) was employed to optimize the major conditions. The maximum extraction efficiency of 94.17% was obtained at the optimized conditions for routine use: 23% (w/w) ethanol concentration and 24% (w/w) salt concentration, 31 g/L loaded sample at 25°C with the pH not adjusted. The extraction efficiency showed no obvious decrease after the extraction was scaled up. This ethanol/ammonium sulfate SOE is much simpler, cheaper, and effective, and has the potential for scaled-up production for the extraction and purification of other compounds from plant resources. Copyright © 2016 Elsevier Ltd. All rights reserved.
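A hedged sketch of the RSM step: fit a second-order response surface to extraction efficiency as a function of ethanol and salt concentration, then locate its maximum on a grid. The design points and efficiencies below are invented to mimic the reported optimum near 23% ethanol and 24% salt; they are not the experimental data.

```python
import numpy as np

# Illustrative central-composite-style data: extraction efficiency (%) at
# several ethanol and salt concentrations (w/w %).  Invented values only.
etoh = np.array([18, 18, 28, 28, 23, 23, 23, 16, 30, 23], dtype=float)
salt = np.array([20, 28, 20, 28, 24, 18, 30, 24, 24, 24], dtype=float)
eff  = np.array([86, 88, 87, 85, 94, 84, 89, 83, 86, 94], dtype=float)

# Second-order response surface: eff ~ b0 + b1*E + b2*S + b3*E^2 + b4*S^2 + b5*E*S
X = np.column_stack([np.ones_like(etoh), etoh, salt, etoh**2, salt**2, etoh * salt])
beta, *_ = np.linalg.lstsq(X, eff, rcond=None)

# Locate the optimum on a fine grid over the experimental region.
E, S = np.meshgrid(np.linspace(16, 30, 141), np.linspace(18, 30, 121))
pred = beta[0] + beta[1]*E + beta[2]*S + beta[3]*E**2 + beta[4]*S**2 + beta[5]*E*S
i = np.unravel_index(np.argmax(pred), pred.shape)
print(f"predicted optimum: {E[i]:.1f}% ethanol, {S[i]:.1f}% salt, eff ~ {pred[i]:.1f}%")
```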
Barriga, H M G; Tyler, A I I; McCarthy, N L C; Parsons, E S; Ces, O; Law, R V; Seddon, J M; Brooks, N J
2015-01-21
Bicontinuous cubic structures offer enormous potential in applications ranging from protein crystallisation to drug delivery systems and have been observed in cellular membrane structures. One of the current bottlenecks in understanding and exploiting these structures is that cubic scaffolds produced in vitro are considerably smaller in size than those observed in biological systems, differing by almost an order of magnitude in some cases. We have addressed this technological bottleneck and developed a methodology capable of manufacturing highly swollen bicontinuous cubic membranes with length scales approaching those seen in vivo. Crucially, these cubic systems do not require the presence of proteins. We have generated highly swollen Im3m symmetry bicontinuous cubic phases with lattice parameters of up to 480 Å, composed of ternary mixtures of monoolein, cholesterol and negatively charged lipid (DOPS or DOPG) and we have been able to tune their lattice parameters. The swollen cubic phases are highly sensitive to both temperature and pressure; these structural changes are likely to be controlled by a fine balance between lipid headgroup repulsions and lateral pressure in the hydrocarbon chain region.
Surgical model pig ex vivo for venous dissection teaching in medical schools.
Tube, Milton Ignacio Carvalho; Spencer-Netto, Fernando Antonio Campelo; Oliveira, Anderson Igor Pereira de; Holanda, Arthur Cesário de; Barros, Bruno Leão Dos Santos; Rezende, Caio Cezar Gomes; Cavalcanti, João Pedro Guerra; Batista, Marília Apolinário; Campos, Josemberg Marins
2017-02-01
To investigate a method for developing surgical skills in medical students by simulating venous dissection on an ex vivo surgical pig model. Prospective, analytical, experimental, controlled study with four stages: selection, theoretical teaching, training and assessment. A sample of 312 students was divided into two groups: Group A - 2nd semester students; Group B - 8th semester students. Each group was divided into five groups of 12 students, trained two hours per week during the semester. Four models were set up for three students in each skill station, assisted by a monitor. A teaching protocol for emergency procedures training in venous dissection was applied, along with a goal-discursive test and the OSATS scale. The pre-test confirmed that the methodology had not been previously applied to the students. The averages obtained in the theoretical evaluation reached satisfactory levels in both groups. The results of applying the OSATS scale showed better performance in group A compared to group B; however, both groups had satisfactory means. The method was sufficient to raise both groups to a satisfactory level of skill in venous dissection performed on ex vivo surgical swine models.
Federsel, Hans-Jürgen
2009-05-19
In process research and development (PR&D), the generation and manipulation of small-molecule drugs ranges from bench-scale (laboratory) chemistry to pilot plant manufacture to commercial production. A broad range of disciplines, including process chemistry (organic synthesis), analytical chemistry, process engineering (mass and heat transfer, unit operations), process safety (chemical risk assessment), regulatory compliance, and plant operation, must be effectively applied. In the critical handover between medicinal chemistry and PR&D, compound production is typically scaled up from a few hundred grams to several kilograms. Can the methodologies applied to the former also satisfy the technical, safety, and scalability aspects that come into play in the latter? Occasionally, the transition might occur smoothly, but more often the situation is the opposite: much work and resources must be invested to design a process that is feasible for manufacturing on pilot scale and, eventually, for commercial production. Authentic examples provide enlightening illustrations of dos and don'ts for developing syntheses designed for round-flask operation into production-scale processes. Factors that are easily underestimated or even neglected in the laboratory, such as method robustness, chemical hazards, safety concerns, environmental impact, availability of starting materials and building blocks in bulk quantities, intellectual property (IP) issues, and the final cost of the product, will come into play and need to be addressed appropriately. The decision on which route will be the best for further development is a crucial event and should come into focus early on the R&D timeline. In addition to scientific and technical concerns, the parameter of speed has come to the forefront in the pharmaceutical arena. Although historically the drug industry has tolerated a total time investment of far more than 10 years from idea to market, the current worldwide paradigm requires a reduction to under 10 years for the specific segment covering preclinical development through launch. This change puts enormous pressure on the entire organization, and the implication for PR&D is that the time allowed for conducting route design and scale-up has shrunk accordingly. Furthermore, molecular complexity has become extremely challenging in many instances, and demand steadily grows for process understanding and knowledge generation about low-level byproduct, which often must be controlled even at trace concentrations to meet regulatory specifications (especially in the case of potentially genotoxic impurities). In this Account, we paint a broad picture of the technical challenges the PR&D community is grappling with today, focusing on what measures have been taken over the years to create more efficiency and effectiveness.
Richter, Linda M; Daelmans, Bernadette; Lombardi, Joan; Heymann, Jody; Boo, Florencia Lopez; Behrman, Jere R; Lu, Chunling; Lucas, Jane E; Perez-Escamilla, Rafael; Dua, Tarun; Bhutta, Zulfiqar A; Stenberg, Karin; Gertler, Paul; Darmstadt, Gary L
2017-01-07
Building on long-term benefits of early intervention (Paper 2 of this Series) and increasing commitment to early childhood development (Paper 1 of this Series), scaled up support for the youngest children is essential to improving health, human capital, and wellbeing across the life course. In this third paper, new analyses show that the burden of poor development is higher than estimated, taking into account additional risk factors. National programmes are needed. Greater political prioritisation is core to scale-up, as are policies that afford families time and financial resources to provide nurturing care for young children. Effective and feasible programmes to support early child development are now available. All sectors, particularly education, and social and child protection, must play a role to meet the holistic needs of young children. However, health provides a critical starting point for scaling up, given its reach to pregnant women, families, and young children. Starting at conception, interventions to promote nurturing care can feasibly build on existing health and nutrition services at limited additional cost. Failure to scale up has severe personal and social consequences. Children at elevated risk for compromised development due to stunting and poverty are likely to forgo about a quarter of average adult income per year, and the cost of inaction to gross domestic product can be double what some countries currently spend on health. Services and interventions to support early childhood development are essential to realising the vision of the Sustainable Development Goals. Copyright © 2017 Elsevier Ltd. All rights reserved.
Vernooij, Robin W. M.; Alonso-Coello, Pablo; Brouwers, Melissa
2017-01-01
Background Scientific knowledge is in constant development. Consequently, regular review to assure the trustworthiness of clinical guidelines is required. However, there is still a lack of preferred reporting items of the updating process in updated clinical guidelines. The present article describes the development process of the Checklist for the Reporting of Updated Guidelines (CheckUp). Methods and Findings We developed an initial list of items based on an overview of research evidence on clinical guideline updating, the Appraisal of Guidelines for Research and Evaluation (AGREE) II Instrument, and the advice of the CheckUp panel (n = 33 professionals). A multistep process was used to refine this list, including an assessment of ten existing updated clinical guidelines, interviews with key informants (response rate: 54.2%; 13/24), a three-round Delphi consensus survey with the CheckUp panel (33 participants), and an external review with clinical guideline methodologists (response rate: 90%; 53/59) and users (response rate: 55.6%; 10/18). CheckUp includes 16 items that address (1) the presentation of an updated guideline, (2) editorial independence, and (3) the methodology of the updating process. In this article, we present the methodology to develop CheckUp and include as a supplementary file an explanation and elaboration document. Conclusions CheckUp can be used to evaluate the completeness of reporting in updated guidelines and as a tool to inform guideline developers about reporting requirements. Editors may request its completion from guideline authors when submitting updated guidelines for publication. Adherence to CheckUp will likely enhance the comprehensiveness and transparency of clinical guideline updating for the benefit of patients and the public, health care professionals, and other relevant stakeholders. PMID:28072838
DOT National Transportation Integrated Search
1999-03-01
A methodology for developing modal vehicle emissions and fuel consumption models has been developed by Oak Ridge National Laboratory (ORNL), sponsored by the Federal Highway Administration. These models, in the form of look-up tables for fuel consump...
NASA Astrophysics Data System (ADS)
Schinckus, C.
2016-12-01
This article aims to present the scattered econophysics literature as a unified and coherent field through a specific lens imported from the philosophy of science. More precisely, I use the methodology developed by Imre Lakatos to cover the methodological evolution of econophysics over the last two decades. In this perspective, three co-existing approaches have been identified: statistical econophysics, bottom-up agent-based econophysics and top-down agent-based econophysics. Although the latter is presented here as the last step of the methodological evolution of econophysics, it is worth mentioning that this tradition is still very new. A quick look at the econophysics literature shows that the vast majority of works in this field deal with a strictly statistical approach or classical bottom-up agent-based modelling. In this context of diversification, the objective (and contribution) of this article is to emphasize the conceptual coherence of econophysics as a unique field of research. With this purpose, I use a theoretical framework coming from the philosophy of science to characterize how econophysics evolved by combining a methodological enrichment with the preservation of its core conceptual statements.
Development and Validation of a Gender Ideology Scale for Family Planning Services in Rural China
Yang, Xueyan; Li, Shuzhuo; Feldman, Marcus W.
2013-01-01
The objectives of this study are to develop a scale of gender role ideology appropriate for assessing Quality of Care in family planning services for rural China. Literature review, focus-group discussions and in-depth interviews with service providers and clients from two counties in eastern and western China, as well as experts’ assessments, were used to develop a scale for family planning services. Psychometric methodologies were applied to samples of 601 service clients and 541 service providers from a survey in a district in central China to validate its internal consistency, reliability, and construct validity with realistic and strategic dimensions. This scale is found to be reliable and valid, and has prospects for application both academically and practically in the field. PMID:23573222
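Internal consistency of such a scale is typically summarized with Cronbach's alpha; a minimal computation is sketched below on synthetic Likert responses, not the survey data reported in the study.

```python
import numpy as np

def cronbach_alpha(items):
    """Internal-consistency reliability for a (respondents x items) matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Synthetic Likert responses (1-5) for a 6-item subscale; illustrative only,
# not the gender-ideology survey data.
rng = np.random.default_rng(7)
latent = rng.normal(size=(300, 1))
responses = np.clip(np.round(3 + latent + rng.normal(0, 0.8, size=(300, 6))), 1, 5)
print(f"Cronbach's alpha ~ {cronbach_alpha(responses):.2f}")
```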
Scaling-Up Successfully: Pathways to Replication for Educational NGOs
ERIC Educational Resources Information Center
Jowett, Alice; Dyer, Caroline
2012-01-01
Non-government organisations (NGOs) are big players in international development, critical to the achievement of the Millennium Development Goals (MDGs) and constantly under pressure to "achieve more". Scaling-up their initiatives successfully and sustainably can be an efficient and cost effective way for NGOs to increase their impact across a…
Universal access to electricity in Burkina Faso: scaling-up renewable energy technologies
NASA Astrophysics Data System (ADS)
Moner-Girona, M.; Bódis, K.; Huld, T.; Kougias, I.; Szabó, S.
2016-08-01
This paper describes the status quo of the power sector in Burkina Faso and its limitations, and develops a new methodology that, through spatial analysis, aims to provide a possible pathway to universal electricity access. Following the SE4All initiative approach, it recommends more extensive use of distributed renewable energy systems to increase access to electricity on an accelerated timeline. Less than 5% of the rural population in Burkina Faso currently has access to electricity, and supply is lacking at many social structures such as schools and hospitals. Energy access achievements in Burkina Faso are still very modest. According to the latest SE4All Global Tracking Framework (2015), the annual growth rate of access to electricity in Burkina Faso from 2010 to 2012 is 0%. The rural electrification strategy for Burkina Faso is scattered across several electricity sector development policies: there is a need to define a concrete action plan. Planning and coordination between grid extension and the off-grid electrification programme is essential to reach a long-term sustainable energy model and prevent large, avoidable infrastructure investments. This paper goes into detail on the methodology and findings of the developed Geographic Information Systems tool. The aim of the dynamic planning tool is to provide support to the national government and development partners to define an alternative electrification plan. Burkina Faso proves to be a paradigm case for the methodology, as its national policy for electrification is still dominated by grid extension and by government subsidies to fossil fuel electricity production. However, the results of our analysis suggest that continued grid extension is becoming inefficient and unsustainable as a means of reaching the national energy access targets. The results also suggest that Burkina Faso’s rural electrification strategy should be driven by local renewable resources powering distributed mini-grids. We find that this approach would connect more people to power more quickly, and would reduce the fossil fuel use that would otherwise be necessary under grid extension options.
Lightning protection technology for small general aviation composite material aircraft
NASA Technical Reports Server (NTRS)
Plumer, J. A.; Setzer, T. E.; Siddiqi, S.
1993-01-01
An ongoing NASA Small Business Innovative Research (SBIR) Phase II design and development program will produce the first lightning-protected, fiberglass, General Aviation aircraft that is available as a kit. The results obtained so far in development testing of typical components of the aircraft kit, such as the wing and fuselage panels, indicate that the lightning protection design methodology and materials chosen are capable of protecting such small composite airframes from lightning puncture and from the structural damage associated with severe-threat lightning strikes. The primary objective of the program has been to develop a lightning protection design for a full-scale test airframe and verify its adequacy with full-scale laboratory testing, thus enabling production and sale of owner-built, lightning-protected Stoddard-Hamilton Aircraft, Inc. Glasair II airplanes. A second objective has been to provide lightning protection design guidelines for the General Aviation industry, and to enable these airplanes to meet lightning protection requirements for certification of small airplanes. This paper describes the protection design approaches and development testing results obtained thus far in the program, together with the design methodology which can achieve the design goals listed above. The presentation of this paper will also include results of some of the full-scale verification tests, which will have been completed by the time of this conference.
Three Collaborative Models for Scaling Up Evidence-Based Practices
Roberts, Rosemarie; Jones, Helen; Marsenich, Lynne; Sosna, Todd; Price, Joseph M.
2015-01-01
The current paper describes three models of research-practice collaboration to scale-up evidence-based practices (EBP): (1) the Rolling Cohort model in England, (2) the Cascading Dissemination model in San Diego County, and (3) the Community Development Team model in 53 California and Ohio counties. Multidimensional Treatment Foster Care (MTFC) and KEEP are the focal evidence-based practices that are designed to improve outcomes for children and families in the child welfare, juvenile justice, and mental health systems. The three scale-up models each originated from collaboration between community partners and researchers with the shared goal of wide-spread implementation and sustainability of MTFC/KEEP. The three models were implemented in a variety of contexts; Rolling Cohort was implemented nationally, Cascading Dissemination was implemented within one county, and Community Development Team was targeted at the state level. The current paper presents an overview of the development of each model, the policy frameworks in which they are embedded, system challenges encountered during scale-up, and lessons learned. Common elements of successful scale-up efforts, barriers to success, factors relating to enduring practice relationships, and future research directions are discussed. PMID:21484449
A Comparative Study of Spatial Aggregation Methodologies under the BioEarth Framework
NASA Astrophysics Data System (ADS)
Chandrasekharan, B.; Rajagopalan, K.; Malek, K.; Stockle, C. O.; Adam, J. C.; Brady, M.
2014-12-01
The increasing probability of water resource scarcity due to climate change has highlighted the need for adopting an economic focus in modelling water resource uses. Hydro-economic models, developed by integrating economic optimization with biophysical crop models, are driven by the economic value of water, revealing its most efficient uses and helping policymakers evaluate different water management strategies. One of the challenges in integrating biophysical models with economic models is the difference in the spatial scales at which they operate. Biophysical models that provide crop production functions typically run at a smaller scale than economic models, and substantial spatial aggregation is required. However, any aggregation introduces a bias, i.e., a discrepancy between the functional value at the higher spatial scale and the value at the spatial scale of the aggregated units. The objective of this work is to study the sensitivity of net economic benefits in the Yakima River basin (YRB) to different spatial aggregation methods for crop production functions. The spatial aggregation methodologies that we compare involve agro-ecological zones (AEZs) and aggregation levels that reflect water management regimes (e.g. irrigation districts). Aggregation bias can distort the underlying data and result in extreme solutions. In order to avoid this, we use an economic optimization model that incorporates the synthetic and historical crop mixes approach (Onal & Chen, 2012). This restricts the solutions between the weighted averages of historical and simulated feasible planting decisions, with the weights associated with crop mixes being treated as endogenous variables. This study is focused on 5 major irrigation districts of the YRB in the Pacific Northwest US. The biophysical modeling framework we use, BioEarth, includes the coupled hydrology and crop growth model VIC-Cropsyst and an economic optimization model. Preliminary findings indicate that the standard approach of developing AEZs does not perform well when overlaid with irrigation districts. Moreover, net economic benefits were significantly different between the two aggregation methodologies. Therefore, while developing hydro-economic models, significant consideration should be placed on the aggregation methodology.
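The aggregation bias discussed above arises because crop production functions are nonlinear, so evaluating a function at an area-weighted mean input differs from averaging the function over cells. The toy sketch below illustrates this with an invented saturating crop-water function; it is not BioEarth or VIC-Cropsyst output.

```python
import numpy as np

rng = np.random.default_rng(11)

# Toy nonlinear crop-water production function y(w): yield saturates with
# applied water.  Purely illustrative, not a calibrated production function.
def production(w, y_max=10.0, w_half=300.0):
    return y_max * w / (w + w_half)          # t/ha as a function of mm water

# Cell-level water availability and area within one irrigation district.
water = rng.gamma(shape=4.0, scale=100.0, size=500)     # mm
area = rng.uniform(50, 150, size=500)                    # ha

# "True" district production: evaluate the function cell by cell, then aggregate.
true_total = np.sum(production(water) * area)

# Aggregated-model production: evaluate the function once at the
# area-weighted mean water input (what a coarse economic model would do).
aggregated_total = production(np.average(water, weights=area)) * area.sum()

bias = (aggregated_total - true_total) / true_total
print(f"aggregation bias in district production: {100 * bias:+.1f}%")
```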
NASA Astrophysics Data System (ADS)
Kuo, Ching-Wen
2010-06-01
Modern military aircraft jet engines are designed with variable geometry nozzles to provide optimum thrust in different operating conditions within the flight envelope. However, the acoustic measurements for such nozzles are scarce, due to the cost involved in making full-scale measurements and the lack of details about the exact geometry of these nozzles. Thus the present effort at The Pennsylvania State University and the NASA Glenn Research Center, in partnership with GE Aviation, is aiming to study and characterize the acoustic field produced by supersonic jets issuing from converging-diverging military style nozzles. An equally important objective is to develop a scaling methodology for using data obtained from small- and moderate-scale experiments which exhibits the independence of the jet sizes to the measured noise levels. The experimental results presented in this thesis have shown reasonable agreement between small-scale and moderate-scale jet acoustic data, as well as between heated jets and heat-simulated ones. As the scaling methodology is validated, it will be extended to using acoustic data measured with small-scale supersonic model jets to the prediction of the most important components of full-scale engine noise. When comparing the measured acoustic spectra with a microphone array set at different radial locations, the characteristics of the jet noise source distribution may induce subtle inaccuracies, depending on the conditions of jet operation. A close look is taken at the details of the noise generation region in order to better understand the mismatch between spectra measured at various acoustic field radial locations. A processing methodology was developed to correct the effect of the noise source distribution and efficiently compare near-field and far-field spectra with unprecedented accuracy. This technique then demonstrates that the measured noise levels in the physically restricted space of an anechoic chamber can be appropriately extrapolated to represent the expected noise levels at different noise monitoring locations of practical interest. With the emergence of more powerful fighter aircraft, supersonic jet noise reduction devices are being intensely researched. Small-scale measurements are a crucial step in evaluating the potential of noise reduction concepts at an early stage in the design process. With this in mind, the present thesis provides an acoustic assessment methodology for small-scale military-style nozzles with chevrons. Comparisons are made between the present measurements and those made by NASA at moderate-scale. The effect of chevrons on supersonic jets was investigated, highlighting the crucial role of the jet operating conditions on the effects of chevrons on the jet flow and the subsequent acoustic benefits. A small-scale heat simulated jet is investigated in the over-expanded condition and shows no substantial noise reduction from the chevrons. This is contrary to moderate-scale measurements. The discrepancy is attributed to a Reynolds number low enough to sustain an annular laminar boundary layer in the nozzle that separates in the over-expanded flow condition. These results are important in assessing the limitations of small-scale measurements in this particular jet noise reduction method. 
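The Strouhal-number and spherical-spreading corrections commonly used to project model-scale jet spectra to full scale can be sketched as below; the diameters, distances and spectrum are hypothetical, and the amplitude correction is a generic assumption rather than the thesis's validated procedure.

```python
import numpy as np

# Hypothetical model-scale narrowband spectrum: SPL (dB) vs frequency (Hz),
# measured at radius r_model from a nozzle of diameter d_model.
f_model = np.array([2000., 4000., 8000., 16000., 32000.])
spl_model = np.array([96., 101., 104., 100., 93.])

d_model, d_full = 0.05, 1.0      # nozzle exit diameters, m (assumed)
r_model, r_full = 3.0, 100.0     # microphone distances, m (assumed)

# Frequency scaling: keep the Strouhal number St = f*d/U_j fixed; at matched
# jet velocity this reduces to f_full = f_model * d_model / d_full.
f_full = f_model * d_model / d_full

# Amplitude scaling (assumed): +20*log10(d_full/d_model) for the larger source
# area and -20*log10(r_full/r_model) for spherical spreading to the new radius.
spl_full = (spl_model
            + 20.0 * np.log10(d_full / d_model)
            - 20.0 * np.log10(r_full / r_model))

for ff, ss in zip(f_full, spl_full):
    print(f"{ff:8.0f} Hz   {ss:6.1f} dB")
```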
Lastly, to successfully present the results from the acoustic measurements of small-scale jets with high quality, a newly developed PSU free-field response was empirically derived to match the specific orientation and grid cap geometry of the microphones. Application to measured data gives encouraging results validating the capability of the method to produce superior accuracy in measurements even at the highest response frequencies of the microphones.
Junior, Garibaldi Dantas Gurgel
2014-01-01
Health Sector Reform and Social Determinants of Health are central issues for the current international policy debate, considering the turbulent scenario and the threat of economic recession on a global scale. Although these themes have been discussed for a long time, three major issues still call the attention of the scientific community and health policymakers. The first one is the matter of how to approach scientifically the intricate connections between them in order to understand the consequences of policies for healthcare services, since this debate will become much more tense in the coming years. The second one is the lack of explanatory frameworks to investigate the policies of reform strategies, simultaneously observed in a variety of countries within distinct health services, which aim to achieve multiple and contradictory goals vis-à-vis the so-called social determinants of health. The third one is the challenge that governments face in developing and sustaining equitable health services, bearing in mind the intense political dispute behind the health sector reform processes. This article discusses an all-embracing theoretical and methodological scheme to address these questions. The aim is to connect macro- and middle-range theories to examine Social Determinants and Health Sector Reform as interdependent issues, with a view to developing new knowledge and attaining scientific understanding of the role of universal and equitable healthcare systems, in order to avoid deepening economic crises.
Joseph, Adrian; Kenty, Brian; Mollet, Michael; Hwang, Kenneth; Rose, Steven; Goldrick, Stephen; Bender, Jean; Farid, Suzanne S.
2016-01-01
ABSTRACT In the production of biopharmaceuticals disk‐stack centrifugation is widely used as a harvest step for the removal of cells and cellular debris. Depth filters followed by sterile filters are often then employed to remove residual solids remaining in the centrate. Process development of centrifugation is usually conducted at pilot‐scale so as to mimic the commercial scale equipment but this method requires large quantities of cell culture and significant levels of effort for successful characterization. A scale‐down approach based upon the use of a shear device and a bench‐top centrifuge has been extended in this work towards a preparative methodology that successfully predicts the performance of the continuous centrifuge and polishing filters. The use of this methodology allows the effects of cell culture conditions and large‐scale centrifugal process parameters on subsequent filtration performance to be assessed at an early stage of process development where material availability is limited. Biotechnol. Bioeng. 2016;113: 1934–1941. © 2016 The Authors. Biotechnology and Bioengineering Published by Wiley Periodicals, Inc. PMID:26927621
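A common way to relate a continuous disc-stack centrifuge to a bench-top spin is sigma-theory (Q/Sigma) equivalence; the sketch below assumes that equivalence and uses illustrative equipment values, and is not necessarily the exact scale-down calculation applied by the authors.

```python
# Hypothetical sigma-theory equivalence between a continuous disc-stack
# centrifuge and a bench-top tube spin (all values assumed, not from the study).
Q_ds = 0.1 / 3600.0        # disc-stack feed rate, m^3/s (100 L/h)
sigma_ds = 1000.0          # disc-stack equivalent settling area, m^2
sigma_lab = 1.6            # bench-top tube equivalent settling area, m^2
V_lab = 1.0e-5             # liquid volume in the bench tube, m^3 (10 mL)

# Matching Q/Sigma between scales: V_lab/(t_lab*sigma_lab) = Q_ds/sigma_ds
t_lab = V_lab * sigma_ds / (Q_ds * sigma_lab)
print(f"equivalent bench spin time: {t_lab:.0f} s ({t_lab/60:.1f} min)")
```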
NASA Astrophysics Data System (ADS)
Cheyney, S.; Fishwick, S.; Hill, I. A.; Linford, N. T.
2015-08-01
Despite the development of advanced processing and interpretation tools for magnetic data sets in the fields of mineral and hydrocarbon industries, these methods have not achieved similar levels of adoption for archaeological or very near surface surveys. Using a synthetic data set we demonstrate that certain methodologies and assumptions used to successfully invert more regional-scale data can lead to large discrepancies between the true and recovered depths when applied to archaeological-type anomalies. We propose variations to the current approach, analysing the choice of the depth-weighting function, mesh design and parameter constraints, to develop an appropriate technique for the 3-D inversion of archaeological-scale data sets. The results show a successful recovery of a synthetic scenario, as well as a case study of a Romano-Celtic temple in the UK. For the case study, the final susceptibility model is compared with two coincident ground penetrating radar surveys, showing a high correlation with the comparative depth slices. The new approach takes interpretation of archaeological data sets beyond a simple 2-D visual interpretation based on pattern recognition.
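One of the inversion choices the abstract highlights is the depth-weighting function; a generic Li & Oldenburg-style weighting, with an exponent and cell depths chosen purely for illustration, can be compared across regional and archaeological meshes as sketched below.

```python
import numpy as np

# Generic depth weighting of the form w(z) = (z + z0)^(-beta/2), often used to
# counteract the rapid decay of magnetic kernels with depth.  beta ~ 3 is a
# common choice for magnetic data; z0 is tied to cell size and sensor height.
# All values below are illustrative only.
def depth_weight(z, z0, beta=3.0):
    return (z + z0) ** (-beta / 2.0)

# Regional-scale style mesh (tens of metres) vs archaeological-scale mesh (tens of cm).
z_regional = np.arange(12.5, 200.0, 25.0)     # cell-centre depths, m
z_archaeo = np.arange(0.125, 2.0, 0.25)       # cell-centre depths, m

w_regional = depth_weight(z_regional, z0=12.5)
w_archaeo = depth_weight(z_archaeo, z0=0.125)

# Normalised weights show how strongly the shallowest cells dominate in each case.
print("regional :", np.round(w_regional / w_regional.max(), 3))
print("archaeo  :", np.round(w_archaeo / w_archaeo.max(), 3))
```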
Schenk, Katie D
2009-07-01
Children affected by HIV in their families and communities face multiple risks to their health, education and psychosocial wellbeing. Community interventions for children who have been orphaned or rendered vulnerable take many forms, including educational assistance, home-based care, legal protection and psychosocial support. Despite a recent influx of funding for programme implementation, there exists little evidence to inform policymakers about whether their investments are improving the lives of vulnerable children and meeting key benchmarks including the Millennium Development Goals. This paper reviews the current evidence base on evaluations of community interventions for orphans and vulnerable children (OVC) in high HIV-prevalence African settings, focusing on studies' methodologies. Sources reviewed include published research studies and evidence from the unpublished programmatic "grey literature" located through database and internet searches. A total of 21 studies, varying in scope and generalisability, were identified. Interventions reviewed address children's wellbeing through various strategies within their communities. Evaluation methodologies reflect quantitative and qualitative approaches, including surveys (with and without baseline or comparison data), costing studies, focus groups, interviews, case studies, and participatory review techniques. Varied study methodologies reflect diverse research questions, various intervention types, and the challenges associated with evaluating complex interventions; highlighting the need to broaden the research paradigm in order to build the evidence base by including quasi-experimental and process evaluation approaches, and seeking further insights through participatory qualitative methodologies and costing studies. Although findings overall indicate the value of community interventions in effecting measurable improvements in child and family wellbeing, the quality and rigour of evidence is varied. A strategic research agenda is urgently needed to inform resource allocation and programme management decisions. Immediate imperatives include building local technical capacity to conduct quantitative and qualitative evaluation research, and strengthening monitoring and evaluation systems to collect process and outcome data (including costing) on key support models. Donors and implementers must support the collection of sound empirical evidence to inform the development and scale-up of OVC programmes.
HBIM Methodology as a Bridge Between Italy and Argentina
NASA Astrophysics Data System (ADS)
Moreira, A.; Quattrini, R.; Maggiolo, G.; Mammoli, R.
2018-05-01
The availability of efficient HBIM workflows could represent a very important change towards a more efficient management of the historical real estate. The present work shows how to obtain accurate and reliable information on heritage buildings through reality capture and 3D modelling to support restoration purposes or knowledge-based applications. Two case studies metaphorically join Italy with Argentina. The research article explains the workflows applied at the Palazzo Ferretti at Ancona and the Manzana Histórica de la Universidad National del Litoral, providing a constructive comparison and blending technological and theoretical approaches. In a bottom-up process, the assessment of the two case studies validates a workflow allowing the achievement of a useful and proper data enrichment of each HBIM model. Another key aspect is the Level of Development (LOD) evaluation of both models: different ranges and scales are defined in America (100-500) and in Italy (A-G); nevertheless it is possible to obtain standard shared procedures, facilitating HBIM development and diffusion in operating workflows.
Feature Selection for Wheat Yield Prediction
NASA Astrophysics Data System (ADS)
Ruß, Georg; Kruse, Rudolf
Carrying out effective and sustainable agriculture has become an important issue in recent years. Agricultural production has to keep up with an ever-increasing population by taking advantage of a field’s heterogeneity. Nowadays, modern technology such as the global positioning system (GPS) and a multitude of developed sensors enable farmers to better measure their fields’ heterogeneities. For this small-scale, precise treatment the term precision agriculture has been coined. However, the large amounts of data that are (literally) harvested during the growing season have to be analysed. In particular, the farmer is interested in knowing whether a newly developed heterogeneity sensor is potentially advantageous or not. Since the sensor data are readily available, this issue should be seen from an artificial intelligence perspective. There it can be treated as a feature selection problem. The additional task of yield prediction can be treated as a multi-dimensional regression problem. This article aims to present an approach towards solving these two practically important problems using artificial intelligence and data mining ideas and methodologies.
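Treating the question "does the new sensor help?" as a combined feature-selection and regression problem can be sketched with standard scikit-learn tools; the data, model choice and feature roles below are hypothetical and are not the authors' dataset or algorithms.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical per-cell field data: three existing sensors plus one candidate sensor.
n = 500
X_existing = rng.normal(size=(n, 3))                 # e.g. EC, N applied, prior yield
new_sensor = 0.6 * X_existing[:, 0] + rng.normal(scale=0.5, size=n)
y = (3.0 + 1.5 * X_existing[:, 0] - 0.8 * X_existing[:, 2]
     + 0.7 * new_sensor + rng.normal(scale=0.3, size=n))   # yield, t/ha

X_with = np.column_stack([X_existing, new_sensor])

model = RandomForestRegressor(n_estimators=200, random_state=0)
r2_without = cross_val_score(model, X_existing, y, cv=5, scoring="r2").mean()
r2_with = cross_val_score(model, X_with, y, cv=5, scoring="r2").mean()

print(f"CV R^2 without new sensor: {r2_without:.3f}")
print(f"CV R^2 with    new sensor: {r2_with:.3f}")

# Per-feature importances from a fit on all data give a rough variable ranking.
model.fit(X_with, y)
print("feature importances:", np.round(model.feature_importances_, 3))
```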
Zimmermann, Hartmut F; Hentschel, Norbert
2011-01-01
With the publication of the quality guideline ICH Q9 "Quality Risk Management" by the International Conference on Harmonization, risk management has already become a standard requirement during the life cycle of a pharmaceutical product. Failure mode and effect analysis (FMEA) is a powerful risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to biopharmaceutical processes brings about some difficulties. The proposal presented here is intended to serve as a brief but nevertheless comprehensive and detailed guideline on how to conduct a biopharmaceutical process FMEA. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. The application for such a biopharmaceutical process FMEA is widespread. It can be useful whenever a biopharmaceutical manufacturing process is developed or scaled-up, or when it is transferred to a different manufacturing site. It may also be conducted during substantial optimization of an existing process or the development of a second-generation process. According to their resulting risk ratings, process parameters can be ranked for importance and important variables for process development, characterization, or validation can be identified. Health authorities around the world ask pharmaceutical companies to manage risk during development and manufacturing of pharmaceuticals. The so-called failure mode and effect analysis (FMEA) is an established risk analysis tool that has been used for decades in mechanical and electrical industries. However, the adaptation of the FMEA methodology to pharmaceutical processes that use modern biotechnology (biopharmaceutical processes) brings about some difficulties, because those biopharmaceutical processes differ from processes in mechanical and electrical industries. The proposal presented here explains how a biopharmaceutical process FMEA can be conducted. It includes a detailed 1-to-10-scale FMEA rating table for occurrence, severity, and detectability of failures that has been especially designed for typical biopharmaceutical processes. With the help of this guideline, different details of the manufacturing process can be ranked according to their potential risks, and this can help pharmaceutical companies to identify aspects with high potential risks and to react accordingly to improve the safety of medicines.
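The core FMEA arithmetic described here, scoring severity, occurrence and detectability on 1-to-10 scales and ranking by risk priority number, can be sketched as follows; the failure modes and scores are invented examples, not entries from the published rating table.

```python
# Minimal sketch of an FMEA-style ranking for hypothetical bioprocess parameters.
# Each failure mode is scored 1-10 for severity (S), occurrence (O) and
# detectability (D); the risk priority number is RPN = S * O * D.
failure_modes = [
    # (process parameter / failure mode, S, O, D)
    ("bioreactor pH excursion",            7, 4, 3),
    ("feed pump flow-rate drift",          5, 6, 4),
    ("depth-filter breakthrough",          8, 3, 6),
    ("column load density out of range",   6, 5, 2),
]

ranked = sorted(
    ((name, s * o * d, s, o, d) for name, s, o, d in failure_modes),
    key=lambda row: row[1],
    reverse=True,
)

for name, rpn, s, o, d in ranked:
    print(f"{name:35s} S={s} O={o} D={d}  RPN={rpn}")
```

Parameters at the top of such a ranking are the ones flagged for process development, characterization or validation effort.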
Accounting for Greenhouse Gas Emissions from Reservoirs
NASA Astrophysics Data System (ADS)
Beaulieu, J. J.; Deemer, B. R.; Harrison, J. A.; Nietch, C. T.; Waldo, S.
2016-12-01
Nearly three decades of research has demonstrated that the impoundment of rivers and the flooding of terrestrial ecosystems behind dams can increase rates of greenhouse gas emission, particularly methane. The 2006 IPCC Guidelines for National Greenhouse Gas Inventories includes a methodology for estimating methane emissions from flooded lands, but the methodology was published as an appendix to be used as a `basis for future methodological development' due to a lack of data. Since the 2006 Guidelines were published there has been a 6-fold increase in the number of peer reviewed papers published on the topic including reports from reservoirs in India, China, Africa, and Russia. Furthermore, several countries, including Iceland, Switzerland, and Finland, have developed country specific methodologies for including flooded lands methane emissions in their National Greenhouse Gas Inventories. This presentation will include a review of the literature on flooded land methane emissions and approaches that have been used to upscale emissions for national inventories. We will also present ongoing research in the United States to develop a country specific methodology. In the U.S., research approaches include: 1) an effort to develop predictive relationships between methane emissions and reservoir characteristics that are available in national databases, such as reservoir size and drainage area, and 2) a national-scale probabilistic survey of reservoir methane emissions linked to the National Lakes Assessment.
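An upscaling calculation of the kind described, predicting per-reservoir emissions from characteristics held in national databases and summing to a country-level estimate, might look like the sketch below; the regression coefficients, unit conventions and reservoir records are invented for illustration only.

```python
import numpy as np

# Hypothetical per-reservoir areal CH4 emission model, e.g.
#   log10(flux) = b0 + b1*log10(surface area) + b2*log10(drainage ratio)
# with flux in mg CH4 m^-2 d^-1.  Coefficients and records are invented.
b0, b1, b2 = 2.1, -0.15, 0.25

reservoirs = [
    # (surface area, km^2, drainage area / surface area)
    (1.2, 40.0),
    (15.0, 12.0),
    (210.0, 6.0),
    (820.0, 3.5),
]

total_ch4 = 0.0  # Mg CH4 per year
for area_km2, drainage_ratio in reservoirs:
    flux = 10 ** (b0 + b1 * np.log10(area_km2) + b2 * np.log10(drainage_ratio))
    # mg m^-2 d^-1  ->  Mg per reservoir per year (1 Mg = 1e9 mg)
    annual = flux * area_km2 * 1e6 * 365 / 1e9
    total_ch4 += annual
    print(f"area={area_km2:7.1f} km^2  flux={flux:6.1f} mg m^-2 d^-1  {annual:8.1f} Mg/yr")

print(f"total for these reservoirs: {total_ch4:,.0f} Mg CH4/yr")
```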
Production methodologies of polymeric and hydrogel particles for drug delivery applications.
Lima, Ana Catarina; Sher, Praveen; Mano, João F
2012-02-01
Polymeric particles are ideal vehicles for controlled delivery applications due to their ability to encapsulate a variety of substances, namely low- and high-molecular mass therapeutics, antigens or DNA. Micro and nano scale spherical materials have been developed as carriers for therapies, using appropriated methodologies, in order to achieve a prolonged and controlled drug administration. This paper reviews the methodologies used for the production of polymeric micro/nanoparticles. Emulsions, phase separation, spray drying, ionic gelation, polyelectrolyte complexation and supercritical fluids precipitation are all widely used processes for polymeric micro/nanoencapsulation. This paper also discusses the recent developments and patents reported in this field. Other less conventional methodologies are also described, such as the use of superhydrophobic substrates to produce hydrogel and polymeric particulate biomaterials. Polymeric drug delivery systems have gained increased importance due to the need for improving the efficiency and versatility of existing therapies. This allows the development of innovative concepts that could create more efficient systems, which in turn may address many healthcare needs worldwide. The existing methods to produce polymeric release systems have some critical drawbacks, which compromise the efficiency of these techniques. Improvements and development of new methodologies could be achieved by using multidisciplinary approaches and tools taken from other subjects, including nanotechnologies, biomimetics, tissue engineering, polymer science or microfluidics.
[Methodology for clinical research in Orthodontics, the assets of the beOrtho website].
Ruiz, Martial; Thibult, François
2014-06-01
The rules applying to the "evidence-based" methodology strongly influenced the clinical research in orthodontics. However, the implementation of clinical studies requires rigour, important statistical and methodological knowledge, as well as a reliable environment in order to compile and store the data obtained from research. We developed the project "beOrtho.com" (based on orthodontic evidence) in order to fill up the gap between our desire to drive clinical research and the necessity of methodological rigour in the exploitation of its results. BeOrtho website was created to answer the issue of sample recruitment, data compilation and storage, while providing help for the methodological design of clinical studies. It allows the development and monitoring of clinical studies, as well as the creation of databases. On the other hand, we designed an evaluation grid for clinical studies which helps developing systematic reviews. In order to illustrate our point, we tested a research protocol evaluating the interest of the mandibular advancement in the framework of Class II treatment. © EDP Sciences, SFODF, 2014.
1980-10-01
Development; Problem Identification and Assessment for Aquatic Plant Management; Natural Succession of Aquatic Plants; Large-Scale Operations Management Test...of Insects and Pathogens for Control of Waterhyacinth in Louisiana; Large-Scale Operations Management Test to Evaluate Prevention Methodology for...Control of Eurasian Watermilfoil in Washington; Large-Scale Operations Management Test Using the White Amur at Lake Conway, Florida; and Aquatic Plant Control Activities in the Panama Canal Zone.
Scaling properties of foreign exchange volatility
NASA Astrophysics Data System (ADS)
Gençay, Ramazan; Selçuk, Faruk; Whitcher, Brandon
2001-01-01
In this paper, we investigate the scaling properties of foreign exchange volatility. Our methodology is based on a wavelet multi-scaling approach which decomposes the variance of a time series and the covariance between two time series on a scale by scale basis through the application of a discrete wavelet transformation. It is shown that foreign exchange rate volatilities follow different scaling laws at different horizons. Particularly, there is a smaller degree of persistence in intra-day volatility as compared to volatility at one day and higher scales. Therefore, a common practice in the risk management industry to convert risk measures calculated at shorter horizons into longer horizons through a global scaling parameter may not be appropriate. This paper also demonstrates that correlation between the foreign exchange volatilities is the lowest at the intra-day scales but exhibits a gradual increase up to a daily scale. The correlation coefficient stabilizes at scales one day and higher. Therefore, the benefit of currency diversification is the greatest at the intra-day scales and diminishes gradually at higher scales (lower frequencies). The wavelet cross-correlation analysis also indicates that the association between two volatilities is stronger at lower frequencies.
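A simplified, DWT-based version of the scale-by-scale variance decomposition can be sketched with PyWavelets; the paper's wavelet methodology is not necessarily this exact estimator, and the synthetic return series below is only a stand-in for real intra-day FX data.

```python
import numpy as np
import pywt

rng = np.random.default_rng(42)

# Stand-in for high-frequency FX returns (plain white noise keeps the example
# self-contained; real data would show scale-dependent persistence).
returns = rng.normal(scale=1e-4, size=2 ** 14)

# Discrete wavelet decomposition; detail coefficients d_j capture fluctuations
# at scale ~2^j observations (intra-day at small j, daily and beyond at larger j).
level = 8
coeffs = pywt.wavedec(returns, "haar", level=level)
details = coeffs[1:][::-1]          # reorder so index 0 is the finest scale

for j, d in enumerate(details, start=1):
    # Simple DWT-based estimate of the variance contribution at scale j.
    var_j = np.sum(d ** 2) / len(returns)
    print(f"scale 2^{j:<2d}  wavelet variance ~ {var_j:.3e}")

# Plotting log2(variance) against scale is how scaling laws, and breaks in them
# between intra-day and daily horizons, are typically read off.
```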
NASA Astrophysics Data System (ADS)
Chen, Liang-Chia; Chen, Yi-Shiuan; Chang, Yi-Wei; Lin, Shyh-Tsong; Yeh, Sheng Lih
2013-01-01
In this research, a new nano-scale measurement methodology based on spectrally-resolved chromatic confocal interferometry (SRCCI) was successfully developed by employing integration of chromatic confocal sectioning and spectrally-resolved white light interferometry (SRWLI) for microscopic three dimensional surface profilometry. The proposed chromatic confocal method (CCM), using a broadband white light in combination with a specially designed chromatic dispersion objective, is capable of simultaneously acquiring multiple images at a large range of object depths to perform surface 3-D reconstruction from a single image shot without vertical scanning, correspondingly achieving a high measurement depth range up to hundreds of micrometers. A Linnik-type interferometric configuration based on spectrally resolved white light interferometry is developed and integrated with the CCM to simultaneously achieve nanoscale axial resolution for the detection point. The white-light interferograms acquired at the exit plane of the spectrometer possess a continuous variation of wavelength along the chromaticity axis, in which the light intensity reaches its peak when the optical path difference between the two optical arms equals zero. To examine the measurement accuracy of the developed system, a pre-calibrated accurate step height target with a total step height of 10.10 μm was measured. The experimental result shows that the maximum measurement error was verified to be less than 0.3% of the overall measuring height.
Interdigital pair bonding for high frequency (20-50 MHz) ultrasonic composite transducers.
Liu, R; Harasiewicz, K A; Foster, F S
2001-01-01
Interdigital pair bonding is a novel methodology that enables the fabrication of high frequency piezoelectric composites with high volume fractions of the ceramic phase. This enhancement in ceramic volume fraction significantly reduces the dimensional scale of the epoxy phase and increases the related effective physical parameters of the composite, such as dielectric constant and the longitudinal sound velocity, which are major concerns in the development of high frequency piezoelectric composites. In this paper, a method called interdigital pair bonding (IPB) is used to prepare 1-3 piezoelectric composite with a pitch of 40 microns, a kerf of 4 microns, and a ceramic volume fraction of 81%. The composites prepared in this fashion exhibited a very pure thickness-mode resonance up to a frequency of 50 MHz. Unlike the 2-2 piezoelectric composites with the same ceramic and epoxy scales developed earlier, the anticipated lateral modes between 50 to 100 MHz were not observed in the current 1-3 composites. The mechanisms for the elimination of the lateral modes at high frequency are discussed. The effective electromechanical coupling coefficient of the composite was 0.72 at a frequency of 50 MHz. The composites showed a high longitudinal sound velocity of 4300 m/s and a high clamped dielectric constant of 1111 epsilon 0, which will benefit the development of high frequency ultrasonic transducers and especially high frequency transducer arrays for medical imaging.
Bertrand, Jane T; Njeuhmeli, Emmanuel; Forsythe, Steven; Mattison, Sarah K; Mahler, Hally; Hankins, Catherine A
2011-01-01
This paper proposes an approach to estimating the costs of demand creation for voluntary medical male circumcision (VMMC) scale-up in 13 countries of eastern and southern Africa. It addresses two key questions: (1) what are the elements of a standardized package for demand creation? And (2) what challenges exist and must be taken into account in estimating the costs of demand creation? We conducted a key informant study on VMMC demand creation using purposive sampling to recruit seven people who provide technical assistance to government programs and manage budgets for VMMC demand creation. Key informants provided their views on the important elements of VMMC demand creation and the most effective funding allocations across different types of communication approaches (e.g., mass media, small media, outreach/mobilization). The key finding was the wide range of views, suggesting that a standard package of core demand creation elements would not be universally applicable. This underscored the importance of tailoring demand creation strategies and estimates to specific country contexts before estimating costs. The key informant interviews, supplemented by the researchers' field experience, identified these issues to be addressed in future costing exercises: variations in the cost of VMMC demand creation activities by country and program, decisions about the quality and comprehensiveness of programming, and lack of data on critical elements needed to "trigger the decision" among eligible men. Based on this study's findings, we propose a seven-step methodological approach to estimate the cost of VMMC scale-up in a priority country, based on our key assumptions. However, further work is needed to better understand core components of a demand creation package and how to cost them. Notwithstanding the methodological challenges, estimating the cost of demand creation remains an essential element in deriving estimates of the total costs for VMMC scale-up in eastern and southern Africa.
Analytical Methodology for Predicting the Onset of Widespread Fatigue Damage in Fuselage Structure
NASA Technical Reports Server (NTRS)
Harris, Charles E.; Newman, James C., Jr.; Piascik, Robert S.; Starnes, James H., Jr.
1996-01-01
NASA has developed a comprehensive analytical methodology for predicting the onset of widespread fatigue damage in fuselage structure. The determination of the number of flights and operational hours of aircraft service life that are related to the onset of widespread fatigue damage includes analyses for crack initiation, fatigue crack growth, and residual strength. Therefore, the computational capability required to predict analytically the onset of widespread fatigue damage must be able to represent a wide range of crack sizes from the material (microscale) level to the global structural-scale level. NASA studies indicate that the fatigue crack behavior in aircraft structure can be represented conveniently by the following three analysis scales: small three-dimensional cracks at the microscale level, through-the-thickness two-dimensional cracks at the local structural level, and long cracks at the global structural level. The computational requirements for each of these three analysis scales are described in this paper.
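For the long-crack, global structural scale, a minimal Paris-law crack-growth integration illustrates the kind of fatigue crack growth analysis referred to; the material constants, stress range and critical crack length below are assumptions for illustration, not values from the NASA methodology.

```python
import numpy as np

# Minimal Paris-law integration, da/dN = C * (dK)^m with dK = Y * dS * sqrt(pi*a).
# Constants are illustrative for an aluminium-like fuselage skin material.
C, m = 3.0e-11, 3.0            # Paris constants (m/cycle, MPa*sqrt(m) units)
Y = 1.12                       # geometry factor (assumed constant here)
dS = 90.0                      # stress range per flight cycle, MPa
a = 0.5e-3                     # initial crack half-length, m
a_crit = 25.0e-3               # "critical" length from a residual-strength check, m

cycles = 0
while a < a_crit:
    dK = Y * dS * np.sqrt(np.pi * a)   # stress-intensity-factor range
    a += C * dK ** m                   # crack extension over one cycle
    cycles += 1

print(f"cycles to grow from 0.5 mm to {a_crit*1e3:.0f} mm: {cycles:,}")
```

In a full analysis this long-crack step would be preceded by the microscale initiation and local through-thickness stages described above, and followed by a residual-strength evaluation.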
The Secret Life of RNA: Lessons from Emerging Methodologies.
Medioni, Caroline; Besse, Florence
2018-01-01
The past decade has witnessed a revolution in our appreciation of transcriptome complexity and regulation. This remarkable expansion in our knowledge largely originates from the advent of high-throughput methodologies, and the subsequent discovery that up to 90% of eukaryotic genomes are transcribed, thus generating an unexpectedly large range of noncoding RNAs (Hangauer et al., 15(4):112, 2014). Besides leading to the identification of new noncoding RNA species, transcriptome-wide studies have uncovered novel layers of posttranscriptional regulatory mechanisms controlling RNA processing, maturation or translation, each contributing to the precise and dynamic regulation of gene expression. Remarkably, the development of systems-level studies has been accompanied by tremendous progress in the visualization of individual RNA molecules in single cells, such that it is now possible to image RNA species with a single-molecule resolution from birth to translation or decay. Monitoring quantitatively, with unprecedented spatiotemporal resolution, the fate of individual molecules has been key to understanding the molecular mechanisms underlying the different steps of RNA regulation. This has also revealed biologically relevant, intracellular and intercellular heterogeneities in RNA distribution or regulation. More recently, the convergence of imaging and high-throughput technologies has led to the emergence of spatially resolved transcriptomic techniques that provide a means to perform large-scale analyses while preserving spatial information. By generating transcriptome-wide data on single-cell RNA content, or even subcellular RNA distribution, these methodologies are opening avenues to a wide range of network-level studies at the cell and organ level, and promise to strongly improve disease diagnosis and treatment. In this introductory chapter, we highlight how recently developed technologies aiming at detecting and visualizing RNA molecules have contributed to the emergence of entirely new research fields, and to dramatic progress in our understanding of gene expression regulation.
Filtered Mass Density Function for Design Simulation of High Speed Airbreathing Propulsion Systems
NASA Technical Reports Server (NTRS)
Drozda, T. G.; Sheikhi, R. M.; Givi, Peyman
2001-01-01
The objective of this research is to develop and implement a new methodology for large eddy simulation (LES) of high-speed reacting turbulent flows. We have just completed two (2) years of Phase I of this research. This annual report provides a brief and up-to-date summary of our activities during the period: September 1, 2000 through August 31, 2001. In the work within the past year, a methodology termed "velocity-scalar filtered density function" (VSFDF) is developed and implemented for large eddy simulation (LES) of turbulent flows. In this methodology the effects of the unresolved subgrid scales (SGS) are taken into account by considering the joint probability density function (PDF) of all of the components of the velocity and scalar vectors. An exact transport equation is derived for the VSFDF in which the effects of the unresolved SGS convection, SGS velocity-scalar source, and SGS scalar-scalar source terms appear in closed form. The remaining unclosed terms in this equation are modeled. A system of stochastic differential equations (SDEs) which yields statistically equivalent results to the modeled VSFDF transport equation is constructed. These SDEs are solved numerically by a Lagrangian Monte Carlo procedure. The consistency of the proposed SDEs and the convergence of the Monte Carlo solution are assessed by comparison with results obtained by an Eulerian LES procedure in which the corresponding transport equations for the first two SGS moments are solved. The unclosed SGS convection, SGS velocity-scalar source, and SGS scalar-scalar source in the Eulerian LES are replaced by corresponding terms from the VSFDF equation. The consistency of the results is then analyzed for a case of a two-dimensional mixing layer.
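The Lagrangian Monte Carlo solution of modeled SDEs mentioned here can be illustrated with a toy Ornstein-Uhlenbeck (Langevin-type) velocity equation solved by Euler-Maruyama; this is only an analogue used to show the moment-consistency check, not the VSFDF equations themselves, and all parameters are assumed.

```python
import numpy as np

rng = np.random.default_rng(1)

# Euler-Maruyama solution of a simple Langevin-type SDE for a particle velocity u:
#   du = -(u - U_mean)/T dt + sqrt(2*sigma^2/T) dW
U_mean, T, sigma = 1.0, 0.2, 0.5
dt, n_steps, n_particles = 1e-3, 5000, 20000

u = np.full(n_particles, U_mean)           # start all particles at the mean
for _ in range(n_steps):
    dW = rng.normal(scale=np.sqrt(dt), size=n_particles)
    u += -(u - U_mean) / T * dt + np.sqrt(2.0 * sigma ** 2 / T) * dW

# At statistical stationarity the ensemble should recover mean U_mean and
# variance sigma^2 -- the kind of moment-consistency comparison against an
# Eulerian moment solution that the abstract describes.
print(f"mean     : {u.mean():.4f}  (target {U_mean})")
print(f"variance : {u.var():.4f}  (target {sigma**2})")
```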
Teachers' Acceptance of Absenteeism: Towards Developing a Specific Scale
ERIC Educational Resources Information Center
Shapira-Lishchinsky, Orly; Ishan, Gamal
2013-01-01
Purpose: This study aims to develop and validate a measure of a specific attitude toward teachers' absenteeism that predicts this behavior more accurately than other general measures of job attitudes. Design/methodology/approach: Participants were 443 teachers from 21 secondary schools in Israel. In the first phase, the teachers answered anonymous…
A Teamwork-Oriented Air Traffic Control Simulator
2006-06-01
In the software development methodology of this work, this chapter is viewed as the acquisition phase of the model. The end of the ... maintenance phase ... because the different controllers working in these phases usually ... traditional operation such as scaling the airport and personalizing the working environment. 4. Pilot Specification.
ERIC Educational Resources Information Center
Sample Mcmeeking, Laura B.; Cobb, R. Brian; Basile, Carole
2010-01-01
This paper introduces a variation on the post-test only cohort control design and addresses questions concerning both the methodological credibility and the practical utility of employing this design variation in evaluations of large-scale complex professional development programmes in mathematics education. The original design and design…
Performance-Based Service Quality Model: An Empirical Study on Japanese Universities
ERIC Educational Resources Information Center
Sultan, Parves; Wong, Ho
2010-01-01
Purpose: This paper aims to develop and empirically test the performance-based higher education service quality model. Design/methodology/approach: The study develops 67-item instrument for measuring performance-based service quality with a particular focus on the higher education sector. Scale reliability is confirmed using the Cronbach's alpha.…
Non-Born-Oppenheimer self-consistent field calculations with cubic scaling
NASA Astrophysics Data System (ADS)
Moncada, Félix; Posada, Edwin; Flores-Moreno, Roberto; Reyes, Andrés
2012-05-01
An efficient nuclear molecular orbital methodology is presented. This approach combines an auxiliary density functional theory for electrons (ADFT) and a localized Hartree product (LHP) representation for the nuclear wave function. A series of test calculations conducted on small molecules exposed that energy and geometry errors introduced by the use of ADFT and LHP approximations are small and comparable to those obtained by the use of electronic ADFT. In addition, sample calculations performed on (HF)n chains disclosed that the combined ADFT/LHP approach scales cubically with system size (n) as opposed to the quartic scaling of Hartree-Fock/LHP or DFT/LHP methods. Even for medium size molecules the improved scaling of the ADFT/LHP approach resulted in speedups of at least 5x with respect to Hartree-Fock/LHP calculations. The ADFT/LHP method opens up the possibility of studying nuclear quantum effects on large size systems that otherwise would be impractical.
TRENDS IN RURAL SULFUR CONCENTRATIONS
As the focus of environmental management has shifted toward regional- scale strategies, there is a growing need to develop statistical methodology for the estimation of regional trends in air pollution. This information is critical to assessing the effects of legislated emission ...
Michez, Adrien; Piégay, Hervé; Lisein, Jonathan; Claessens, Hugues; Lejeune, Philippe
2016-03-01
Riparian forests are critically endangered by many anthropogenic pressures and natural hazards. The importance of riparian zones has been acknowledged by European Directives, involving multi-scale monitoring. The use of very-high-resolution and hyperspatial imagery in a multi-temporal approach is an emerging topic. The trend is reinforced by the recent and rapid growth of the use of the unmanned aerial system (UAS), which has prompted the development of innovative methodology. Our study proposes a methodological framework to explore how a set of multi-temporal images acquired during a vegetative period can differentiate some of the deciduous riparian forest species and their health conditions. More specifically, the developed approach intends to identify, through a process of variable selection, which variables derived from UAS imagery and which scale of image analysis are the most relevant to our objectives. The methodological framework is applied to two study sites to describe the riparian forest through two fundamental characteristics: the species composition and the health condition. These characteristics were selected not only because of their use as proxies for the riparian zone ecological integrity but also because of their use for river management. The comparison of various scales of image analysis identified the smallest object-based image analysis (OBIA) objects (ca. 1 m²) as the most relevant scale. Variables derived from spectral information (band ratios) were identified as the most appropriate, followed by variables related to the vertical structure of the forest. Classification results show good overall accuracies for the species composition of the riparian forest (five classes, 79.5 and 84.1% for site 1 and site 2). The classification scenario regarding the health condition of the black alders of site 1 performed the best (90.6%). The quality of the classification models developed with a UAS-based, cost-effective, and semi-automatic approach competes successfully with those developed using more expensive imagery, such as multi-spectral and hyperspectral airborne imagery. The high overall accuracy results obtained by the classification of the diseased alders open the door to applications dedicated to monitoring of the health conditions of riparian forests. Our methodological framework will allow UAS users to manage the large metric datasets derived from such dense imagery time series.
NASA Technical Reports Server (NTRS)
Maddox, Anthony B.; Smith-Maddox, Renee P.; Penick, Benson E.
1989-01-01
The MassPEP/NASA Graduate Research Development Program (GRDP) whose objective is to encourage Black Americans, Mexican Americans, American Indians, Puerto Ricans, and Pacific Islanders to pursue graduate degrees in science and engineering is described. The GRDP employs a top-down or goal driven methodology through five modules which focus on research, graduate school climate, technical writing, standardized examinations, and electronic networking. These modules are designed to develop and reinforce some of the skills necessary to seriously consider the goal of completing a graduate education. The GRDP is a community-based program which seeks to recruit twenty participants from a pool of Boston-area undergraduates enrolled in engineering and science curriculums and recent graduates with engineering and science degrees. The program emphasizes that with sufficient information, its participants can overcome most of the barriers perceived as preventing them from obtaining graduate science and engineering degrees. Experience has shown that the top-down modules may be complemented by a more bottom-up or event-driven methodology. This approach considers events in the academic and professional experiences of participants in order to develop the personal and leadership skills necessary for graduate school and similar endeavors.
Wäscher, Sebastian; Salloch, Sabine; Ritter, Peter; Vollmann, Jochen; Schildmann, Jan
2017-05-01
This article describes a process of developing, implementing and evaluating a clinical ethics support service intervention with the goal of building up a context-sensitive structure of minimal clinical-ethics in an oncology department without prior clinical ethics structure. Scholars from different disciplines have called for an improvement in the evaluation of clinical ethics support services (CESS) for different reasons over several decades. However, while a lot has been said about the concepts and methodological challenges of evaluating CESS up to the present time, relatively few empirical studies have been carried out. The aim of this article is twofold. On the one hand, it describes a process of development, modifying and evaluating a CESS intervention as part of the ETHICO research project, using the approach of qualitative-formative evaluation. On the other hand, it provides a methodological analysis which specifies the contribution of qualitative empirical methods to the (formative) evaluation of CESS. We conclude with a consideration of the strengths and limitations of qualitative evaluation research with regards to the evaluation and development of context sensitive CESS. We further discuss our own approach in contrast to rather traditional consult or committee models. © 2017 John Wiley & Sons Ltd.
Cunnama, Lucy; Sinanovic, Edina; Ramma, Lebogang; Foster, Nicola; Berrie, Leigh; Stevens, Wendy; Molapo, Sebaka; Marokane, Puleng; McCarthy, Kerrigan; Churchyard, Gavin; Vassall, Anna
2016-02-01
Estimating the incremental costs of scaling-up novel technologies in low-income and middle-income countries is a methodologically challenging and substantial empirical undertaking, in the absence of routine cost data collection. We demonstrate a best practice pragmatic approach to estimate the incremental costs of new technologies in low-income and middle-income countries, using the example of costing the scale-up of Xpert Mycobacterium tuberculosis (MTB)/resistance to rifampicin (RIF) in South Africa. We estimate costs by applying two distinct approaches of bottom-up and top-down costing, together with an assessment of processes and capacity. The unit costs measured using the different methods of bottom-up and top-down costing, respectively, are $US16.9 and $US33.5 for Xpert MTB/RIF, and $US6.3 and $US8.5 for microscopy. The incremental cost of Xpert MTB/RIF is estimated to be between $US14.7 and $US17.7. While the average cost of Xpert MTB/RIF was higher than in previous studies using standard methods, the incremental cost of Xpert MTB/RIF was found to be lower. Cost estimates are highly dependent on the method used, so an approach which clearly identifies resource-use data collected from a bottom-up or top-down perspective, together with capacity measurement, is recommended as a pragmatic approach to capture true incremental cost where routine cost data are scarce. © 2016 The Authors. Health Economics published by John Wiley & Sons Ltd.
Sustaining and Scaling up the Impact of Professional Development Programmes
ERIC Educational Resources Information Center
Zehetmeier, Stefan
2015-01-01
This paper deals with a crucial topic: which factors influence the sustainability and scale-up of a professional development programme's impact? Theoretical models and empirical findings from impact research (e.g. Zehetmeier and Krainer, "ZDM Int J Math" 43(6/7):875-887, 2011) and innovation research (e.g. Cobb and Smith,…
Classifying E-Trainer Standards
ERIC Educational Resources Information Center
Julien, Anne
2005-01-01
Purpose: To set-up a classification of the types of profiles and competencies that are required to set-up a good e-learning programme. This approach provides a framework within which a set of standards can be defined for e-trainers. Design/methodology/approach: Open and distance learning (ODL) has been developing in Europe, due to new tools in…
Life Cycle Assessment of Wall Systems
NASA Astrophysics Data System (ADS)
Ramachandran, Sriranjani
Natural resource depletion and environmental degradation are the stark realities of the times we live in. As awareness about these issues increases globally, industries and businesses are becoming interested in understanding and minimizing the ecological footprints of their activities. Evaluating the environmental impacts of products and processes has become a key issue, and the first step towards addressing and eventually curbing climate change. Additionally, companies are finding it beneficial and are interested in going beyond compliance, using pollution prevention strategies and environmental management systems to improve their environmental performance. Life-cycle Assessment (LCA) is an evaluative method to assess the environmental impacts associated with a product's life-cycle from cradle-to-grave (i.e. from raw material extraction through to material processing, manufacturing, distribution, use, repair and maintenance, and finally, disposal or recycling). This study focuses on evaluating building envelopes on the basis of their life-cycle analysis. In order to facilitate this analysis, a small-scale office building, the University Services Building (USB), with a built-up area of 148,101 ft2 situated on ASU campus in Tempe, Arizona, was studied. The building's exterior envelope is the highlight of this study. The current exterior envelope is made of tilt-up concrete construction, a type of construction in which the concrete elements are constructed horizontally and tilted up, after they are cured, using cranes and are braced until other structural elements are secured. This building envelope is compared to five other building envelope systems (i.e. concrete block, insulated concrete form, cast-in-place concrete, steel studs and curtain wall constructions), evaluating them on the basis of least environmental impact. The research methodology involved developing energy models, simulating them and generating changes in energy consumption due to the above-mentioned envelope types. Energy consumption data, along with various other details, such as building floor area, areas of walls, columns, beams etc. and their material types, were imported into Life-Cycle Assessment software called ATHENA impact estimator for buildings. Using this four-stepped LCA methodology, the results showed that the Steel Stud envelope performed the best, with the least environmental impact of the envelope types compared. This research methodology can be applied to other building typologies.
Gdowski, Andrew; Johnson, Kaitlyn; Shah, Sunil; Gryczynski, Ignacy; Vishwanatha, Jamboor; Ranjan, Amalendu
2018-02-12
The process of optimization and fabrication of nanoparticle synthesis for preclinical studies can be challenging and time consuming. Traditional small scale laboratory synthesis techniques suffer from batch to batch variability. Additionally, the parameters used in the original formulation must be re-optimized due to differences in fabrication techniques for clinical production. Several low flow microfluidic synthesis processes have been reported in recent years for developing nanoparticles that are a hybrid between polymeric nanoparticles and liposomes. However, use of high flow microfluidic synthetic techniques has not been described for this type of nanoparticle system, which we will term nanolipomers. In this manuscript, we describe the successful optimization and functional assessment of nanolipomers fabricated using a microfluidic synthesis method under high flow parameters. The optimal total flow rate for synthesis of these nanolipomers was found to be 12 ml/min and flow rate ratio 1:1 (organic phase: aqueous phase). The PLGA polymer concentration of 10 mg/ml and a DSPE-PEG lipid concentration of 10% w/v provided optimal size, PDI and stability. Drug loading and encapsulation of a representative hydrophobic small-molecule drug, curcumin, were optimized; a high encapsulation efficiency of 58.8% and a drug loading of 4.4% were achieved at 7.5% w/w initial concentration of curcumin/PLGA polymer. The final size and polydispersity index of the optimized nanolipomer were 102.11 nm and 0.126, respectively. Functional assessment of uptake of the nanolipomers in C4-2B prostate cancer cells showed uptake at 1 h and increased uptake at 24 h. The nanolipomer was more effective in the cell viability assay compared to free drug. Finally, assessment of in vivo retention of these nanolipomers in mice revealed retention for up to 2 h, with complete clearance at 24 h. In this study, we have demonstrated that a nanolipomer formulation can be successfully synthesized and easily scaled up through a high flow microfluidic system with optimal characteristics. The process of developing nanolipomers using this methodology is significant as the same optimized parameters used for small batches could be translated into manufacturing large scale batches for clinical trials through parallel flow systems.
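The encapsulation-efficiency and drug-loading figures quoted above can be sanity-checked with simple mass-balance arithmetic; the 100 mg polymer basis and the drug/(drug+polymer) loading convention used below are assumptions made for illustration (the reported figures may include the lipid in the denominator).

```python
# Back-of-the-envelope check of the reported curcumin loading figures.
polymer_mg = 100.0
drug_in_mg = 0.075 * polymer_mg            # 7.5% w/w initial curcumin vs polymer
ee = 0.588                                 # reported encapsulation efficiency

encapsulated_mg = ee * drug_in_mg
drug_loading = encapsulated_mg / (encapsulated_mg + polymer_mg)  # assumed convention

print(f"drug added        : {drug_in_mg:.2f} mg")
print(f"drug encapsulated : {encapsulated_mg:.2f} mg  (EE = {ee:.1%})")
print(f"drug loading      : {drug_loading:.1%}")       # ~4%, close to the reported 4.4%
```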
ERIC Educational Resources Information Center
Major, Louis; Watson, Steven
2018-01-01
Video is increasingly used to support in-service teacher professional development (TPD). Advances in affordability and usability of technology mean that interest is set to develop further. Studies in this area are diverse in terms of scale, methodology and context. This places limitations on undertaking a systematic review; therefore the authors…
Tsai, Jason Sheng-Hong; Du, Yan-Yi; Huang, Pei-Hsiang; Guo, Shu-Mei; Shieh, Leang-San; Chen, Yuhua
2011-07-01
In this paper, a digital redesign methodology of the iterative learning-based decentralized adaptive tracker is proposed to improve the dynamic performance of sampled-data linear large-scale control systems consisting of N interconnected multi-input multi-output subsystems, so that the system output will follow any trajectory which may not be presented by the analytic reference model initially. To overcome the interference of each sub-system and simplify the controller design, the proposed model reference decentralized adaptive control scheme constructs a decoupled well-designed reference model first. Then, according to the well-designed model, this paper develops a digital decentralized adaptive tracker based on the optimal analog control and prediction-based digital redesign technique for the sampled-data large-scale coupling system. In order to enhance the tracking performance of the digital tracker at specified sampling instants, we apply the iterative learning control (ILC) to train the control input via continual learning. As a result, the proposed iterative learning-based decentralized adaptive tracker not only has robust closed-loop decoupled property but also possesses good tracking performance at both transient and steady state. Besides, evolutionary programming is applied to search for a good learning gain to speed up the learning process of ILC. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
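A minimal P-type ILC update of the form u_{k+1}(t) = u_k(t) + L*e_k(t+1), applied to a toy first-order plant rather than the paper's decentralized large-scale system, shows how the trial-to-trial learning step reduces tracking error; the plant, learning gain and reference below are illustrative.

```python
import numpy as np

# P-type iterative learning control on a toy first-order discrete plant
# y[t+1] = 0.8*y[t] + 0.5*u[t].
N, L = 50, 1.2                                   # trial length, learning gain
ref = np.sin(np.linspace(0, 2 * np.pi, N + 1))   # desired trajectory

def run_trial(u):
    y = np.zeros(N + 1)
    for t in range(N):
        y[t + 1] = 0.8 * y[t] + 0.5 * u[t]
    return y

u = np.zeros(N)
for k in range(15):                  # learning iterations
    y = run_trial(u)
    e = ref - y                      # tracking error over the whole trial
    u = u + L * e[1:]                # P-type ILC update
    print(f"iteration {k:2d}: max |error| = {np.max(np.abs(e[1:])):.4f}")
```

For this plant the update is contractive because |1 - L*b| = |1 - 1.2*0.5| < 1, so the error shrinks from trial to trial; choosing such a gain is the role the evolutionary search plays in the paper.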
NASA Technical Reports Server (NTRS)
Choi, Yonghoon; Vay, Stephanie A.; Woo, Jung-Hun; Choi, Kichul; Diskin, Glenn S.; Sachse, G. W.; Vadrevu, Krishna P.; Czech, E.
2009-01-01
Regional-scale measurements were made over the eastern United States (Intercontinental Chemical Transport Experiment - North America (INTEX-NA), summer 2004); Mexico (Megacity Initiative: Local and Global Research Observations (MILAGRO), March 2006); the eastern North Pacific and Alaska (INTEX-B, May 2006); and the Canadian Arctic (Arctic Research of the Composition of the Troposphere from Aircraft and Satellites (ARCTAS), spring and summer 2008). For these field campaigns, instrumentation for the in situ measurement of CO2 was integrated on the NASA DC-8 research aircraft providing high-resolution (1 second) data traceable to the WMO CO2 mole fraction scale. These observations provide unique and definitive data sets via their intermediate-scale coverage and frequent vertical profiles (0.1 - 12 km) for examining the variability CO2 exhibits above the Earth's surface. A bottom-up anthropogenic CO2 emissions inventory (1° × 1°) and processing methodology has also been developed for North America in support of these airborne science missions. In this presentation, the spatio-temporal distributions of CO2 and CO column values derived from the campaign measurements will be examined in conjunction with the emissions inventory and transport histories to aid in the interpretation of the CO2 observations.
Mild cognitive impairment: historical development and summary of research
Golomb, James; Kluger, Alan; Ferris, Steven H
2004-01-01
This review article broadly traces the historical development, diagnostic criteria, clinical and neuropathological characteristics, and treatment strategies related to mild cognitive impairment (MCI). The concept of MCI is considered in the context of other terms that have been developed to characterize the elderly with varying degrees of cognitive impairment. Criteria based on clinical global scale ratings, cognitive test performance, and performance on other domains of functioning are discussed. Approaches employing clinical, neuropsychological, neuroimaging, biological, and molecular genetic methodology used in the validation of MCI are considered, including results from cross-sectional, longitudinal, and postmortem investigations. Results of recent drug treatment studies of MCI and related methodological issues are also addressed. PMID:22034453
Bio-markers: traceability in food safety issues.
Raspor, Peter
2005-01-01
Research and practice are focusing on development, validation and harmonization of technologies and methodologies to ensure a complete traceability process throughout the food chain. The main goals are: scale-up, implementation and validation of methods in whole food chains, assurance of authenticity, validity of labelling and application of HACCP (hazard analysis and critical control point) to the entire food chain. The aim of the current review is to summarize the scientific and technological basis for ensuring complete traceability. Tracing and tracking (traceability) of foods are complex processes due to the (bio)markers, technical solutions and different circumstances in the different technologies which produce various foods (processed, semi-processed, or raw). Since the food is produced for human or animal consumption, we need suitable markers that are stable and traceable all along the production chain. Specific biomarkers can have a function in technology and in nutrition. Such an approach would make this development faster and more comprehensive, and would make it possible for food effects to be monitored with the same set of biomarkers in the consumer. This would help to develop and implement food safety standards based on the real physiological function of particular food components.
Quality Assessment of Physical and Organoleptic Instant Corn Rice on Scale-Up Process
NASA Astrophysics Data System (ADS)
Kumalasari, R.; Ekafitri, R.; Indrianti, N.
2017-12-01
Development of an instant corn rice product has been successfully conducted on a laboratory scale. Corn has a high carbohydrate content but is low in fiber. The addition of fiber to instant corn rice is intended to improve the functionality of the product and replace fiber lost during processing. A scale-up process for instant corn rice is required to increase production capacity. Scale-up is the process of obtaining identical output on a larger scale based on a predetermined production scale. This study aimed to assess the changes and differences in the quality of instant corn rice during scale-up. Scale-up of instant corn rice was done at production capacities of 3 kg, 4 kg and 5 kg. Results showed that the scaled-up instant corn rice had rehydration ratios between 514% and 570%, absorption rates between 414% and 470%, swelling rates between 119% and 134%, bulk densities from 0.3661 to 0.4745 g/ml, and porosities between 30% and 37%. The physical quality of instant corn rice on scale-up was consistent with that at laboratory scale for swelling rate, rehydration ratio, and absorption rate, but not for bulk density and porosity. Organoleptic qualities were stable at the increased scale compared to the laboratory scale. Bulk density was higher than at laboratory scale, and porosity was lower than at laboratory scale.
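The physical-quality metrics reported above follow common definitions; the short sketch below computes them from raw measurements under those assumed definitions (the sample numbers are hypothetical, not the study's data):

    # Minimal sketch of the physical-quality metrics, assuming standard definitions.
    # Values below are hypothetical, not the study's measurements.

    def rehydration_ratio(dry_mass_g, rehydrated_mass_g):
        """Rehydrated mass relative to dry mass, in percent."""
        return 100.0 * rehydrated_mass_g / dry_mass_g

    def water_absorption(dry_mass_g, rehydrated_mass_g):
        """Water taken up relative to dry mass, in percent."""
        return 100.0 * (rehydrated_mass_g - dry_mass_g) / dry_mass_g

    def bulk_density(mass_g, bulk_volume_ml):
        return mass_g / bulk_volume_ml          # g/ml

    def porosity(bulk_density_g_ml, true_density_g_ml):
        """Void fraction of the grain bed, in percent."""
        return 100.0 * (1.0 - bulk_density_g_ml / true_density_g_ml)

    print(rehydration_ratio(10.0, 54.0))        # ~540%, within the reported range
    print(bulk_density(40.0, 100.0))            # 0.40 g/ml
    print(porosity(0.40, 0.60))                 # ~33%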
Agrawal, Anjali M; Dudhedia, Mayur S; Zimny, Ewa
2016-02-01
The objective of the study was to develop an amorphous solid dispersion (ASD) for an insoluble compound X by a hot melt extrusion (HME) process. The focus was to identify material-sparing approaches to develop a bioavailable and stable ASD, including scale-up of the HME process using minimal drug. Mixtures of compound X and polymers with and without surfactants or pH modifiers were evaluated by hot stage microscopy (HSM), polarized light microscopy (PLM), and modulated differential scanning calorimetry (mDSC), which enabled systematic selection of ASD components. Formulation blends of compound X with PVP K12 and PVP VA64 polymers were extruded through a 9-mm twin screw mini-extruder. Physical characterization of extrudates by PLM, XRPD, and mDSC indicated formation of single-phase ASDs. Accelerated stability testing was performed that allowed rapid selection of stable ASDs and suitable packaging configurations. Dissolution testing by a discriminating two-step non-sink dissolution method showed 70-80% drug release from prototype ASDs, which was around twofold higher compared to crystalline tablet formulations. The in vivo pharmacokinetic study in dogs showed that bioavailability from the ASD of compound X with PVP VA64 was four times higher compared to crystalline tablet formulations. The HME process was scaled up from lab scale to clinical scale using a volumetric scale-up approach and a scale-independent specific energy parameter. The present study demonstrated systematic development of an ASD dosage form and scale-up of the HME process to clinical scale using minimal drug (∼500 g), which allowed successful clinical batch manufacture of the enabled formulation within 7 months.
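For context, a widely used definition of the scale-independent parameter mentioned above is the specific mechanical energy, SME = 2π·N·τ/ṁ (screw speed N, torque τ, feed rate ṁ). The sketch below holds SME constant across scales under that assumed definition; the operating numbers are placeholders, not the study's process parameters:

    import math

    # Hedged sketch: keep specific mechanical energy (SME) constant across scales,
    # with SME = 2*pi*N*torque/feed_rate. Numbers are placeholders only.

    def sme_kj_per_kg(screw_speed_rpm, torque_nm, feed_rate_kg_h):
        n_rev_s = screw_speed_rpm / 60.0
        m_dot_kg_s = feed_rate_kg_h / 3600.0
        return 2.0 * math.pi * n_rev_s * torque_nm / m_dot_kg_s / 1000.0  # kJ/kg

    # Lab scale (9-mm extruder, hypothetical operating point)
    sme_lab = sme_kj_per_kg(screw_speed_rpm=200, torque_nm=4.0, feed_rate_kg_h=0.5)

    # Clinical scale: choose a feed rate that reproduces the lab-scale SME
    # at a chosen screw speed and measured torque.
    def feed_rate_for_target_sme(target_sme_kj_kg, screw_speed_rpm, torque_nm):
        n_rev_s = screw_speed_rpm / 60.0
        m_dot_kg_s = 2.0 * math.pi * n_rev_s * torque_nm / (target_sme_kj_kg * 1000.0)
        return m_dot_kg_s * 3600.0          # kg/h

    print(sme_lab, feed_rate_for_target_sme(sme_lab, screw_speed_rpm=150, torque_nm=40.0))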
NASA Astrophysics Data System (ADS)
Ramanathan, Ramya; Guin, Arijit; Ritzi, Robert W.; Dominic, David F.; Freedman, Vicky L.; Scheibe, Timothy D.; Lunt, Ian A.
2010-04-01
A geometric-based simulation methodology was developed and incorporated into a computer code to model the hierarchical stratal architecture, and the corresponding spatial distribution of permeability, in braided channel belt deposits. The code creates digital models of these deposits as a three-dimensional cubic lattice, which can be used directly in numerical aquifer or reservoir models for fluid flow. The digital models have stratal units defined from the kilometer scale to the centimeter scale. These synthetic deposits are intended to be used as high-resolution base cases in various areas of computational research on multiscale flow and transport processes, including the testing of upscaling theories. The input parameters are primarily univariate statistics. These include the mean and variance for characteristic lengths of sedimentary unit types at each hierarchical level, and the mean and variance of log-permeability for unit types defined at only the lowest level (smallest scale) of the hierarchy. The code has been written for both serial and parallel execution. The methodology is described in part 1 of this paper. In part 2 (Guin et al., 2010), models generated by the code are presented and evaluated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ramanathan, Ramya; Guin, Arijit; Ritzi, Robert W.
A geometric-based simulation methodology was developed and incorporated into a computer code to model the hierarchical stratal architecture, and the corresponding spatial distribution of permeability, in braided channel belt deposits. The code creates digital models of these deposits as a three-dimensional cubic lattice, which can be used directly in numerical aquifer or reservoir models for fluid flow. The digital models have stratal units defined from the km scale to the cm scale. These synthetic deposits are intended to be used as high-resolution base cases in various areas of computational research on multiscale flow and transport processes, including the testing of upscaling theories. The input parameters are primarily univariate statistics. These include the mean and variance for characteristic lengths of sedimentary unit types at each hierarchical level, and the mean and variance of log-permeability for unit types defined at only the lowest level (smallest scale) of the hierarchy. The code has been written for both serial and parallel execution. The methodology is described in Part 1 of this series. In Part 2, models generated by the code are presented and evaluated.
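As a rough illustration of the kind of digital model described in these two entries (a three-dimensional cubic lattice of unit types, with log-permeability assigned only to unit types at the lowest hierarchical level), a minimal sketch follows; the unit-type statistics are hypothetical and the facies geometry is random rather than hierarchical, so this is not the authors' code:

    import numpy as np

    # Illustrative sketch: a 3-D cubic lattice of unit types with log-permeability
    # drawn per unit type at the lowest hierarchical level.
    rng = np.random.default_rng(0)
    nx, ny, nz = 100, 100, 20                 # lattice dimensions (cells)

    # Hypothetical lowest-level unit types with (mean, variance) of ln(k), k in m^2
    unit_stats = {0: (-27.0, 1.0),            # e.g. open-framework gravel
                  1: (-30.0, 0.5)}            # e.g. sandy matrix

    # Stand-in for the geometric/hierarchical facies model: random unit-type labels.
    labels = rng.integers(0, len(unit_stats), size=(nx, ny, nz))

    ln_k = np.empty_like(labels, dtype=float)
    for unit, (mu, var) in unit_stats.items():
        mask = labels == unit
        ln_k[mask] = rng.normal(mu, np.sqrt(var), size=mask.sum())

    k = np.exp(ln_k)                          # permeability field ready for a flow model
    print(k.shape, k.min(), k.max())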
Noh, Hyeonseok; Kwon, Seungwon; Cho, Seung-Yeon; Jung, Woo-Sang; Moon, Sang-Kwan; Park, Jung-Mi; Ko, Chang-Nam; Park, Seong-Uk
2017-10-01
This study aimed to examine the effectiveness and safety of acupuncture in the treatment of Parkinson's disease (PD). English, Chinese, and Korean electronic databases were searched up to June 2016. Randomized controlled trials (RCTs) were eligible. The methodological quality was assessed using Cochrane's risk of bias tool. Meta-analysis was performed using RevMan 5.3. In total, 42 studies involving 2625 participants were systematically reviewed. Participants treated using combined acupuncture and conventional medication (CM) showed significant improvements in total Unified PD Rating Scale (UPDRS), UPDRS I, UPDRS II, UPDRS III, and the Webster scale compared to those treated using CM alone. The combination of electroacupuncture and CM was significantly superior to CM alone in total UPDRS, UPDRS I, UPDRS II, and UPDRS IV. Similarly, the combination of scalp electroacupuncture, acupuncture, and CM was significantly more effective than CM alone in total UPDRS. However, our meta-analysis showed that the combination of electroacupuncture and CM was not significantly more effective than CM alone in UPDRS III, the Webster, and the Tension Assessment Scale. The results also failed to show that acupuncture was significantly more effective than placebo acupuncture in total UPDRS. Overall, the methodological quality of the RCTs was low. No serious adverse events were reported. We found that acupuncture might be a safe and useful adjunctive treatment for patients with PD. However, because of methodological flaws in the included studies, conclusive evidence is still lacking. More rigorous and well-designed placebo-controlled trials should be conducted. Copyright © 2017 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Rutledge, Stacey; Cohen-Vogel, Lora; Osborne-Lampkin, La'Tara
2012-01-01
The National Center on Scaling up Effective Schools (NCSU) is a five-year project working to develop, implement, and test new processes to scale up effective practices in high schools that districts will be able to apply within the context of their own unique goals and circumstances. This report describes the activities and findings of the first…
Reference Model 5 (RM5): Oscillating Surge Wave Energy Converter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yu, Y. H.; Jenne, D. S.; Thresher, R.
This report is an addendum to SAND2013-9040: Methodology for Design and Economic Analysis of Marine Energy Conversion (MEC) Technologies. This report describes an Oscillating Surge Wave Energy Converter (OSWEC) reference model design in a complementary manner to Reference Models 1-4 contained in the above report. A conceptual design for a taut moored oscillating surge wave energy converter was developed. The design had an annual electrical power of 108 kilowatts (kW), rated power of 360 kW, and intended deployment at water depths between 50 m and 100 m. The study includes structural analysis, power output estimation, a hydraulic power conversion chain system, and mooring designs. The results were used to estimate device capital cost and annual operation and maintenance costs. The device performance and costs were used for the economic analysis, following the methodology presented in SAND2013-9040 that included costs for designing, manufacturing, deploying, and operating commercial-scale MEC arrays up to 100 devices. The levelized cost of energy estimated for the Reference Model 5 OSWEC, presented in this report, was for a single device and arrays of 10, 50, and 100 units, and it enabled the economic analysis to account for cost reductions associated with economies of scale. The baseline commercial levelized cost of energy estimate for the Reference Model 5 device in an array comprised of 10 units is $1.44/kilowatt-hour (kWh), and the value drops to approximately $0.69/kWh for an array of 100 units.
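The levelized cost of energy quoted above is typically computed with a fixed-charge-rate formulation, LCOE = (FCR x CapEx + annual O&M) / annual energy production. The sketch below applies that formulation with placeholder costs (not the RM5 cost data) to show how array size drives the economy-of-scale effect:

    # Hedged sketch of a fixed-charge-rate LCOE calculation.
    # All numbers below are placeholders, not the RM5 cost or performance data.

    def lcoe_usd_per_kwh(capex_usd, fixed_charge_rate, om_usd_per_yr, aep_kwh_per_yr):
        return (fixed_charge_rate * capex_usd + om_usd_per_yr) / aep_kwh_per_yr

    # Single device: 108 kW average electrical power (from the abstract) at an
    # assumed 95% availability gives the annual energy production.
    aep = 108.0 * 8760.0 * 0.95               # kWh/yr per device

    for n_units, capex_per_unit, om_per_unit in [(10, 6.0e6, 3.0e5), (100, 4.0e6, 2.0e5)]:
        lcoe = lcoe_usd_per_kwh(n_units * capex_per_unit, 0.11,
                                n_units * om_per_unit, n_units * aep)
        print(n_units, "units:", round(lcoe, 2), "$/kWh")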
TEMPORAL VARIABILITY MEASUREMENT OF SPECIFIC VOLATILE ORGANIC COMPOUNDS
Methodology was developed to determine unambiguously trace levels of volatile organic compounds as they vary in concentration over a variety of time scales. This capability is important because volatile organic compounds (VOCs) are usually measured by time-integrative techniques th...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Judith C.
The purpose of this grant is to develop the multi-scale theoretical methods to describe the nanoscale oxidation of metal thin films, as the PI (Yang) has extensive previous experience in the experimental elucidation of the initial stages of Cu oxidation, primarily by in situ transmission electron microscopy methods. Through the use and development of computational tools at varying length (and time) scales, from atomistic quantum mechanical calculations and force-field mesoscale simulations to large-scale Kinetic Monte Carlo (KMC) modeling, the fundamental underpinnings of the initial stages of Cu oxidation have been elucidated. The development of computational modeling tools allows for accelerated materials discovery. The theoretical tools developed from this program impact a wide range of technologies that depend on surface reactions, including corrosion, catalysis, and nanomaterials fabrication.
NASA Astrophysics Data System (ADS)
Dai, Xiaoyu; Haussener, Sophia
2018-02-01
A multi-scale methodology for the radiative transfer analysis of heterogeneous media composed of morphologically-complex components on two distinct scales is presented. The methodology incorporates the exact morphology at the various scales and utilizes volume-averaging approaches with the corresponding effective properties to couple the scales. At the continuum level, the volume-averaged coupled radiative transfer equations are solved utilizing (i) effective radiative transport properties obtained by direct Monte Carlo simulations at the pore level, and (ii) averaged bulk material properties obtained at particle level by Lorenz-Mie theory or discrete dipole approximation calculations. This model is applied to a soot-contaminated snow layer, and is experimentally validated with reflectance measurements of such layers. A quantitative and decoupled understanding of the morphological effect on the radiative transport is achieved, and a significant influence of the dual-scale morphology on the macroscopic optical behavior is observed. Our results show that with a small amount of soot particles, of the order of 1 ppb in volume fraction, the reduction in reflectance of a snow layer with large ice grains can reach up to 77% (at a wavelength of 0.3 μm). Soot impurities modeled as compact agglomerates yield 2-3% lower reduction of the reflectance in a thick snow layer compared to snow with soot impurities modeled as chain-like agglomerates. Soot impurities modeled as equivalent spherical particles underestimate the reflectance reduction by 2-8%. This study implies that the morphology of the heterogeneities in a medium significantly affects the macroscopic optical behavior and, specifically for the soot-contaminated snow, indicates the non-negligible role of soot on the absorption behavior of snow layers. It can be equally used in technical applications for the assessment and optimization of optical performance in multi-scale media.
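For reference, the pore-level Monte Carlo and the volume-averaged formulation build on the standard radiative transfer equation; its generic (non-averaged) form, written here in the usual notation and not reproducing the paper's volume-averaged equations, is

\[ \hat{s}\cdot\nabla I_\lambda = -\left(\kappa_\lambda+\sigma_{s,\lambda}\right) I_\lambda + \kappa_\lambda I_{b,\lambda} + \frac{\sigma_{s,\lambda}}{4\pi}\int_{4\pi} I_\lambda(\hat{s}')\,\Phi_\lambda(\hat{s}',\hat{s})\,\mathrm{d}\Omega', \]

with absorption coefficient \(\kappa_\lambda\), scattering coefficient \(\sigma_{s,\lambda}\) and scattering phase function \(\Phi_\lambda\); the coupled multi-scale model carries effective versions of these properties for each phase and scale.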
Implementation of the Large-Scale Operations Management Test in the State of Washington.
1982-12-01
During FY 79, the U.S. Army Engineer Waterways Experiment Station (WES), Vicksburg, Miss., completed the first phase of its 3-year Large-Scale Operations Management Test (LSOMT). The LSOMT was designed to develop an operational plan to identify methodologies that can be implemented by the U.S. Army Engineer District, Seattle (NPS), to prevent the exotic aquatic macrophyte Eurasian watermilfoil (Myriophyllum spicatum L.) from reaching problem-level proportions in water bodies in the state of Washington. The WES developed specific plans as integral elements
Moscoso del Prado Martín, Fermín
2013-12-01
I introduce the Bayesian assessment of scaling (BAS), a simple but powerful Bayesian hypothesis contrast methodology that can be used to test hypotheses on the scaling regime exhibited by a sequence of behavioral data. Rather than comparing parametric models, as typically done in previous approaches, the BAS offers a direct, nonparametric way to test whether a time series exhibits fractal scaling. The BAS provides a simpler and faster test than do previous methods, and the code for making the required computations is provided. The method also enables testing of finely specified hypotheses on the scaling indices, something that was not possible with the previously available methods. I then present 4 simulation studies showing that the BAS methodology outperforms the other methods used in the psychological literature. I conclude with a discussion of methodological issues on fractal analyses in experimental psychology. PsycINFO Database Record (c) 2014 APA, all rights reserved.
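The BAS software itself accompanies the cited article; purely as an illustration of the quantity being tested, a power-law scaling exponent beta with S(f) proportional to f**(-beta), the sketch below uses a conventional log-log periodogram regression on a synthetic series (one of the non-Bayesian estimates that the BAS is contrasted with, not the Bayesian method itself):

    import numpy as np

    # Illustration only: estimate a spectral scaling exponent beta (S(f) ~ f**-beta)
    # by log-log regression of the periodogram, on a synthetic series.
    rng = np.random.default_rng(1)
    x = np.cumsum(rng.normal(size=4096))       # random walk, expected beta ~ 2

    x = x - x.mean()
    spec = np.abs(np.fft.rfft(x)) ** 2
    freq = np.fft.rfftfreq(x.size)

    mask = freq > 0                            # drop the zero-frequency bin
    slope, intercept = np.polyfit(np.log(freq[mask]), np.log(spec[mask]), 1)
    beta_hat = -slope
    print(round(beta_hat, 2))                  # close to 2 for a random walk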
Generic simulation of multi-element ladar scanner kinematics in USU LadarSIM
NASA Astrophysics Data System (ADS)
Omer, David; Call, Benjamin; Pack, Robert; Fullmer, Rees
2006-05-01
This paper presents a generic simulation model for a ladar scanner with up to three scan elements, each having a steering, stabilization and/or pattern-scanning role. Of interest is the development of algorithms that automatically generate commands to the scan elements given beam-steering objectives out of the ladar aperture, and the base motion of the sensor platform. First, a straight-forward single-element body-fixed beam-steering methodology is presented. Then a unique multi-element redirective and reflective space-fixed beam-steering methodology is explained. It is shown that standard direction cosine matrix decomposition methods fail when using two orthogonal, space-fixed rotations, thus demanding the development of a new algorithm for beam steering. Finally, a related steering control methodology is presented that uses two separate optical elements mathematically combined to determine the necessary scan element commands. Limits, restrictions, and results on this methodology are presented.
ERIC Educational Resources Information Center
Carta, Jungbauer
2011-01-01
We describe an intensive course that integrates graduate and continuing education focused on the development and scale-up of chromatography processes used for the recovery and purification of proteins with special emphasis on biotherapeutics. The course includes lectures, laboratories, teamwork, and a design exercise and offers a complete view of…
STEM_CELL: a software tool for electron microscopy: part 2--analysis of crystalline materials.
Grillo, Vincenzo; Rossi, Francesca
2013-02-01
A new graphical software package (STEM_CELL) for analysis of HRTEM and STEM-HAADF images is here introduced in detail. The advantage of the software, beyond its graphic interface, is to bring together different analysis algorithms and simulation (described in an associated article) to produce novel analysis methodologies. Different implementations of, and improvements to, state-of-the-art approaches are reported for image analysis, filtering, normalization, and background subtraction. In particular, two important methodological results are highlighted here: (i) the definition of a procedure for atomic-scale quantitative analysis of HAADF images, (ii) the extension of geometric phase analysis to large regions, up to potentially 1 μm, through the use of undersampled images with aliasing effects. Copyright © 2012 Elsevier B.V. All rights reserved.
Parallel Unsteady Overset Mesh Methodology for a Multi-Solver Paradigm with Adaptive Cartesian Grids
2008-08-21
Good linear scalability was observed for all three cases up to 12 processors; beyond that, the scalability drops off depending on grid...
NASA Technical Reports Server (NTRS)
Madrid, G. A.; Westmoreland, P. T.
1983-01-01
A progress report is presented on a program to upgrade the existing NASA Deep Space Network in terms of a redesigned computer-controlled data acquisition system for channelling tracking, telemetry, and command data between a California-based control center and three signal processing centers in Australia, California, and Spain. The methodology for the improvements is oriented towards single subsystem development with consideration for a multi-system and multi-subsystem network of operational software. Details of the existing hardware configurations and data transmission links are provided. The program methodology includes data flow design, interface design and coordination, incremental capability availability, increased inter-subsystem developmental synthesis and testing, system and network level synthesis and testing, and system verification and validation. The software is thus far approximately 65 percent complete, and the methodology used to effect the changes, which will permit enhanced tracking of and communication with spacecraft, has proven effective.
Analysis of Turbulent Boundary-Layer over Rough Surfaces with Application to Projectile Aerodynamics
1988-12-01
Prediction methods can be classified into two main approaches: 1) correlation methodologies ... The new correlation can be used in engineering component build-up methodologies for drag ...
Griebeler, Eva Maria; Klein, Nicole; Sander, P. Martin
2013-01-01
Information on aging, maturation, and growth is important for understanding life histories of organisms. In extinct dinosaurs, such information can be derived from the histological growth record preserved in the mid-shaft cortex of long bones. Here, we construct growth models to estimate ages at death, ages at sexual maturity, ages at which individuals were fully-grown, and maximum growth rates from the growth record preserved in long bones of six sauropod dinosaur individuals (one indeterminate mamenchisaurid, two Apatosaurus sp., two indeterminate diplodocids, and one Camarasaurus sp.) and one basal sauropodomorph dinosaur individual (Plateosaurus engelhardti). Using these estimates, we establish allometries between body mass and each of these traits and compare these to extant taxa. Growth models considered for each dinosaur individual were the von Bertalanffy model, the Gompertz model, and the logistic model (LGM), all of which have inherently fixed inflection points, and the Chapman-Richards model in which the point is not fixed. We use the arithmetic mean of the age at the inflection point and of the age at which 90% of asymptotic mass is reached to assess respectively the age at sexual maturity or the age at onset of reproduction, because unambiguous indicators of maturity in Sauropodomorpha are lacking. According to an AIC-based model selection process, the LGM was the best model for our sauropodomorph sample. Allometries established are consistent with literature data on other Sauropodomorpha. All Sauropodomorpha reached full size within a time span similar to scaled-up modern mammalian megaherbivores and had similar maximum growth rates to scaled-up modern megaherbivores and ratites, but growth rates of Sauropodomorpha were lower than of an average mammal. Sauropodomorph ages at death probably were lower than that of average scaled-up ratites and megaherbivores. Sauropodomorpha were older at maturation than scaled-up ratites and average mammals, but younger than scaled-up megaherbivores. PMID:23840575
An automated methodology development. [software design for combat simulation
NASA Technical Reports Server (NTRS)
Hawley, L. R.
1985-01-01
The design methodology employed in testing the applicability of Ada in large-scale combat simulations is described. Ada was considered as a substitute for FORTRAN to lower life cycle costs and ease the program development efforts. An object-oriented approach was taken, which featured definitions of military targets, the capability of manipulating their condition in real-time, and one-to-one correlation between the object states and real world states. The simulation design process was automated by the problem statement language (PSL)/problem statement analyzer (PSA). The PSL/PSA system accessed the problem data base directly to enhance the code efficiency by, e.g., eliminating non-used subroutines, and provided for automated report generation, besides allowing for functional and interface descriptions. The ways in which the methodology satisfied the responsiveness, reliability, transportability, modifiability, timeliness and efficiency goals are discussed.
Applying scrum methods to ITS projects.
DOT National Transportation Integrated Search
2017-08-01
The introduction of new technology generally brings new challenges and new methods to help with deployments. Agile methodologies have been introduced in the information technology industry to potentially speed up development. The Federal Highway Admi...
Scaling ethics up and down: moral craft in clinical genetics and in global health research
Parker, Michael
2015-01-01
This paper engages with the question of what it is to ‘do good medical ethics’ in two ways. It begins with an exploration of what it might mean to say that health professionals practise good medical ethics as part of practising good ethical medicine. Using the example of the Genethics Club, a well-established national ethics forum for genetics professionals in the UK, the paper develops an account of moral craftsmanship grounded in the concepts of shared moral commitments and practices, moral work, ethics and living morality. In the light of this discussion, the paper goes on to consider what it might mean for a specialist in medical ethics, a bioethicist, to do good medical ethics. Finally, a research agenda focusing on the challenges of thinking about good medical ethics in a global context and a proposal for an innovative approach to bioethics methodology is outlined. PMID:25516955
SBROME: a scalable optimization and module matching framework for automated biosystems design.
Huynh, Linh; Tsoukalas, Athanasios; Köppe, Matthias; Tagkopoulos, Ilias
2013-05-17
The development of a scalable framework for biodesign automation is a formidable challenge given the expected increase in part availability and the ever-growing complexity of synthetic circuits. To allow for (a) the use of previously constructed and characterized circuits or modules and (b) the implementation of designs that can scale up to hundreds of nodes, we here propose a divide-and-conquer Synthetic Biology Reusable Optimization Methodology (SBROME). An abstract user-defined circuit is first transformed and matched against a module database that incorporates circuits that have previously been experimentally characterized. Then the resulting circuit is decomposed to subcircuits that are populated with the set of parts that best approximate the desired function. Finally, all subcircuits are subsequently characterized and deposited back to the module database for future reuse. We successfully applied SBROME toward two alternative designs of a modular 3-input multiplexer that utilize pre-existing logic gates and characterized biological parts.
CARES/Life Used for Probabilistic Characterization of MEMS Pressure Sensor Membranes
NASA Technical Reports Server (NTRS)
Nemeth, Noel N.
2002-01-01
Microelectromechanical systems (MEMS) devices are typically made from brittle materials such as silicon using traditional semiconductor manufacturing techniques. They can be etched (or micromachined) from larger structures or can be built up with material deposition processes. Maintaining dimensional control and consistent mechanical properties is considerably more difficult for MEMS because feature size is on the micrometer scale. Therefore, the application of probabilistic design methodology becomes necessary for MEMS. This was demonstrated at the NASA Glenn Research Center and Case Western Reserve University in an investigation that used the NASA-developed CARES/Life brittle material design program to study the probabilistic fracture strength behavior of single-crystal SiC, polycrystalline SiC, and amorphous Si3N4 pressurized 1-mm-square thin-film diaphragms. These materials are of interest because of their superior high-temperature characteristics, which are desirable for harsh environment applications such as turbine engine and rocket propulsion system hot sections.
Multiscale design and life-cycle based sustainability assessment of polymer nanocomposite coatings
NASA Astrophysics Data System (ADS)
Uttarwar, Rohan G.
In recent years, nanocoatings with exceptionally improved and new performance properties have found numerous applications in the automotive, aerospace, ship-making, chemical, electronics, steel, construction, and many other industries. In particular, formulations providing multiple functionalities to cured paint films are believed to dominate the coatings market in the near future. This has shifted the focus of research towards building sustainable coating recipes which can deliver multiple functionalities through applied films. The challenge in this exciting area of research arises from insufficient knowledge of the structure-property correlations of nanocoating materials and their design complexity. Experimental efforts have been successful in developing certain types of nanopaints exhibiting improved properties. However, multifunctional nanopaint design optimality is extremely difficult, if not impossible, to address solely through experiments. In addition, the environmental implications and societal risks associated with this growing field of nanotechnology raise several questions related to its sustainable development. This research focuses on the study of a multiscale sustainable nanocoating design with applications spanning novel function envisioning and idea refinement, knowledge discovery and design solution derivation, and performance testing in industrial applications. The nanocoating design is studied using computational simulations of nano- to macro-scale models and a sustainability assessment over the life-cycle. Computational simulations aim at integrating top-down, goals/means, inductive systems engineering and bottom-up, cause and effect, deductive systems engineering approaches for material development. The in-silico paint resin system is a water-dispersible acrylic polymer with hydrophilic nanoparticles incorporated into it. The nano-scale atomistic and micro-scale coarse-grained (CG) level simulations are performed using molecular dynamics methodology to study several structural and morphological features, such as the effects of polymer molecular weight, polydispersity, rheology, nanoparticle volume fraction, size, shape and chemical nature on the bulk mechanical and self-cleaning properties of the coating film. At the macro-scale, a paint spray system used for automotive coating application is studied using a CFD-based simulation methodology to generate crucial information about the effects of nanocoating technology on environmental emissions and coating film quality. The cradle-to-grave life-cycle based sustainability assessment study addresses all the critical issues related to economic benefits, environmental implications and societal effects of nanocoating technology through case studies of automotive coating systems. It is accomplished by identifying crucial correlations among measurable parameters at different stages and developing sustainability indicator matrices for analysis of each stage of the life-cycle. The findings from this research have great potential to inform the future development of coating systems with novel functionalities and improved sustainability.
K. L. Frank; L. S. Kalkstein; B. W. Geils; H. W. Thistle
2008-01-01
This study developed a methodology to temporally classify large scale, upper level atmospheric conditions over North America, utilizing a newly-developed upper level synoptic classification (ULSC). Four meteorological variables: geopotential height, specific humidity, and u- and v-wind components, at the 500 hPa level over North America were obtained from the NCEP/NCAR...
Quantifying pediatric neuro-oncology risk factors: development of the neurological predictor scale.
Micklewright, Jackie L; King, Tricia Z; Morris, Robin D; Krawiecki, Nicolas
2008-04-01
Pediatric neuro-oncology researchers face methodological challenges associated with quantifying the influence of tumor and treatment-related risk factors on child outcomes. The Neurological Predictor Scale was developed to serve as a cumulative index of a child's exposure to risk factors. The clinical utility of the Neurological Predictor Scale was explored in a sample of 25 children with heterogeneous brain tumors. Consistent with expectation, a series of regression analyses demonstrated that the Neurological Predictor Scale significantly predicted composite intellectual functioning (r(2) = 0.21, p < .05), short-term memory (r(2) = 0.16, p = .05), and abstract visual reasoning abilities (r(2) = 0.28, p < .05). With the exception of chemotherapy, the Neurological Predictor Scale accounted for a significant amount of the variance in child intellectual functioning above and beyond individually examined variables. The Neurological Predictor Scale can be used to quickly quantify the cumulative risk factors associated with pediatric brain tumor diagnoses.
Staveteig, Sarah; Aryeetey, Richmond; Anie-Ansah, Michael; Ahiadeke, Clement; Ortiz, Ladys
2017-01-01
The intended meaning behind responses to standard questions posed in large-scale health surveys are not always well understood. Systematic follow-up studies, particularly those which pose a few repeated questions followed by open-ended discussions, are well positioned to gauge stability and consistency of data and to shed light on the intended meaning behind survey responses. Such follow-up studies require extensive coordination and face challenges in protecting respondent confidentiality during the process of recontacting and reinterviewing participants. We describe practical field strategies for undertaking a mixed methods follow-up study during a large-scale health survey. The study was designed as a mixed methods follow-up study embedded within the 2014 Ghana Demographic and Health Survey (GDHS). The study was implemented in 13 clusters. Android tablets were used to import reference data from the parent survey and to administer the questionnaire, which asked a mixture of closed- and open-ended questions on reproductive intentions, decision-making, and family planning. Despite a number of obstacles related to recontacting respondents and concern about respondent fatigue, over 92 percent of the selected sub-sample were successfully recontacted and reinterviewed; all consented to audio recording. A confidential linkage between GDHS data, follow-up tablet data, and audio transcripts was successfully created for the purpose of analysis. We summarize the challenges in follow-up study design, including ethical considerations, sample size, auditing, filtering, successful use of tablets, and share lessons learned for future such follow-up surveys.
Carta, Mauro Giovanni; Moro, Daniela; Wallet Oumar, Fadimata; Moro, Maria Francesca; Pintus, Mirra; Pintus, Elisa; Minerba, Luigi; Sancassiani, Federica; Pascolo-Fabrici, Elisabetta; Preti, Antonio; Bhugra, Dinesh Kumar
2018-01-01
The aim of this study was to carry out a 2-year follow-up of refugees in a camp in Burkina Faso who had been interviewed previously. We also aimed to verify whether the general conditions in which they lived (e.g., protection by international organizations and the conclusion of negotiations and new hope of returning to Mali and reunification with surviving family members) would affect their mental health state. This is a cross-sectional study repeated over time on a cohort of refugees. People living in the Subgandé camp who had participated in the first survey in 2012 were identified using informational chains and approached for follow-up. Those who agreed were interviewed using the Short Screening Scale for post-traumatic stress disorder (PTSD) and the K6 scale, French versions, to measure general psychopathology and the level of impairment. The second survey shows a dramatic decrease in psychopathological symptoms (positivity at K6 scale). Improvement was also conspicuous in the frequency of people with stress symptoms (positivity at Short Screening Scale for PTSD and simultaneous positivity to K6 scale). The frequency of people screened positive at the Short Screening Scale for PTSD had also decreased, but the level of improvement was not pronounced. Our findings confirm that when physical conditions improve, psychological symptoms can also improve. Although in the studied sample psychological factors, such as the hope of returning to their own land and thus the possibility of maintaining ethnic cohesion, may have played a role, future research carried out with a proper methodology and sufficient resources to identify protective factors is needed.
The ECCO Family of State Estimates: An Overview
NASA Astrophysics Data System (ADS)
Wunsch, C.
2008-12-01
The idea of ECCO (Estimating the Circulation and Climate of the Ocean) originated in the middle 1980s, when it became apparent that a global oceanographic observing system for the general circulation would become a reality, as it did through the World Ocean Circulation Experiment. Observational design involved extremely diverse technologies and oceanic flow regimes. To be physically interpretable, these diverse data and physical processes would need to be combined into a useful, coherent whole. Such a synthesis can only be done with a skillful GCM having useful resolution. ECCO originated as an experiment to demonstrate the technical feasibility of such a synthesis and to determine if any of several possible methods was preferable. In contrast to a number of other superficially similar efforts, mainly derived from weather forecasting methods, the ECCO goal was to estimate the long-term circulation mean and its variability on climate (decadal and longer) time scales in a form exactly satisfying known equations of motion. ECCO was made feasible with the simultaneous construction of a new GCM (MIT) along with the development of an automatic differentiation (AD) software tool (now called TAF) which rendered practical the method of Lagrange multipliers (called the adjoint method in oceanography). Parallel developments of simplified sequential methods (smoothers) provided an alternative, also practical, methodology. One can now use the existing (publicly available) machinery to discuss the ocean circulation and its variability. The huge variety of issues connected with the global circulation has meant that an entire family of estimates has grown up, each having different emphases (some primarily global, some primarily regional---the tropics, the Southern Ocean; some focussed on physics---the role of eddies or sea ice). The methodology leads, usefully, to intense scrutiny of data and model errors and spatio-temporal coverage. As with any estimation problem, no uniquely 'correct' solution is now or ever going to be possible---only evolving best estimates. Further development of these and similar methodologies appears to be a necessary, inevitable, and growing component of oceanography and climate.
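In schematic form, the estimation problem referred to above (the adjoint method) minimizes a weighted least-squares misfit subject to the model dynamics appended with Lagrange multipliers; a generic statement, not ECCO's specific implementation, is

\[ J=\sum_{t}\big(\mathbf{y}_t-\mathbf{E}_t\mathbf{x}_t\big)^{\mathsf T}\mathbf{R}_t^{-1}\big(\mathbf{y}_t-\mathbf{E}_t\mathbf{x}_t\big)+\sum_{t}\mathbf{u}_t^{\mathsf T}\mathbf{Q}_t^{-1}\mathbf{u}_t, \qquad \text{subject to}\quad \mathbf{x}_{t+1}=\mathcal{L}\!\left(\mathbf{x}_t,\mathbf{u}_t\right), \]

where \(\mathbf{x}_t\) is the model state, \(\mathbf{y}_t\) the observations with error covariance \(\mathbf{R}_t\), \(\mathbf{E}_t\) the observation operator, and \(\mathbf{u}_t\) the control adjustments (initial conditions, forcing, mixing parameters) with prior covariance \(\mathbf{Q}_t\); the AD-generated adjoint supplies the gradient of \(J\) used in the iterative minimization.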
Bioavailability of indomethacin-saccharin cocrystals.
Jung, Min-Sook; Kim, Jeong-Soo; Kim, Min-Soo; Alhalaweh, Amjad; Cho, Wonkyung; Hwang, Sung-Joo; Velaga, Sitaram P
2010-11-01
Pharmaceutical cocrystals are new solid forms with physicochemical properties that appear promising for drug product development. However, the in-vivo bioavailability of cocrystals has rarely been addressed. The cocrystal of indomethacin (IND), a Biopharmaceutical Classification System class II drug, with saccharin (SAC) has been shown to have higher solubility than IND at all pH. In this study, we aimed to evaluate the in-vitro dissolution and in-vivo bioavailability of IND-SAC cocrystals in comparison with IND in a physical mixture and the marketed product Indomee. Scale-up of the cocrystals was undertaken using cooling batch crystallisation without seeding. The chemical and physical purity of the up-scaled material was verified using high-performance liquid chromatography, differential scanning calorimetry and powder X-ray diffraction. The IND-SAC cocrystals and IND plus SAC were mixed with lactose and the formulations were placed into gelatin capsules. In-vitro dissolution studies were then performed using the rotating basket dissolution method. The intrinsic dissolution rate of IND and IND-SAC cocrystals was also determined. Finally, a bioavailability study for the formulations was conducted in beagle dogs. The plasma samples were analysed using high-performance liquid chromatography and the pharmacokinetic data were analysed using standard methodologies. The bulk cocrystals (i.e. scaled-up material) were chemically and physically pure. The in-vitro dissolution rate of the cocrystals was higher than that of IND and similar to that of Indomee at pH 7.4 and pH 1.2. The in-vivo bioavailability of the IND-SAC cocrystals in dogs was significantly higher (ANOVA, P<0.05) than that of IND but not significantly different from Indomee (ANOVA, P>0.05). The study indicates that the improved aqueous solubility of the cocrystals leads to improved bioavailability of IND. Thus, the cocrystals are a viable alternative solid form that can improve the dissolution rate and bioavailability of poorly soluble drugs. © 2010 The Authors. JPP © 2010 Royal Pharmaceutical Society of Great Britain.
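The pharmacokinetic comparison above rests on standard non-compartmental quantities; as a hedged illustration, the sketch below computes trapezoidal AUCs and a relative-bioavailability ratio from hypothetical concentration-time data (not the study's dog data):

    import numpy as np

    # Sketch of a standard non-compartmental comparison: AUC by the linear
    # trapezoidal rule and relative bioavailability AUC_test / AUC_reference
    # (dose-normalised if doses differ). Concentration-time values are hypothetical.
    t = np.array([0, 0.5, 1, 2, 4, 8, 12, 24])                        # h
    c_cocrystal = np.array([0, 1.8, 2.6, 2.1, 1.3, 0.6, 0.3, 0.05])   # ug/ml
    c_crystalline = np.array([0, 0.4, 0.7, 0.6, 0.4, 0.2, 0.1, 0.02])

    def auc_trapezoid(t, c):
        """Area under the concentration-time curve by the linear trapezoidal rule."""
        return float(np.sum(np.diff(t) * (c[:-1] + c[1:]) / 2.0))

    auc_test = auc_trapezoid(t, c_cocrystal)
    auc_ref = auc_trapezoid(t, c_crystalline)
    print(round(auc_test, 2), round(auc_ref, 2), round(auc_test / auc_ref, 2))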
Evaluating a collaborative IT based research and development project.
Khan, Zaheer; Ludlow, David; Caceres, Santiago
2013-10-01
In common with all projects, evaluating an Information Technology (IT) based research and development project is necessary in order to discover whether or not the outcomes of the project are successful. However, evaluating large-scale collaborative projects is especially difficult as: (i) stakeholders from different countries are involved who, almost inevitably, have diverse technological and/or application domain backgrounds and objectives; (ii) multiple and sometimes conflicting application specific and user-defined requirements exist; and (iii) multiple and often conflicting technological research and development objectives are apparent. In this paper, we share our experiences based on the large-scale integrated research project - The HUMBOLDT project - with project duration of 54 months, involving contributions from 27 partner organisations, plus 4 sub-contractors from 14 different European countries. In the HUMBOLDT project, a specific evaluation methodology was defined and utilised for the user evaluation of the project outcomes. The user evaluation performed on the HUMBOLDT Framework and its associated nine application scenarios from various application domains, resulted in not only an evaluation of the integrated project, but also revealed the benefits and disadvantages of the evaluation methodology. This paper presents the evaluation methodology, discusses in detail the process of applying it to the HUMBOLDT project and provides an in-depth analysis of the results, which can be usefully applied to other collaborative research projects in a variety of domains. Copyright © 2013 Elsevier Ltd. All rights reserved.
Conceptual Design of an APT Reusable Spaceplane
NASA Astrophysics Data System (ADS)
Corpino, S.; Viola, N.
This paper concerns the conceptual design of an Aerial Propellant Transfer reusable spaceplane carried out during our PhD course under the supervision of prof. Chiesa. The new conceptual design methodology employed to develop the APT concept and the main characteristics of the spaceplane itself will be presented and discussed. The methodology for conceptual design has been worked out during the last three years. It was originally conceived for atmospheric vehicle design but, thanks to its modular structure which makes it very flexible, it has been possible to convert it to space transportation systems design by adding and/or modifying a few modules. One of the major improvements has been, for example, the conception and development of the mission simulation and trajectory optimisation module. The main characteristics and innovations of the methodology are the latest geometric modelling techniques and the consideration of logistic, operational and cost aspects from the first stages of the project. Computer aided design techniques are used to obtain a better definition of the product at the end of the conceptual design phase, and virtual reality concepts are employed to visualise three-dimensional installation and operational aspects, at least in part replacing full-scale mock-ups. The introduction of parametric three-dimensional CAD software integrated into the conceptual design methodology represents a great improvement because it allows different layouts to be produced and assessed immediately. It is also possible to link the CAD system to digital prototyping software which combines 3D visualisation and assembly analysis, useful to define the so-called Digital Mock-Up at Conceptual Level (DMUCL), which studies the integration between the on-board systems, sized with simulation algorithms, and the airframe. DMUCL represents a very good means of integrating the conceptual design with a methodology addressing Reliability, Availability, Maintainability and Safety characteristics. Several applications of this conceptual design methodology have been carried out in order to validate it. Here we show one of the most challenging case studies: the APT73 spaceplane. Today the demand for access to space is increasing and fully reusable launch vehicles are likely to play a key role in future space activities, but up until now this kind of space system has not been successfully developed. The ideal reusable launcher should be a vehicle able to maintain physical integrity during its mission, to take off and land at any conventional airport, to be operated with minimum maintenance effort and to guarantee an adequate safety level. Thanks to its flexibility it should be able to enter the desired orbital plane and to abort its mission at any time in case of mishap. Moreover, considerable cost reduction could be expected only by having extremely high launch rates comparable to today's aircraft fleets in the commercial airline business. In our opinion the solution which better meets these specifications is the Aerial Propellant Transfer spaceplane concept, the so-called "one stage and a half" space vehicle, which takes off and climbs to meet a tanker aircraft for aerial refuelling and then, after disconnecting from the tanker, flies on to reach orbit. The APT73 has been designed to reach Low Earth Orbit to perform two kinds of mission: 1) to release payloads; 2) to be flown as a crew return vehicle from the ISS.
The concept emerged from a set of preliminary choices established at the beginning of the project. Possible variants to the basic plan were investigated and a trade-off analysis was carried out to obtain the optimum configuration. This paper provides a technical description of the APT73 and illustrates the design challenges encountered in the development of the project.
NASA Astrophysics Data System (ADS)
Mujumdar, Pradeep P.
2014-05-01
Climate change results in regional hydrologic change. The three prominent signals of global climate change, viz., increase in global average temperatures, rise in sea levels and change in precipitation patterns convert into signals of regional hydrologic change in terms of modifications in water availability, evaporative water demand, hydrologic extremes of floods and droughts, water quality, salinity intrusion in coastal aquifers, groundwater recharge and other related phenomena. A major research focus in hydrologic sciences in recent years has been assessment of impacts of climate change at regional scales. An important research issue addressed in this context deals with responses of water fluxes on a catchment scale to the global climatic change. A commonly adopted methodology for assessing the regional hydrologic impacts of climate change is to use the climate projections provided by the General Circulation Models (GCMs) for specified emission scenarios in conjunction with the process-based hydrologic models to generate the corresponding hydrologic projections. The scaling problem arising because of the large spatial scales at which the GCMs operate compared to those required in distributed hydrologic models, and their inability to satisfactorily simulate the variables of interest to hydrology are addressed by downscaling the GCM simulations to hydrologic scales. Projections obtained with this procedure are burdened with a large uncertainty introduced by the choice of GCMs and emission scenarios, small samples of historical data against which the models are calibrated, downscaling methods used and other sources. Development of methodologies to quantify and reduce such uncertainties is a current area of research in hydrology. In this presentation, an overview of recent research carried out by the author's group on assessment of hydrologic impacts of climate change addressing scale issues and quantification of uncertainties is provided. Methodologies developed with conditional random fields, Dempster-Shafer theory, possibility theory, imprecise probabilities and non-stationary extreme value theory are discussed. Specific applications on uncertainty quantification in impacts on streamflows, evaporative water demands, river water quality and urban flooding are presented. A brief discussion on detection and attribution of hydrologic change at river basin scales, contribution of landuse change and likely alterations in return levels of hydrologic extremes is also provided.
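One commonly used step in the downscaling chain described above is an empirical bias correction of GCM output against observations, for example by quantile mapping; the sketch below shows that idea on synthetic data (it is a generic illustration, not the specific downscaling models used in this work):

    import numpy as np

    # Minimal empirical quantile-mapping sketch, a common statistical downscaling /
    # bias-correction step. Data are synthetic and purely illustrative.
    rng = np.random.default_rng(2)
    obs_hist = rng.gamma(shape=2.0, scale=5.0, size=3000)     # observed precipitation
    gcm_hist = rng.gamma(shape=2.0, scale=3.0, size=3000)     # biased GCM, historical
    gcm_fut = rng.gamma(shape=2.0, scale=3.5, size=3000)      # GCM, future scenario

    def quantile_map(x, model_hist, observed_hist):
        """Map values of x through the model-historical CDF onto the observed CDF."""
        quantiles = np.searchsorted(np.sort(model_hist), x) / len(model_hist)
        quantiles = np.clip(quantiles, 0.0, 1.0)
        return np.quantile(observed_hist, quantiles)

    gcm_fut_corrected = quantile_map(gcm_fut, gcm_hist, obs_hist)
    print(gcm_fut.mean(), gcm_fut_corrected.mean())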
Information requirements and methodology for development of an EVA crewmember's heads up display
NASA Astrophysics Data System (ADS)
Petrek, J. S.
This paper presents a systematic approach for developing a Heads Up Display (HUD) to be used within the helmet of the Extra Vehicular Activity (EVA) crewmember. The information displayed on the EVA HUD will be analogous to EVA Flight Data File (FDF) information, which is an integral part of NASA's current Space Transportation System. Another objective is to determine information requirements and media techniques ultimately leading to the helmet-mounted HUD presentation technique.
NASA Astrophysics Data System (ADS)
Dorrestijn, Jesse; Kahn, Brian H.; Teixeira, João; Irion, Fredrick W.
2018-05-01
Satellite observations are used to obtain vertical profiles of variance scaling of temperature (T) and specific humidity (q) in the atmosphere. A higher spatial resolution nadir retrieval at 13.5 km complements previous Atmospheric Infrared Sounder (AIRS) investigations with 45 km resolution retrievals and enables the derivation of power law scaling exponents to length scales as small as 55 km. We introduce a variable-sized circular-area Monte Carlo methodology to compute exponents instantaneously within the swath of AIRS that yields additional insight into scaling behavior. While this method is approximate and some biases are likely to exist within non-Gaussian portions of the satellite observational swaths of T and q, this method enables the estimation of scale-dependent behavior within instantaneous swaths for individual tropical and extratropical systems of interest. Scaling exponents are shown to fluctuate between β = -1 and -3 at scales ≥ 500 km, while at scales ≤ 500 km they are typically near β ≈ -2, with q slightly lower than T at the smallest scales observed. In the extratropics, the large-scale β is near -3. Within the tropics, however, the large-scale β for T is closer to -1 as small-scale moist convective processes dominate. In the tropics, q exhibits large-scale β between -2 and -3. The values of β are generally consistent with previous works of either time-averaged spatial variance estimates, or aircraft observations that require averaging over numerous flight observational segments. The instantaneous variance scaling methodology is relevant for cloud parameterization development and the assessment of time variability of scaling exponents.
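To illustrate the idea behind the variable-sized circular-area approach mentioned above (variance computed inside randomly placed circles of different radii, with an exponent taken from a log-log fit), a minimal sketch on a synthetic 2-D field follows; the slope it returns is a real-space variance-scaling exponent on made-up data, not one of the spectral beta values reported for AIRS:

    import numpy as np

    # Sketch of a variable-sized circular-area variance-scaling estimate on a
    # synthetic correlated field. Illustrative of the idea only.
    rng = np.random.default_rng(3)
    n = 256
    field = rng.normal(size=(n, n))
    field = np.cumsum(np.cumsum(field, axis=0), axis=1)       # correlated synthetic field

    yy, xx = np.mgrid[0:n, 0:n]
    radii = np.array([8, 16, 32, 64])
    mean_var = []
    for r in radii:
        variances = []
        for _ in range(50):                                    # Monte Carlo circle placement
            cy, cx = rng.integers(r, n - r, size=2)
            mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2
            variances.append(field[mask].var())
        mean_var.append(np.mean(variances))

    slope, _ = np.polyfit(np.log(2 * radii), np.log(mean_var), 1)
    print(round(slope, 2))                                     # variance-vs-scale exponent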
Farrell, Patrick; Sun, Jacob; Gao, Meg; Sun, Hong; Pattara, Ben; Zeiser, Arno; D'Amore, Tony
2012-08-17
A simple approach to the development of an aerobic scaled-down fermentation model is presented to obtain more consistent process performance during the scale-up of recombinant protein manufacture. Using a constant volumetric oxygen mass transfer coefficient (k(L)a) for the criterion of a scale-down process, the scaled-down model can be "tuned" to match the k(L)a of any larger-scale target by varying the impeller rotational speed. This approach is demonstrated for a protein vaccine candidate expressed in recombinant Escherichia coli, where process performance is shown to be consistent among 2-L, 20-L, and 200-L scales. An empirical correlation for k(L)a has also been employed to extrapolate to larger manufacturing scales. Copyright © 2012 Elsevier Ltd. All rights reserved.
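The abstract does not give the k(L)a correlation used; as a hedged sketch of the tuning step, the code below assumes a van't Riet-type form k(L)a = a(P/V)^alpha (v_s)^beta together with an ungassed power draw P = Np*rho*N^3*D^5, and solves for the small-scale impeller speed that matches a large-scale k(L)a. All constants are hypothetical:

    # Hedged sketch: matching kLa between scales with an assumed van't Riet-type
    # correlation and an assumed power number. Not the study's fitted correlation.

    a, alpha, beta = 0.002, 0.7, 0.2            # assumed correlation constants (SI units)
    Np, rho = 5.0, 1000.0                       # power number, broth density (kg/m^3)

    def kla(power_w, volume_m3, superficial_gas_velocity_m_s):
        return a * (power_w / volume_m3) ** alpha * superficial_gas_velocity_m_s ** beta

    def impeller_speed_for_target(kla_target, volume_m3, v_s, impeller_diam_m):
        # invert the correlation for P/V, then use P = Np*rho*N^3*D^5 for speed N (rev/s)
        p_per_v = (kla_target / (a * v_s ** beta)) ** (1.0 / alpha)
        power = p_per_v * volume_m3
        return (power / (Np * rho * impeller_diam_m ** 5)) ** (1.0 / 3.0)

    kla_200l = kla(power_w=400.0, volume_m3=0.2, superficial_gas_velocity_m_s=0.005)
    n_2l = impeller_speed_for_target(kla_200l, volume_m3=0.002, v_s=0.005, impeller_diam_m=0.06)
    print(round(kla_200l, 4), "1/s; 2-L speed:", round(n_2l * 60.0), "rpm")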
NASA Astrophysics Data System (ADS)
Alkasem, Ameen; Liu, Hongwei; Zuo, Decheng; Algarash, Basheer
2018-01-01
The volume of data being collected, analyzed, and stored has exploded in recent years, in particular in relation to activity on the cloud, as large-scale data processing, analysis, and storage platforms such as cloud computing are increasingly used. Today, the major challenge is how to monitor and control these massive amounts of data and perform analysis in real time at scale. Traditional methods and model systems are unable to cope with these quantities of data in real time. Here we present a new methodology for constructing a model for optimizing the performance of real-time monitoring of big datasets, which combines machine learning algorithms with Apache Spark Streaming to accomplish fine-grained fault diagnosis and repair of big datasets. As a case study, we use the failure of Virtual Machines (VMs) to start up. The proposed methodology ensures that the most sensible action is carried out during fine-grained monitoring and yields the most effective and cost-saving fault repair through three control steps: (I) data collection; (II) analysis engine; and (III) decision engine. We found that running this novel methodology can save a considerable amount of time compared to the Hadoop model, without sacrificing classification accuracy or performance. The accuracy of the proposed method (92.13%) is an improvement on traditional approaches.
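The abstract does not specify the classifier used; as a stand-in for the analysis and decision engines, the sketch below trains a simple Gaussian naive Bayes model on synthetic VM start-up metrics (the features, labels, and classifier choice are illustrative assumptions, not the paper's Spark Streaming pipeline):

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.naive_bayes import GaussianNB
    from sklearn.metrics import accuracy_score

    # Stand-in sketch: classify whether a VM start-up will fail from monitoring
    # metrics. Features, labels, and classifier are illustrative assumptions.
    rng = np.random.default_rng(4)
    n = 2000
    cpu, mem, io_wait = rng.uniform(0, 1, (3, n))              # synthetic normalised metrics
    fails = ((cpu + io_wait > 1.4) | (mem > 0.9)).astype(int)  # synthetic failure rule

    X = np.column_stack([cpu, mem, io_wait])
    X_tr, X_te, y_tr, y_te = train_test_split(X, fails, test_size=0.3, random_state=0)

    clf = GaussianNB().fit(X_tr, y_tr)
    print(round(accuracy_score(y_te, clf.predict(X_te)), 3))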
Andersen, Randi Dovland; Jylli, Leena; Ambuel, Bruce
2014-06-01
There is little empirical evidence regarding the translation and cultural adaptation of self-report and observational outcome measures. Studies that evaluate and further develop existing practices are needed. This study explores the use of cognitive interviews in the translation and cultural adaptation of observational measures, using the COMFORT behavioral scale as an example, and demonstrates a structured approach to the analysis of data from cognitive interviews. The COMFORT behavioral scale is developed for assessment of distress and pain in a pediatric intensive care setting. Qualitative, descriptive methodological study. One general public hospital trust in southern Norway. N=12. Eight nurses, three physicians and one nurse assistant, from different wards and with experience caring for children. We translated the COMFORT behavior scale into Norwegian before conducting individual cognitive interviews. Participants first read and then used the translated version of the COMFORT behavioral scale to assess pain based on a 3-min film vignette depicting an infant in pain/distress. Two cognitive interview techniques were applied: Thinking Aloud (TA) during the assessment and Verbal Probing (VP) afterwards. In TA the participant verbalized his/her thought process while completing the COMFORT behavioral scale. During VP the participant responded to specific questions related to understanding of the measure, information recall and the decision process. We audio recorded, transcribed and analyzed interviews using a structured qualitative method (cross-case analysis based on predefined categories and development of a results matrix). Our analysis revealed two categories of problems: (1) Scale problems, warranting a change in the wording of the scale, including (a) translation errors, (b) content not understood as intended, and (c) differences between the original COMFORT scale and the revised COMFORT behavioral scale; and (2) Rater-context problems caused by (a) unfamiliarity with the scale, (b) lack of knowledge and experience, and (c) assessments based on a film vignette. Cognitive interviews revealed problems with both the translated and the original versions of the scale and suggested solutions that enhanced the validity of both versions. Cognitive interviews might be seen as a complement to current published best practices for translation and cultural adaptation. Copyright © 2013 Elsevier Ltd. All rights reserved.
Methodological Quality Assessment of Meta-Analyses of Hyperthyroidism Treatment.
Qin, Yahong; Yao, Liang; Shao, Feifei; Yang, Kehu; Tian, Limin
2018-01-01
Hyperthyroidism is a common condition that is associated with increased morbidity and mortality. A number of meta-analyses (MAs) have assessed the therapeutic measures for hyperthyroidism, including antithyroid drugs, surgery, and radioiodine; however, their methodological quality has not been evaluated. This study evaluated the methodological quality and summarized the evidence obtained from MAs of hyperthyroidism treatments for radioiodine, antithyroid drugs, and surgery. We searched the PubMed, EMBASE, Cochrane Library, Web of Science, and Chinese Biomedical Literature Database databases. Two investigators independently assessed the titles and abstracts of the meta-analyses for inclusion. Methodological quality was assessed using the validated AMSTAR (Assessing the Methodological Quality of Systematic Reviews) tool. A total of 26 MAs fulfilled the inclusion criteria. Based on the AMSTAR scores, the average methodological quality was 8.31, with large variability ranging from 4 to 11. The methodological quality of English meta-analyses was better than that of Chinese meta-analyses. Cochrane reviews had better methodological quality than non-Cochrane reviews due to better study selection and data extraction, the inclusion of unpublished studies, and better reporting of study characteristics. The authors did not report conflicts of interest in 53.8% of the meta-analyses, and 19.2% did not report the harmful effects of treatment. Publication bias was not assessed in 38.5% of the meta-analyses, and 19.2% did not report the follow-up time. This large-scale assessment of the methodological quality of meta-analyses of hyperthyroidism treatment highlighted methodological strengths and weaknesses. Consideration of scientific quality when formulating conclusions should be made explicit. Future meta-analyses should improve the reporting of conflicts of interest. © Georg Thieme Verlag KG Stuttgart · New York.
Structural connectivity at a national scale: Wildlife corridors in Tanzania.
Riggio, Jason; Caro, Tim
2017-01-01
Wildlife corridors can help maintain landscape connectivity but novel methods must be developed to assess regional structural connectivity quickly and cheaply so as to determine where expensive and time-consuming surveys of functional connectivity should occur. We use least-cost methods, the most accurate and up-to-date land conversion dataset for East Africa, and interview data on wildlife corridors, to develop a single, consistent methodology to systematically assess wildlife corridors at a national scale using Tanzania as a case study. Our research aimed to answer the following questions; (i) which corridors may still remain open (i.e. structurally connected) at a national scale, (ii) which have been potentially severed by anthropogenic land conversion (e.g., agriculture and settlements), (iii) where are other remaining potential wildlife corridors located, and (iv) which protected areas with lower forms of protection (e.g., Forest Reserves and Wildlife Management Areas) may act as stepping-stones linking more than one National Park and/or Game Reserve. We identify a total of 52 structural connections between protected areas that are potentially open to wildlife movement, and in so doing add 23 to those initially identified by other methods in Tanzanian Government reports. We find that the vast majority of corridors noted in earlier reports as "likely to be severed" have actually not been cut structurally (21 of 24). Nonetheless, nearly a sixth of all the wildlife corridors identified in Tanzania in 2009 have potentially been separated by land conversion, and a third now pass across lands likely to be converted to human use in the near future. Our study uncovers two reserves with lower forms of protection (Uvinza Forest Reserve in the west and Wami-Mbiki Wildlife Management Area in the east) that act as apparently crucial stepping-stones between National Parks and/or Game Reserves and therefore require far more serious conservation support. Methods used in this study are readily applicable to other nations lacking detailed data on wildlife movements and plagued by inaccurate land cover datasets. Our results are the first step in identifying wildlife corridors at a regional scale and provide a springboard for ground-based follow-up conservation.
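The abstract names least-cost methods over a land-conversion dataset without giving the implementation; the sketch below shows the generic least-cost path idea on a toy resistance raster. The grid, endpoints and costs are invented and do not represent the Tanzanian data.

```python
# Illustrative least-cost path between two protected areas over a
# resistance (land-conversion) raster; the toy 5x5 grid and endpoints are
# hypothetical, not the dataset used in the study.
import numpy as np
from skimage.graph import route_through_array

# Higher values = harder to traverse (e.g. agriculture, settlements).
resistance = np.array([
    [1, 1, 5, 9, 9],
    [1, 2, 5, 9, 1],
    [1, 2, 2, 2, 1],
    [9, 9, 5, 2, 1],
    [9, 9, 5, 1, 1],
], dtype=float)

start, end = (0, 0), (4, 4)   # cells inside two protected areas
path, total_cost = route_through_array(
    resistance, start, end, fully_connected=True, geometric=True)

print("corridor cells:", path)
print("accumulated cost:", round(total_cost, 2))
```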
Spicer, Neil; Berhanu, Della; Bhattacharya, Dipankar; Tilley-Gyado, Ritgak Dimka; Gautham, Meenakshi; Schellenberg, Joanna; Tamire-Woldemariam, Addis; Umar, Nasir; Wickremasinghe, Deepthi
2016-11-25
Donors commonly fund innovative interventions to improve health in the hope that governments of low and middle-income countries will scale up those that are shown to be effective. Yet innovations can be slow to be adopted by country governments and implemented at scale. Our study explores this problem by identifying key contextual factors influencing scale-up of maternal and newborn health innovations in three low-income settings: Ethiopia, the six states of northeast Nigeria and Uttar Pradesh state in India. We conducted 150 semi-structured interviews in 2012/13 with stakeholders from government, development partner agencies, externally funded implementers including civil society organisations, academic institutions and professional associations to understand scale-up of innovations to improve the health of mothers and newborns in these study settings. We analysed interview data with the aid of a common analytic framework to enable cross-country comparison, using NVivo to code themes. We found that multiple contextual factors enabled and undermined attempts to catalyse scale-up of donor-funded maternal and newborn health innovations. Factors influencing government decisions to accept innovations at scale included: how health policy decisions are made; prioritising and funding maternal and newborn health; and development partner harmonisation. Factors influencing the implementation of innovations at scale included: health systems capacity in the three settings; and security in northeast Nigeria. Contextual factors influencing beneficiary communities' uptake of innovations at scale included: sociocultural contexts; and access to healthcare. We conclude that context is critical: externally funded implementers need to assess and adapt to contexts if they are to successfully position an innovation for scale-up.
Black Carbon Emissions from Diesel Sources in the Largest Arctic City: Case Study of Murmansk
NASA Astrophysics Data System (ADS)
Evans, M.; Kholod, N.; Malyshev, V.; Tretyakova, S.; Gusev, E.; Yu, S.; Barinov, A.
2014-12-01
Russia has very little data on its black carbon (BC) emissions. Because Russia makes up such a large share of the Arctic, understanding Russian emissions will improve our understanding of overall BC levels, BC in the Arctic and the link between BC and climate change. This paper provides a detailed, bottom-up inventory of BC emissions from diesel sources in Murmansk, Russia, along with uncertainty estimates associated with these emissions. The research team developed a detailed data collection methodology. The methodology involves assessing the vehicle fleet and activity in Murmansk using traffic, parking lot and driver surveys combined with an existing database from a vehicle inspection station and statistical data. The team also assessed the most appropriate emission factors, drawing from both Russian and international inventory methodologies. The researchers also compared fuel consumption using statistical data and bottom-up fuel calculations. They then calculated emissions for on-road transportation, off-road transportation (including mines), diesel generators, fishing and other sources. The article also provides a preliminary assessment of Russia-wide emissions of black carbon from diesel sources.
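To make the bottom-up logic concrete, the sketch below multiplies activity data by emission factors and sums over source categories. All numbers are placeholders for illustration and are not the Murmansk inventory values.

```python
# Illustrative bottom-up calculation: BC emissions as activity data times
# emission factors, summed over source categories. All numbers are invented.
sources = {
    # category: (fuel burned, tonnes/yr), (BC emission factor, g per kg fuel)
    "on-road diesel vehicles": (12_000, 1.1),
    "off-road machinery":      ( 4_500, 1.6),
    "diesel generators":       (   800, 0.9),
    "fishing fleet":           ( 6_200, 1.0),
}

total_bc_t = 0.0
for name, (fuel_t, ef_g_per_kg) in sources.items():
    bc_t = fuel_t * 1_000 * ef_g_per_kg / 1e6   # t fuel -> kg -> g BC -> t BC
    total_bc_t += bc_t
    print(f"{name:26s} {bc_t:7.2f} t BC/yr")

print(f"{'TOTAL':26s} {total_bc_t:7.2f} t BC/yr")
```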
Accounting For Greenhouse Gas Emissions From Flooded ...
Nearly three decades of research has demonstrated that the inundation of rivers and terrestrial ecosystems behind dams can lead to enhanced rates of greenhouse gas emissions, particularly methane. The 2006 IPCC Guidelines for National Greenhouse Gas Inventories includes a methodology for estimating methane emissions from flooded lands, but the methodology was published as an appendix to be used as a ‘basis for future methodological development’ due to a lack of data. Since the 2006 Guidelines were published there has been a 6-fold increase in the number of peer reviewed papers published on the topic including reports from reservoirs in India, China, Africa, and Russia. Furthermore, several countries, including Iceland, Switzerland, and Finland, have developed country specific methodologies for including flooded lands methane emissions in their National Greenhouse Gas Inventories. This presentation will include a review of the literature on flooded land methane emissions and approaches that have been used to upscale emissions for national inventories. We will also present ongoing research in the United States to develop a country specific methodology. The research approaches include 1) an effort to develop predictive relationships between methane emissions and reservoir characteristics that are available in national databases, such as reservoir size and drainage area, and 2) a national-scale probabilistic survey of reservoir methane emissions. To inform th
Accounting for Greenhouse Gas Emissions from Reservoirs ...
Nearly three decades of research has demonstrated that the impoundment of rivers and the flooding of terrestrial ecosystems behind dams can increase rates of greenhouse gas emission, particularly methane. The 2006 IPCC Guidelines for National Greenhouse Gas Inventories includes a methodology for estimating methane emissions from flooded lands, but the methodology was published as an appendix to be used as a ‘basis for future methodological development’ due to a lack of data. Since the 2006 Guidelines were published there has been a 6-fold increase in the number of peer reviewed papers published on the topic including reports from reservoirs in India, China, Africa, and Russia. Furthermore, several countries, including Iceland, Switzerland, and Finland, have developed country specific methodologies for including flooded lands methane emissions in their National Greenhouse Gas Inventories. This presentation will include a review of the literature on flooded land methane emissions and approaches that have been used to upscale emissions for national inventories. We will also present ongoing research in the United States to develop a country specific methodology. In the U.S., research approaches include: 1) an effort to develop predictive relationships between methane emissions and reservoir characteristics that are available in national databases, such as reservoir size and drainage area, and 2) a national-scale probabilistic survey of reservoir methane em
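The first research approach described in the two abstracts above, fitting a predictive relationship between reservoir methane emissions and characteristics held in national databases, can be sketched as a simple regression. The training data below are synthetic placeholders, not survey results.

```python
# Sketch: fit a predictive relationship between reservoir CH4 emissions and
# database-resident characteristics (surface area, drainage area). Synthetic data.
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical training set: [surface area (km2), drainage area (km2)]
X = np.array([[1.2, 150], [3.5, 420], [10.0, 900], [25.0, 2300], [60.0, 5100]])
y = np.array([4.1, 9.8, 22.0, 51.0, 130.0])   # CH4 emission, Mg/yr (synthetic)

# A log-log regression is a common choice for such scaling relationships.
model = LinearRegression().fit(np.log(X), np.log(y))
new_reservoir = np.array([[15.0, 1200]])
predicted = np.exp(model.predict(np.log(new_reservoir)))
print(f"predicted CH4 emission: {predicted[0]:.1f} Mg/yr")
```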
Kempers, Jari; Ketting, Evert; Chandra-Mouli, Venkatraman; Raudsepp, Triin
2015-01-08
A growing number of middle-income countries are scaling up youth-friendly sexual and reproductive health pilot projects to national level programmes. Yet, there are few case studies on successful national level scale-up of such programmes. Estonia is an excellent example of scale-up of a small grassroots adolescent sexual and reproductive health initiative to a national programme, which most likely contributed to improved adolescent sexual and reproductive health outcomes. This study; (1) documents the scale-up process of the Estonian youth clinic network 1991-2013, and (2) analyses factors that contributed to the successful scale-up. This research provides policy makers and programme managers with new insights to success factors of the scale-up, that can be used to support planning, implementation and scale-up of adolescent sexual and reproductive health programmes in other countries. Information on the scale-up process and success factors were collected by conducting a literature review and interviewing key stakeholders. The findings were analysed using the WHO-ExpandNet framework, which provides a step-by-step process approach for design, implementation and assessment of the results of scaling-up health innovations. The scale-up was divided into two main phases: (1) planning the scale-up strategy 1991-1995 and (2) managing the scaling-up 1996-2013. The planning phase analysed innovation, user organizations (youth clinics), environment and resource team (a national NGO and international assistance). The managing phase examines strategic choices, advocacy, organization, resource mobilization, monitoring and evaluation, strategic planning and management of the scale-up. The main factors that contributed to the successful scale-up in Estonia were: (1) favourable social and political climate, (2) clear demonstrated need for the adolescent services, (3) a national professional organization that advocated, coordinated and represented the youth clinics, (4) enthusiasm and dedication of personnel, (5) acceptance by user organizations and (6) sustainable funding through the national health insurance system. Finally, the measurement and recognition of the remarkable improvement of adolescent SRH outcomes in Estonia would not have been possible without development of good reporting and monitoring systems, and many studies and international publications.
Assessment scale of risk for surgical positioning injuries
Lopes, Camila Mendonça de Moraes; Haas, Vanderlei José; Dantas, Rosana Aparecida Spadoti; de Oliveira, Cheila Gonçalves; Galvão, Cristina Maria
2016-01-01
Objective: to build and validate a scale to assess the risk of surgical positioning injuries in adult patients. Method: methodological research, conducted in two phases: construction and face and content validation of the scale, and field research involving 115 patients. Results: the Risk Assessment Scale for the Development of Injuries due to Surgical Positioning contains seven items, each of which presents five subitems. The scale score ranges between seven and 35 points, in which the higher the score, the higher the patient's risk. The Content Validity Index of the scale corresponded to 0.88. The application of Student's t-test for equality of means revealed the concurrent criterion validity between the scores on the Braden scale and the constructed scale. To assess the predictive criterion validity, the association was tested between the presence of pain deriving from surgical positioning and the development of pressure ulcers, using the score on the Risk Assessment Scale for the Development of Injuries due to Surgical Positioning (p<0.001). The interrater reliability was verified using the intraclass correlation coefficient, equal to 0.99 (p<0.001). Conclusion: the scale is a valid and reliable tool, but further research is needed to assess its use in clinical practice. PMID:27579925
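A summative instrument of this shape (seven items, total 7-35, higher = higher risk) can be scored as in the minimal sketch below; the item names are placeholders, not the published wording of the scale.

```python
# Minimal sketch of a summative score for a seven-item scale rated 1-5 per
# item (total 7-35, higher = higher risk). Item names are hypothetical.
ITEMS = ["surgery_duration", "surgical_position", "anaesthesia_type",
         "support_surface", "limb_position", "comorbidities", "patient_age"]

def positioning_risk_score(ratings):
    """Sum seven item ratings (1-5 each); raise if a rating is missing or out of range."""
    if set(ratings) != set(ITEMS):
        raise ValueError("exactly one rating per item is required")
    if any(not 1 <= r <= 5 for r in ratings.values()):
        raise ValueError("each item must be rated on a 1-5 scale")
    return sum(ratings.values())

example = {item: 3 for item in ITEMS}        # mid-range rating on every item
print(positioning_risk_score(example))        # 21 of a possible 7-35
```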
Development of cultural belief scales for mammography screening.
Russell, Kathleen M; Champion, Victoria L; Perkins, Susan M
2003-01-01
To develop instruments to measure culturally related variables that may influence mammography screening behaviors in African American women. Instrumentation methodology. Community organizations and public housing in the Indianapolis, IN, area. 111 African American women with a mean age of 60.2 years and 64 Caucasian women with a mean age of 60 years. After item development, scales were administered. Data were analyzed by factor analysis, item analysis via internal consistency reliability using Cronbach's alpha, and independent t tests and logistic regression analysis to test theoretical relationships. Personal space preferences, health temporal orientation, and perceived personal control. Space items were factored into interpersonal and physical scales. Temporal orientation items were loaded on one factor, creating a one-dimensional scale. Control items were factored into internal and external control scales. Cronbach's alpha coefficients for the scales ranged from 0.76-0.88. Interpersonal space preference, health temporal orientation, and perceived internal control scales each were predictive of mammography screening adherence. The three tested scales were reliable and valid. Scales, on average, did not differ between African American and Caucasian populations. These scales may be useful in future investigations aimed at increasing mammography screening in African American and Caucasian women.
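The internal-consistency statistic used above, Cronbach's alpha, follows a standard formula; the snippet below applies it to a synthetic item-response matrix (rows = respondents, columns = scale items) purely for illustration.

```python
# Cronbach's alpha for internal-consistency reliability, applied to synthetic data.
import numpy as np

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 1))                         # shared trait
responses = latent + rng.normal(scale=0.8, size=(100, 6))  # six correlated items
print(round(cronbach_alpha(responses), 2))
```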
ERIC Educational Resources Information Center
Caniglia, Guido; John, Beatrice; Kohler, Martin; Bellina, Leonie; Wiek, Arnim; Rojas, Christopher; Laubichler, Manfred D.; Lang, Daniel
2016-01-01
Purpose: This paper aims to present an experience-based learning framework that provides a bottom-up, student-centered entrance point for the development of systems thinking, normative and collaborative competencies in sustainability. Design/methodology/approach: The framework combines mental mapping with exploratory walking. It interweaves…
Subjective and objective scales to assess the development of children with cerebral palsy.
Pietrzak, S; Jóźwiak, M
2001-01-01
Many scoring systems have been constructed to assess the motor development of cerebral palsy children and to evaluate the effectiveness of treatment. According to the purposes they fulfill, these instruments may be divided into three types: discriminative, evaluative and predictive. The design and measurement methodology are the criteria that determine whether a given scale is quantitative or qualitative in nature, and whether it should be considered objective or subjective. The article presents the "reaching, losing and regaining" scale (constructed by the authors to assess functional development and its changes in certain periods of time), the Munich Functional Development Diagnostics, and the Gross Motor Function Measure (GMFM). Special attention is given to the GMFM, its methods, evaluation of results, and application. A comparison of subjective and objective assessment of two cerebral palsy children is included.
Mahjouri, Najmeh; Ardestani, Mojtaba
2011-01-01
In this paper, two cooperative and non-cooperative methodologies are developed for a large-scale water allocation problem in Southern Iran. The water shares of the water users and their net benefits are determined using optimization models having economic objectives with respect to the physical and environmental constraints of the system. The results of the two methodologies are compared based on the total obtained economic benefit, and the role of cooperation in utilizing a shared water resource is demonstrated. In both cases, the water quality in rivers satisfies the standards. Comparing the results of the two mentioned approaches shows the importance of acting cooperatively to achieve maximum revenue in utilizing a surface water resource while the river water quantity and quality issues are addressed.
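The abstract does not give the model formulation; as a toy illustration of allocating a shared water resource to maximize economic benefit under a supply constraint, a linear-programming sketch follows. The benefits, caps and available flow are invented, and water-quality constraints are omitted.

```python
# Toy allocation optimisation: maximise total economic benefit of water
# allocated to several users subject to the available flow. Numbers invented.
from scipy.optimize import linprog

benefit_per_mcm = [120.0, 90.0, 60.0]    # $/million m3 for users A, B, C
demand_cap      = [40.0, 55.0, 70.0]     # upper bound on each allocation (MCM)
available_flow  = 100.0                  # total water available (MCM)

# linprog minimises, so negate the benefits to maximise them.
res = linprog(
    c=[-b for b in benefit_per_mcm],
    A_ub=[[1.0, 1.0, 1.0]], b_ub=[available_flow],
    bounds=[(0, cap) for cap in demand_cap],
    method="highs",
)
print("allocations (MCM):", [round(x, 1) for x in res.x])
print("total benefit ($):", round(-res.fun, 1))
```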
Coarse-grained molecular dynamics simulations for giant protein-DNA complexes
NASA Astrophysics Data System (ADS)
Takada, Shoji
Biomolecules are highly hierarchic and intrinsically flexible. Thus, computational modeling calls for multi-scale methodologies. We have been developing a coarse-grained biomolecular model in which, on average, 10-20 atoms are grouped into one coarse-grained (CG) particle. Interactions among CG particles are tuned based on atomistic interactions and the fluctuation matching algorithm. CG molecular dynamics methods enable us to simulate much longer time-scale motions of much larger molecular systems than fully atomistic models. After broad sampling of structures with CG models, we can easily reconstruct atomistic models, from which one can continue conventional molecular dynamics simulations if desired. Here, we describe our CG modeling methodology for protein-DNA complexes, together with various biological applications, such as the DNA duplication initiation complex, model chromatins, and transcription factor dynamics in a chromatin-like environment.
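The coarse-graining step described above (groups of roughly 10-20 atoms reduced to single CG particles) can be sketched generically as a center-of-mass mapping; coordinates, masses and groupings below are arbitrary demonstration values, not the authors' mapping scheme.

```python
# Minimal sketch: reduce groups of atoms to single CG beads at the group's
# centre of mass. Demonstration values only.
import numpy as np

def coarse_grain(coords, masses, groups):
    """Return one centre-of-mass bead per atom group."""
    beads = []
    for idx in groups:
        m = masses[idx]
        beads.append((coords[idx] * m[:, None]).sum(axis=0) / m.sum())
    return np.array(beads)

rng = np.random.default_rng(1)
coords = rng.random((30, 3)) * 10.0          # 30 atoms in a 10 A box
masses = rng.uniform(1.0, 16.0, size=30)
groups = [list(range(i, i + 10)) for i in (0, 10, 20)]   # ~10 atoms per bead
print(coarse_grain(coords, masses, groups).shape)        # (3, 3)
```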
Health Economics at the Crossroads of Centuries – From the Past to the Future
Jakovljevic, Mihajlo (Michael); Ogura, Seiritsu
2016-01-01
Health economics, as an interdisciplinary science, has experienced exceptionally bold evolution over the past eight decades. Generations of committed scholars have built up a huge body of knowledge and developed a set of methodological tools to assist health-care authorities with the resource allocation process. Following its conception at the US National Bureau of Economic Research and Ivy League US universities, this science has spread across the globe. It has adapted to a myriad of local conditions and needs of national health systems with diverse historical legacies, medical services provision, and financing patterns. The challenge of financial sustainability facing modern-day health systems remains primarily attributable to population aging, prosperity diseases, large-scale migrations, rapid urbanization, and technological innovation in medicine. Despite promising developments in developing countries, with the emerging BRICS markets in the lead, rising out-of-pocket health spending continues to threaten the affordability of medical care. Universal health coverage extension will likely remain a serious challenge even for some of the most advanced OECD nations. These complex circumstances create strong drivers for the inevitable further development of health economics. We believe that this interdisciplinary health science shall leave a long-lasting blueprint that will remain visible for decades to come. PMID:27376055
Development and exemplification of a model for Teacher Assessment in Primary Science
NASA Astrophysics Data System (ADS)
Davies, D. J.; Earle, S.; McMahon, K.; Howe, A.; Collier, C.
2017-09-01
The Teacher Assessment in Primary Science project is funded by the Primary Science Teaching Trust and based at Bath Spa University. The study aims to develop a whole-school model of valid, reliable and manageable teacher assessment to inform practice and make a positive impact on primary-aged children's learning in science. The model is based on a data-flow 'pyramid' (analogous to the flow of energy through an ecosystem), whereby the rich formative assessment evidence gathered in the classroom is summarised for monitoring, reporting and evaluation purposes [Nuffield Foundation. (2012). Developing policy, principles and practice in primary school science assessment. London: Nuffield Foundation]. Using a design-based research (DBR) methodology, the authors worked in collaboration with teachers from project schools and other expert groups to refine, elaborate, validate and operationalise the data-flow 'pyramid' model, resulting in the development of a whole-school self-evaluation tool. In this paper, we argue that a DBR approach to theory-building and school improvement drawing upon teacher expertise has led to the identification, adaptation and successful scaling up of a promising approach to school self-evaluation in relation to assessment in science.
NEPP Update of Independent Single Event Upset Field Programmable Gate Array Testing
NASA Technical Reports Server (NTRS)
Berg, Melanie; Label, Kenneth; Campola, Michael; Pellish, Jonathan
2017-01-01
This presentation provides a NASA Electronic Parts and Packaging (NEPP) Program update of independent Single Event Upset (SEU) Field Programmable Gate Array (FPGA) testing including FPGA test guidelines, Microsemi RTG4 heavy-ion results, Xilinx Kintex-UltraScale heavy-ion results, Xilinx UltraScale+ single event effect (SEE) test plans, development of a new methodology for characterizing SEU system response, and NEPP involvement with FPGA security and trust.
Traffic-related particulate air pollution exposure in urban areas
NASA Astrophysics Data System (ADS)
Borrego, C.; Tchepel, O.; Costa, A. M.; Martins, H.; Ferreira, J.; Miranda, A. I.
In recent years, there has been an increase in scientific studies confirming that long- and short-term exposure to particulate matter (PM) pollution leads to adverse health effects. The development of a methodology for the determination of accumulated human exposure in urban areas is the main objective of the current work, combining information on concentrations in different microenvironments and population time-activity pattern data. A link between a mesoscale meteorological and dispersion model and a local-scale air quality model was developed to define the boundary conditions for the local-scale application. The time-activity pattern of the population was derived from statistical information for different sub-population groups and linked to digital city maps. Finally, the hourly PM10 concentrations for indoor and outdoor microenvironments were estimated for the Lisbon city centre, which was chosen as the case study, based on the local-scale air quality model application for a selected period. This methodology is a first approach to estimating population exposure, calculated as the total daily values above the thresholds recommended for long- and short-term health effects. The results reveal that in the Lisbon city centre a large number of people are exposed to PM levels exceeding the legislated limit value.
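The accumulated-exposure idea combines concentration and time-activity data: exposure over a day is the time-weighted sum of concentrations in the microenvironments a population group occupies. The concentrations and time budgets in the sketch below are invented, not the Lisbon values.

```python
# Sketch: daily exposure as a time-weighted sum over microenvironments (invented numbers).
pm10 = {"home_indoor": 18.0, "office_indoor": 25.0,
        "street_outdoor": 55.0, "commuting": 70.0}        # ug/m3
hours = {"home_indoor": 14.0, "office_indoor": 8.0,
         "street_outdoor": 1.0, "commuting": 1.0}         # h/day

daily_exposure = sum(pm10[m] * hours[m] for m in pm10)    # ug/m3 * h
time_weighted_mean = daily_exposure / sum(hours.values())
print(f"time-weighted mean PM10: {time_weighted_mean:.1f} ug/m3")
```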
Using communication technology to support professional development in teaching science
NASA Astrophysics Data System (ADS)
Sundberg, Cheryl White
The impact of collaboration via communication technology on follow-up to on-site professional development was the central focus of this hypothesis-generating study. The study used a combination of quantitative methodology and qualitative methodology. A convenient sample of 18 teachers was drawn from 208 teachers in an existing professional development program in science in a southeastern state. The statewide professional development program focused on energy education with a strong emphasis on using technology to enhance learning. Data sources included E-mail messages, lesson plans, photographs, workshop evaluations, surveys, and the report of an external reviewer. The study focused on two on-site workshops, February and June 2000 that were designed to model constructivist pedagogy and instruct teachers in effective utilization of computer-based laboratories in science classrooms. Follow-up to the on-site workshops was facilitated with several communication technologies (Internet, E-mail, telephone, and mail). The research found E-mail was the preferred mode for follow-up to on-site workshops because of the convenience of the medium. Barriers to effective distance professional development were time constraints, equipment failure, and lack of consistent Internet access to teachers in rural and under-served areas. Teacher characteristics of the sample, teacher efficacy, technical skill, experience, and constructivist pedagogy did not appear to impact the use of communication technologies as a means of follow-up to on-site professional development workshops. However, teacher efficacy might have negatively impacted effective implementation of calculator-based laboratory technology in the classroom. The study found E-mail was the most convenient and efficient way to facilitate follow-up to on-site professional development. Teacher characteristics (efficacy, technical skill, experience, and constructivist pedagogy) did not appear to impact the use of E-mail to facilitate follow-up to on-site professional development. Consistent access to the Internet was problematic for teachers in rural and under-served areas.
Ventelou, Bruno; Moatti, Jean-Paul; Videau, Yann; Kazatchkine, Michel
2008-01-02
Macroeconomic policy requirements may limit the capacity of national and international policy-makers to allocate sufficient resources for scaling-up access to HIV care and treatment in developing countries. An endogenous growth model, which takes into account the evolution of society's human capital, was used to assess the macroeconomic impact of policies aimed at scaling-up access to HIV/AIDS treatment in six African countries (Angola, Benin, Cameroon, Central African Republic, Ivory Coast and Zimbabwe). The model results showed that scaling-up access to treatment in the affected population would limit gross domestic product losses due to AIDS although differently from country to country. In our simulated scenarios of access to antiretroviral therapy, only 10.3% of the AIDS shock is counterbalanced in Zimbabwe, against 85.2% in Angola and even 100.0% in Benin (a total recovery). For four out of the six countries (Angola, Benin, Cameroon, Ivory Coast), the macro-economic gains of scaling-up would become potentially superior to its associated costs in 2010. Despite the variability of HIV prevalence rates between countries, macro-economic estimates strongly suggest that a massive investment in scaling-up access to HIV treatment may efficiently counteract the detrimental long-term impact of the HIV pandemic on economic growth, to the extent that the AIDS shock has not already driven the economy beyond an irreversible 'no-development epidemiological trap'.
Remote sensing with unmanned aircraft systems for precision agriculture applications
USDA-ARS?s Scientific Manuscript database
The Federal Aviation Administration is revising regulations for using unmanned aircraft systems (UAS) in the national airspace. An important potential application of UAS may be as a remote-sensing platform for precision agriculture, but simply down-scaling remote sensing methodologies developed usi...
EPA developed a methodology for estimating the health benefits of benzene reductions and has applied it in a metropolitan-scale case study of the benefits of CAA controls on benzene emissions to accompany the main 812 analysis.
NASA Astrophysics Data System (ADS)
Ronco, P.; Gallina, V.; Torresan, S.; Zabeo, A.; Semenzin, E.; Critto, A.; Marcomini, A.
2014-12-01
In recent years, the frequency of catastrophes induced by natural hazards has increased, and flood events in particular have been recognized as one of the most threatening water-related disasters. Severe floods have occurred in Europe over the last decade, causing loss of life, displacement of people and heavy economic losses. Flood disasters are growing in frequency as a consequence of many factors, both climatic and non-climatic. Indeed, the current increase of water-related disasters can be mainly attributed to the increase of exposure (elements potentially at risk in flood-prone areas) and vulnerability (i.e. economic, social, geographic, cultural and physical/environmental characteristics of the exposure). Besides these factors, the undeniable effect of climate change is projected to strongly modify the usual pattern of the hydrological cycle by intensifying the frequency and severity of flood events at the local, regional and global scale. Within this context, the need for developing effective and pro-active strategies, tools and actions which allow one to assess and (possibly) to reduce the flood risks that threaten different relevant receptors becomes urgent. Several methodologies to assess the risk posed by water-related natural hazards have been proposed so far, but very few of them can be adopted to implement the European Flood Directive (FD). This paper is intended to introduce and present a state-of-the-art Regional Risk Assessment (RRA) methodology to appraise the risk posed by floods from a physical-environmental perspective. The methodology, developed within the recently completed FP7-KULTURisk Project (Knowledge-based approach to develop a cULTUre of Risk prevention - KR), is flexible and can be adapted to different case studies (i.e. plain rivers, mountain torrents, urban and coastal areas) and spatial scales (i.e. from the catchment to the urban scale). The FD-compliant KR-RRA methodology is based on the concept of risk being a function of hazard, exposure and vulnerability. It integrates the outputs of various hydrodynamic models with site-specific bio-geophysical and socio-economic indicators (e.g. slope, land cover, population density, economic activities, etc.) to develop tailored risk indexes and GIS-based maps for each of the selected receptors (i.e. people, buildings, infrastructure, agriculture, natural and semi-natural systems, cultural heritage) in the considered region. It further compares the baseline scenario with alternative scenarios, where different structural and/or non-structural mitigation measures are planned and eventually implemented. As demonstrated in the companion paper (Part 2, Ronco et al., 2014), risk maps, along with related statistics, allow one to identify and classify, on a relative scale, areas at risk which are more likely to be affected by floods, and support the development of strategic adaptation and prevention measures to minimize flood impacts. In addition, the outcomes of the RRA can eventually be used for a further socio-economic assessment, considering the tangible and intangible costs as well as the human dimension of vulnerability.
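In the spirit of the "risk as a function of hazard, exposure and vulnerability" framing above, a relative risk index can be computed cell by cell on normalised rasters. The sketch below uses random rasters, a multiplicative combination and arbitrary class breaks; it is not the KULTURisk formulation.

```python
# Hedged sketch of a relative flood-risk index from normalised hazard,
# exposure and vulnerability rasters. Rasters and class breaks are illustrative.
import numpy as np

def minmax(a):
    return (a - a.min()) / (a.max() - a.min() + 1e-12)

rng = np.random.default_rng(42)
hazard        = minmax(rng.random((4, 4)))   # e.g. normalised flood depth
exposure      = minmax(rng.random((4, 4)))   # e.g. population density
vulnerability = minmax(rng.random((4, 4)))   # e.g. susceptibility score

risk = hazard * exposure * vulnerability     # multiplicative combination
classes = np.digitize(risk, bins=[0.05, 0.15, 0.30])   # 4 relative classes
print(classes)
```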
Trends in the Development of Technology and Engineering Education in Emerging Economies
ERIC Educational Resources Information Center
Adegbuyi, P. A. O.; Uhomoibhi, J. O.
2008-01-01
Purpose: The purpose of this paper is to report on the nature of technology and engineering education provision in developing economies, focusing on Nigeria. Design/methodology/approach: The paper draws on recent developments in the shake up and implementation of new measures to call for quality technology and engineering education in the country,…
NASA Astrophysics Data System (ADS)
Ellis, Matthew O. A.; Stamenova, Maria; Sanvito, Stefano
2017-12-01
There exists a significant challenge in developing efficient magnetic tunnel junctions with low write currents for nonvolatile memory devices. With the aim of analyzing potential materials for efficient current-operated magnetic junctions, we have developed a multi-scale methodology combining ab initio calculations of spin-transfer torque with large-scale time-dependent simulations using atomistic spin dynamics. In this work we introduce our multiscale approach, including a discussion on a number of possible schemes for mapping the ab initio spin torques into the spin dynamics. We demonstrate this methodology on a prototype Co/MgO/Co/Cu tunnel junction showing that the spin torques are primarily acting at the interface between the Co free layer and MgO. Using spin dynamics we then calculate the reversal switching times for the free layer and the critical voltages and currents required for such switching. Our work provides an efficient, accurate, and versatile framework for designing novel current-operated magnetic devices, where all the materials details are taken into account.
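For orientation, a common form of the atomistic spin-dynamics equation of motion onto which ab initio spin-transfer torques are often mapped (not necessarily the exact parameterization used by these authors) is the Landau-Lifshitz-Gilbert equation with a Slonczewski-like term:

$$\frac{d\mathbf{m}_i}{dt} = -\gamma\, \mathbf{m}_i \times \mathbf{H}^{\mathrm{eff}}_i + \alpha\, \mathbf{m}_i \times \frac{d\mathbf{m}_i}{dt} + \frac{\tau_i}{\mu_s}\, \mathbf{m}_i \times \left(\mathbf{m}_i \times \mathbf{p}\right),$$

where $\mathbf{m}_i$ is the unit moment on site $i$, $\mathbf{H}^{\mathrm{eff}}_i$ the effective field, $\gamma$ the gyromagnetic ratio, $\alpha$ the Gilbert damping, $\mu_s$ the atomic moment, $\mathbf{p}$ the spin-polarization direction of the current and $\tau_i$ a site-resolved torque magnitude (in a multiscale scheme of the kind described above, taken from the ab initio calculation).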
V&V Of CFD Modeling Of The Argonne Bubble Experiment: FY15 Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoyt, Nathaniel C.; Wardle, Kent E.; Bailey, James L.
2015-09-30
In support of the development of accelerator-driven production of the fission product Mo-99, computational fluid dynamics (CFD) simulations of an electron-beam irradiated, experimental-scale bubble chamber have been conducted in order to aid in interpretation of existing experimental results, provide additional insights into the physical phenomena, and develop predictive thermal hydraulic capabilities that can be applied to full-scale target solution vessels. Toward that end, a custom hybrid Eulerian-Eulerian-Lagrangian multiphase solver was developed, and simulations have been performed on high-resolution meshes. Good agreement between experiments and simulations has been achieved, especially with respect to the prediction of the maximum temperature of the uranyl sulfate solution in the experimental vessel. These positive results suggest that the simulation methodology that has been developed will prove to be suitable to assist in the development of full-scale production hardware.
Fail-Safe Design for Large Capacity Lithium-Ion Battery Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, G. H.; Smith, K.; Ireland, J.
2012-07-15
A fault leading to a thermal runaway in a lithium-ion battery is believed to grow over time from a latent defect. Significant efforts have been made to detect lithium-ion battery safety faults to proactively facilitate actions minimizing subsequent losses. Scaling up a battery greatly changes the thermal and electrical signals of a system developing a defect and its consequent behaviors during fault evolution. In a large-capacity system such as a battery for an electric vehicle, detecting a fault signal and confining the fault locally in the system are extremely challenging. This paper introduces a fail-safe design methodology for large-capacity lithium-ion battery systems. Analysis using an internal short circuit response model for multi-cell packs is presented that demonstrates the viability of the proposed concept for various design parameters and operating conditions. Locating a faulty cell in a multiple-cell module and determining the status of the fault's evolution can be achieved using signals easily measured from the electric terminals of the module. A methodology is introduced for electrical isolation of a faulty cell from the healthy cells in a system to prevent further electrical energy feed into the fault. Experimental demonstration is presented supporting the model results.
Veronesi, G; Bertù, L; Mombelli, S; Cimmino, L; Caravati, G; Conti, M; Abate, T; Ferrario, M M
2011-01-01
We discuss the methodological aspects related to the evaluation of turnover and up-down sizing as indicators of work-related stress in complex organizations such as a university hospital. To estimate the active worker population we developed an algorithm that integrated several administrative databases. The indicators were standardized to take into account some potential confounders (age, sex, work seniority) when comparing different hospital structures and job tasks. The main advantages of our method include flexibility in the choice of the level of analysis (hospital units, job tasks, or a combination of both) and the possibility of describing trends over time to measure the success of preventive strategies.
Intelligent Performance Analysis with a Natural Language Interface
NASA Astrophysics Data System (ADS)
Juuso, Esko K.
2017-09-01
Performance improvement is taken as the primary goal in asset management. Advanced data analysis is needed to efficiently integrate condition monitoring data into operation and maintenance. Intelligent stress and condition indices have been developed for control and condition monitoring by combining generalized norms with efficient nonlinear scaling. These nonlinear scaling methodologies can also be used to handle performance measures used for management, since management-oriented indicators can be presented on the same scale as intelligent condition and stress indices. Performance indicators are responses of the process, machine or system to the stress contributions analyzed from process and condition monitoring data. Scaled values are used directly in intelligent temporal analysis to calculate fluctuations and trends. All these methodologies can be used in prognostics and fatigue prediction. The meanings of the variables are beneficial in extracting expert knowledge and representing information in natural language. The idea of dividing the problems into the variable-specific meanings and the directions of interactions provides various improvements for performance monitoring and decision making. The integrated temporal analysis and uncertainty processing facilitates the efficient use of domain expertise. Measurements can be monitored with generalized statistical process control (GSPC) based on the same scaling functions.
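As a generic illustration of the nonlinear-scaling idea (a raw feature such as a generalized norm mapped monotonically onto a common index scale), the sketch below uses a piecewise-linear map defined by corner points onto [-2, 2]. The corner values are invented and this is not the author's exact parameterisation.

```python
# Generic sketch: monotone piecewise-linear map of a raw feature onto a
# common [-2, 2] index scale. Corner points are illustrative only.
import numpy as np

corner_raw    = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # raw feature values
corner_scaled = np.array([-2., -1.,  0.,  1.,  2.])   # corresponding index values

def scale(x):
    """Map raw values onto the [-2, 2] index scale (saturates outside the corners)."""
    return np.interp(x, corner_raw, corner_scaled)

print(scale(np.array([0.7, 1.5, 3.0, 10.0])))   # values above 8.0 saturate at 2
```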
Aranda-Jan, Clara B; Mohutsiwa-Dibe, Neo; Loukanova, Svetla
2014-02-21
Access to mobile phone technology has rapidly expanded in developing countries. In Africa, mHealth is a relatively new concept and questions arise regarding reliability of the technology used for health outcomes. This review documents strengths, weaknesses, opportunities, and threats (SWOT) of mHealth projects in Africa. A systematic review of peer-reviewed literature on mHealth projects in Africa, between 2003 and 2013, was carried out using PubMed and OvidSP. Data was synthesized using a SWOT analysis methodology. Results were grouped to assess specific aspects of project implementation in terms of sustainability and mid/long-term results, integration to the health system, management process, scale-up and replication, and legal issues, regulations and standards. Forty-four studies on mHealth projects in Africa were included and classified as: "patient follow-up and medication adherence" (n = 19), "staff training, support and motivation" (n = 2), "staff evaluation, monitoring and guidelines compliance" (n = 4), "drug supply-chain and stock management" (n = 2), "patient education and awareness" (n = 1), "disease surveillance and intervention monitoring" (n = 4), "data collection/transfer and reporting" (n = 10) and "overview of mHealth projects" (n = 2). In general, mHealth projects demonstrate positive health-related outcomes and their success is based on the accessibility, acceptance and low-cost of the technology, effective adaptation to local contexts, strong stakeholder collaboration, and government involvement. Threats such as dependency on funding, unclear healthcare system responsibilities, unreliable infrastructure and lack of evidence on cost-effectiveness challenge their implementation. mHealth projects can potentially be scaled-up to help tackle problems faced by healthcare systems like poor management of drug stocks, weak surveillance and reporting systems or lack of resources. mHealth in Africa is an innovative approach to delivering health services. In this fast-growing technological field, research opportunities include assessing implications of scaling-up mHealth projects, evaluating cost-effectiveness and impacts on the overall health system.
Capello, Manuela; Robert, Marianne; Soria, Marc; Potin, Gael; Itano, David; Holland, Kim; Deneubourg, Jean-Louis; Dagorn, Laurent
2015-01-01
The rapid expansion of the use of passive acoustic telemetry technologies has facilitated unprecedented opportunities for studying the behavior of marine organisms in their natural environment. This technological advance would greatly benefit from the parallel development of dedicated methodologies accounting for the variety of timescales involved in the remote detection of tagged animals related to instrumental, environmental and behavioral events. In this paper we propose a methodological framework for estimating the site fidelity (“residence times”) of acoustic tagged animals at different timescales, based on the survival analysis of continuous residence times recorded at multiple receivers. Our approach is validated through modeling and applied on two distinct datasets obtained from a small coastal pelagic species (bigeye scad, Selar crumenophthalmus) and a large, offshore pelagic species (yellowfin tuna, Thunnus albacares), which show very distinct spatial scales of behavior. The methodological framework proposed herein allows estimating the most appropriate temporal scale for processing passive acoustic telemetry data depending on the scientific question of interest. Our method provides residence times free of the bias inherent to environmental and instrumental noise that can be used to study the small scale behavior of acoustic tagged animals. At larger timescales, it can effectively identify residence times that encompass the diel behavioral excursions of fish out of the acoustic detection range. This study provides a systematic framework for the analysis of passive acoustic telemetry data that can be employed for the comparative study of different species and study sites. The same methodology can be used each time discrete records of animal detections of any nature are employed for estimating the site fidelity of an animal at different timescales. PMID:26261985
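The survival-analysis step described above treats continuous residence times at a receiver as durations with departures as observed events; a Kaplan-Meier curve then summarises site fidelity. The sketch below uses synthetic durations and the lifelines library, not the authors' data or code.

```python
# Sketch: Kaplan-Meier survival of continuous residence times (synthetic data, minutes).
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(7)
durations = rng.exponential(scale=45.0, size=200)   # continuous residence times
departed  = rng.random(200) > 0.1                   # ~10% censored (tag still present)

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=departed)

print("median residence time (min):", round(kmf.median_survival_time_, 1))
print(kmf.survival_function_.head())
```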
Sheehan, Emma V; Stevens, Timothy F; Attrill, Martin J
2010-12-29
Following governments' policies to tackle global climate change, the development of offshore renewable energy sites is likely to increase substantially over coming years. All such developments interact with the seabed to some degree and so a key need exists for suitable methodology to monitor the impacts of large-scale Marine Renewable Energy Installations (MREIs). Many of these will be situated on mixed or rocky substrata, where conventional methods to characterise the habitat are unsuitable. Traditional destructive sampling is also inappropriate in conservation terms, particularly as safety zones around (MREIs) could function as Marine Protected Areas, with positive benefits for biodiversity. Here we describe a technique developed to effectively monitor the impact of MREIs and report the results of its field testing, enabling large areas to be surveyed accurately and cost-effectively. The methodology is based on a high-definition video camera, plus LED lights and laser scale markers, mounted on a "flying array" that maintains itself above the seabed grounded by a length of chain, thus causing minimal damage. Samples are taken by slow-speed tows of the gear behind a boat (200 m transects). The HD video and randomly selected frame grabs are analysed to quantify species distribution. The equipment was tested over two years in Lyme Bay, UK (25 m depth), then subsequently successfully deployed in demanding conditions at the deep (>50 m) high-energy Wave Hub site off Cornwall, UK, and a potential tidal stream energy site in Guernsey, Channel Islands (1.5 ms⁻¹ current), the first time remote samples from such a habitat have been achieved. The next stage in the monitoring development process is described, involving the use of Remote Operated Vehicles to survey the seabed post-deployment of MREI devices. The complete methodology provides the first quantitative, relatively non-destructive method for monitoring mixed-substrate benthic communities beneath MPAs and MREIs pre- and post-device deployment.
NASA Astrophysics Data System (ADS)
Hershkovitz, Yaron; Anker, Yaakov; Ben-Dor, Eyal; Schwartz, Guy; Gasith, Avital
2010-05-01
In-stream vegetation is a key ecosystem component in many fluvial ecosystems, having cascading effects on stream conditions and biotic structure. Traditionally, ground-level surveys (e.g. grid and transect analyses) are commonly used for estimating cover of aquatic macrophytes. Nonetheless, this methodological approach is highly time-consuming and usually yields information that is practically limited to habitat and sub-reach scales. In contrast, remote-sensing techniques (e.g. satellite imagery and airborne photography) enable collection of large datasets over section, stream and basin scales, in a relatively short time and at reasonable cost. However, the commonly used high spatial resolution (1 m) is often inadequate for examining aquatic vegetation on habitat or sub-reach scales. We examined the utility of a pseudo-spectral methodology using RGB digital photography for estimating the cover of in-stream vegetation in a small Mediterranean-climate stream. We compared this methodology with a traditional ground-level grid survey and with an airborne hyper-spectral remote sensing survey (AISA-ES). The study was conducted along a 2 km section of an intermittent stream (Taninim stream, Israel). When studied, the stream was dominated by patches of watercress (Nasturtium officinale) and mats of filamentous algae (Cladophora glomerata). The extent of vegetation cover at the habitat and section scales (10⁰ and 10⁴ m, respectively) was estimated by the pseudo-spectral methodology, using an airborne Roli camera with a Phase One P 45 (39 MP) CCD image acquisition unit. The swaths were taken at an elevation of about 460 m, giving a spatial resolution of about 4 cm at nadir. For measuring vegetation cover at the section scale (10⁴ m) we also used a 'push-broom' AISA-ES hyper-spectral swath with a sensor configuration of 182 bands (350-2500 nm) at an elevation of ca. 1,200 m (i.e. spatial resolution of ca. 1 m). Simultaneously with every swath, we used an Analytical Spectral Device (ASD) to measure hyper-spectral signatures (2150-band configuration; 350-2500 nm) of selected ground-level targets (located by GPS) of soil, water, vegetation (common reed, watercress, filamentous algae) and standard EVA foam colored sheets (red, green, blue, black and white). Processing and analysis of the data were performed on an ITT ENVI platform. The hyper-spectral image underwent radiometric calibration according to the flight and sensor calibration parameters on the CALIGEO platform, and the raw DN scale was converted into a radiance scale. A ground-level visual survey of vegetation cover and height was applied at the habitat scale (10⁰ m) by placing 1 m² netted grids (10 × 10 cm cells) along 'bank-to-bank' transects (in triplicate). Estimates of plant cover obtained by the pseudo-spectral methodology at the habitat scale were 35-61% for the watercress, 0.4-25% for the filamentous algae and 27-51% for plant-free patches. The respective estimates from the ground-level visual survey were 26-50%, 14-43% and 36-50%. The pseudo-spectral methodology also yielded estimates for the section scale (10⁴ m) of ca. 39% for the watercress, ca. 32% for the filamentous algae and 6% for plant-free patches. The respective estimates obtained from the hyper-spectral swath were 38%, 26% and 8%. Validation against ground-level measurements showed that the pseudo-spectral methodology gives reasonably good estimates of in-stream plant cover.
Therefore, this methodology can serve as a substitute for ground-level estimates at small stream scales and for the low-resolution hyper-spectral methodology at larger scales.
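A minimal sketch of the kind of per-pixel classification and percent-cover calculation such a pseudo-spectral approach implies. The excess-green index, the thresholds and the two vegetation classes are illustrative assumptions, not the calibrated classification used in the study.

```python
import numpy as np

def classify_rgb_cover(rgb, veg_threshold=0.05, green_red_split=0.0):
    """Classify an RGB orthophoto into coarse cover classes and report % cover.

    rgb: float array (H, W, 3) scaled to [0, 1]. Thresholds are placeholders.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    exg = 2.0 * g - r - b                      # excess-green vegetation index
    veg = exg > veg_threshold                  # vegetated vs plant-free pixels
    bright_green = (g - r) > green_red_split   # crude split within vegetation
    classes = np.zeros(rgb.shape[:2], dtype=int)   # 0 = plant-free
    classes[veg & bright_green] = 1                # e.g. filamentous algae mats
    classes[veg & ~bright_green] = 2               # e.g. watercress patches
    total = classes.size
    return {label: 100.0 * np.count_nonzero(classes == k) / total
            for k, label in enumerate(["plant-free", "algae", "watercress"])}

# Synthetic 100 x 100 pixel tile as a stand-in for a 4 cm resolution swath
rng = np.random.default_rng(0)
tile = rng.random((100, 100, 3))
print(classify_rgb_cover(tile))
```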
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, S. L.
1998-08-25
Fluid Catalytic Cracking (FCC) technology is the most important process used by the refinery industry to convert crude oil to valuable lighter products such as gasoline. Process development is generally very time-consuming, especially when a small pilot unit is being scaled up to a large commercial unit, because of the lack of information to aid in the design of scaled-up units. Such information can now be obtained by analysis based on pilot-scale measurements and computer simulation that includes the controlling physics of the FCC system. A computational fluid dynamics (CFD) code, ICRKFLO, has been developed at Argonne National Laboratory (ANL) and has been successfully applied to the simulation of catalytic petroleum cracking risers. It employs hybrid hydrodynamic-chemical kinetic coupling techniques, enabling the analysis of an FCC unit with complex chemical reaction sets containing tens or hundreds of subspecies. The code has been continuously validated based on pilot-scale experimental data. It is now being used to investigate the effects of scaling up FCC units. Among FCC operating conditions, the feed injection conditions are found to have a strong impact on the product yields of scaled-up FCC units. The feed injection conditions appear to affect flow and heat transfer patterns, and the interaction of hydrodynamics and cracking kinetics causes the product yields to change accordingly.
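To illustrate the kinetic side of such a hydrodynamic-kinetic coupling, here is a minimal sketch of a classic three-lump riser cracking model (gas oil to gasoline to light gas plus coke). The rate constants, the assumed second-order gas-oil cracking and the residence time are hypothetical and are not parameters of the ICRKFLO code.

```python
import numpy as np
from scipy.integrate import solve_ivp

k1, k2, k3 = 0.30, 0.10, 0.05   # 1/s, hypothetical, at a fixed riser temperature

def rates(t, y):
    gas_oil, gasoline, light_gas_coke = y
    r_go = (k1 + k3) * gas_oil**2           # second-order disappearance of gas oil
    r_gl = k1 * gas_oil**2 - k2 * gasoline  # gasoline formed, then overcracked
    r_lg = k3 * gas_oil**2 + k2 * gasoline
    return [-r_go, r_gl, r_lg]

sol = solve_ivp(rates, (0.0, 5.0), [1.0, 0.0, 0.0], dense_output=True)
t_exit = 5.0  # assumed riser residence time, s
go, gl, lg = sol.sol(t_exit)
print(f"conversion={1 - go:.2f}, gasoline yield={gl:.2f}, light gas + coke={lg:.2f}")
```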
A comparison of the psychometric properties of three cigarette withdrawal scales.
Etter, Jean-François; Hughes, John R
2006-03-01
To compare the psychometric properties of three cigarette withdrawal scales. An internet cohort study. A total of 4,644 current (44%), former (49%) and never (7%) smokers completed the three scales via the internet. A subsample completed the scales again after 14 days (n=1309) and indicated their smoking status after 42 days (n=1431). The Cigarette Withdrawal Scale (CWS), the Wisconsin Withdrawal Scale (WWS) and the Minnesota Withdrawal Form (MWF). All three scales covered the main elements in the Diagnostic and Statistical Manual version IV (DSM-IV) and the International Classification of Diseases version 10 (ICD-10) definitions of tobacco withdrawal, but the WWS did not cover weight gain. Factor analyses indicated that only six factors were present in the WWS, instead of the expected seven. Cronbach's alpha coefficients (0.76-0.93) were high for all scales. Test-retest coefficients were in the range of 0.66-0.86 for CWS and WWS, but were somewhat lower for some MWF items (range 0.52-0.80). In 324 ex-smokers who had quit smoking 31 days or less before baseline, craving predicted relapse at 14-day follow-up (CWS: odds ratio=1.53 per point, P=0.003; WWS: odds ratio=1.40, P=0.04; MWF: odds ratio=1.49, P=0.002). In 34 baseline smokers who had quit smoking by the 14-day retest, an increase in craving (WWS and MWF), depressed mood (MWF) and difficulty concentrating (WWS) between baseline and retest predicted relapse at 42-day follow-up. In terms of construct validity, the scales performed similarly, but performance on some key tests (e.g. that withdrawal will increase post-cessation) was inadequate, perhaps due to methodological limitations. No scale showed a decisive advantage over the others. The MWF has the advantage of brevity.
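Cronbach's alpha, used above as the internal-consistency statistic, is straightforward to compute from an item-score matrix. A minimal sketch with hypothetical withdrawal-item responses follows.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of total scores
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Hypothetical 5-point responses from six respondents to four withdrawal items
scores = np.array([
    [4, 4, 3, 4],
    [2, 1, 2, 2],
    [5, 4, 5, 4],
    [1, 2, 1, 1],
    [3, 3, 4, 3],
    [2, 2, 2, 3],
])
print(round(cronbach_alpha(scores), 3))
```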
Large-scale modeling of rain fields from a rain cell deterministic model
NASA Astrophysics Data System (ADS)
Féral, Laurent; Sauvageot, Henri; Castanet, Laurent; Lemorton, Joël; Cornet, Frédéric; Leconte, Katia
2006-04-01
A methodology to simulate two-dimensional rain rate fields at large scale (1000 × 1000 km², the scale of a satellite telecommunication beam or a terrestrial fixed broadband wireless access network) is proposed. It relies on a rain rate field cellular decomposition. At small scale (~20 × 20 km²), the rain field is split up into its macroscopic components, the rain cells, described by the Hybrid Cell (HYCELL) cellular model. At midscale (~150 × 150 km²), the rain field results from the conglomeration of rain cells modeled by HYCELL. To account for the rain cell spatial distribution at midscale, the latter is modeled by a doubly aggregative isotropic random walk, the optimal parameterization of which is derived from radar observations at midscale. The extension of the simulation area from the midscale to the large scale (1000 × 1000 km²) requires the modeling of the weather frontal area. The latter is first modeled by a Gaussian field with anisotropic covariance function. The Gaussian field is then turned into a binary field, giving the large-scale locations over which it is raining. This transformation requires the definition of the rain occupation rate over large-scale areas. Its probability distribution is determined from observations by the French operational radar network ARAMIS. The coupling with the rain field modeling at midscale is immediate whenever the large-scale field is split up into midscale subareas. The rain field thus generated accounts for the local CDF at each point, defining a structure spatially correlated at small scale, midscale, and large scale. It is then suggested that this approach be used by system designers to evaluate diversity gain, terrestrial path attenuation, or slant path attenuation for different azimuth and elevation angle directions.
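A minimal sketch of the Gaussian-to-binary step described above: a spatially correlated Gaussian field is thresholded so that the raining fraction matches a target occupation rate. The anisotropic smoothing lengths and the occupation rate are assumed values, and the Gaussian-filtered white noise is a crude surrogate for the anisotropic covariance model of the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def binary_rain_mask(shape=(200, 200), corr_sigma=(15, 5), occupation_rate=0.3, seed=0):
    """Large-scale rain/no-rain mask from a thresholded correlated Gaussian field."""
    rng = np.random.default_rng(seed)
    field = gaussian_filter(rng.standard_normal(shape), sigma=corr_sigma)
    threshold = np.quantile(field, 1.0 - occupation_rate)
    return field > threshold   # True where midscale rain-cell models would be placed

mask = binary_rain_mask()
print(f"simulated rain occupation rate: {mask.mean():.2f}")
```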
Mokel, Melissa Jennifer; Shellman, Juliette M
2013-01-01
Many instruments that measure religious involvement (a) contain unclear, poorly developed constructs; (b) lack methodological rigor in scale development; and (c) contain language and content culturally incongruent with the religious experiences of diverse ethnic groups. The primary aims of this review were to (a) synthesize the research on instruments designed to measure religious involvement, (b) evaluate the methodological quality of instruments that measure religious involvement, and (c) examine these instruments for conceptual congruency with African American religious involvement. An updated integrative research review method guided the process (Whittemore & Knafl, 2005). In total, 152 articles were reviewed and 23 articles retrieved. Only 3 retained instruments were developed under methodologically rigorous conditions. All 3 instruments were congruent with a conceptual model of African American religious involvement. The Fetzer Multidimensional Measure of Religious Involvement and Spirituality (FMMRS; Idler et al., 2003) was found to have favorable characteristics. Further examination and psychometric testing are warranted to determine its acceptability, readability, and cultural sensitivity in an African American population.
ERIC Educational Resources Information Center
Stormer, Ame; Harrison, Gail G.
2003-01-01
The development in the last decade of methodology for measuring and scaling household food insecurity and hunger in U.S. populations makes possible systematic examination of the ways in which hunger and food insecurity affect individuals and families. The impact on children has always been of primary concern for policy, advocacy, and science…
ERIC Educational Resources Information Center
Paribakht, T. Sima; Wesche, Marjorie Bingham
A study investigated the role of comprehension of meaningful language input in young adults' second language learning, focusing on: (1) what kinds of measurement instruments and procedures can be used in tracking student gains in specific aspects of target language proficiency; (2) development of a reliable self-report scale capturing different…
Jan C. Thomas; Eric V. Mueller; Simon Santamaria; Michael Gallagher; Mohamad El Houssami; Alexander Filkov; Kenneth Clark; Nicholas Skowronski; Rory M. Hadden; William Mell; Albert Simeoni
2017-01-01
An experimental approach has been developed to quantify the characteristics and flux of firebrands during a management-scale wildfire in a pine-dominated ecosystem. By characterizing the local fire behavior and measuring the temporal and spatial variation in firebrand collection, the flux of firebrands has been related to the fire behavior for the first time. This...
NASA Technical Reports Server (NTRS)
Bower, Chad; Padilla, Sebastian; Iacomini, Christie; Paul, Heather L.
2010-01-01
This paper details the validation of modeling methods for the three core components of a Metabolic heat regenerated Temperature Swing Adsorption (MTSA) subassembly, developed for use in a Portable Life Support System (PLSS). The first core component in the subassembly is a sorbent bed, used to capture and reject metabolically produced carbon dioxide (CO2). The sorbent bed performance can be augmented with a temperature swing driven by a liquid CO2 (LCO2) sublimation heat exchanger (SHX) for cooling the sorbent bed, and a condensing, icing heat exchanger (CIHX) for warming the sorbent bed. As part of the overall MTSA effort, scaled design validation test articles for each of these three components have been independently tested in laboratory conditions. Previously described modeling methodologies developed for implementation in Thermal Desktop and SINDA/FLUINT are reviewed and updated, their application in test article models outlined, and the results of those model correlations relayed. Assessment of the applicability of each modeling methodology to the challenge of simulating the response of the test articles, and of their extensibility to a full-scale integrated subassembly model, is given. The independently verified and validated modeling methods are applied to the development of an MTSA subassembly prototype model, and predictions of the subassembly performance are given. These models and modeling methodologies capture simulation of several challenging and novel physical phenomena in the Thermal Desktop and SINDA/FLUINT software suite. Novel methodologies include CO2 adsorption front tracking and associated thermal response in the sorbent bed, heat transfer associated with sublimation of entrained solid CO2 in the SHX, and water mass transfer in the form of ice as low as 210 K in the CIHX.
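As a back-of-the-envelope illustration of adsorption front tracking, the sketch below estimates the front speed and breakthrough time for a CO2 sorbent bed under a sharp-front, plug-flow, local-equilibrium assumption. The geometry, sorbent capacity and CO2 feed rate are hypothetical values, not MTSA design numbers, and the treatment ignores the thermal response that the actual models capture.

```python
# Sharp adsorption front: feed rate of CO2 divided by bed capacity per unit length
bed_length = 0.30          # m
bed_area = 2.0e-3          # m^2, flow cross-section
rho_bed = 700.0            # kg sorbent per m^3 of bed
q_sat = 0.08               # kg CO2 adsorbed per kg sorbent at working conditions
mdot_co2 = 1.0e-5          # kg/s of metabolically produced CO2 entering the bed

capacity_per_length = bed_area * rho_bed * q_sat      # kg CO2 per m of bed
front_speed = mdot_co2 / capacity_per_length          # m/s, sharp-front estimate
breakthrough_time = bed_length / front_speed          # s until CO2 exits the bed

print(f"front speed = {front_speed * 1e3:.3f} mm/s, "
      f"breakthrough after {breakthrough_time / 60:.1f} min")
```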
Harris, Joshua D; Erickson, Brandon J; Cvetanovich, Gregory L; Abrams, Geoffrey D; McCormick, Frank M; Gupta, Anil K; Verma, Nikhil N; Bach, Bernard R; Cole, Brian J
2014-02-01
Condition-specific questionnaires are important components in evaluation of outcomes of surgical interventions. No condition-specific study methodological quality questionnaire exists for evaluation of outcomes of articular cartilage surgery in the knee. To develop a reliable and valid knee articular cartilage-specific study methodological quality questionnaire. Cross-sectional study. A stepwise, a priori-designed framework was created for development of a novel questionnaire. Relevant items to the topic were identified and extracted from a recent systematic review of 194 investigations of knee articular cartilage surgery. In addition, relevant items from existing generic study methodological quality questionnaires were identified. Items for a preliminary questionnaire were generated. Redundant and irrelevant items were eliminated, and acceptable items modified. The instrument was pretested and items weighed. The instrument, the MARK score (Methodological quality of ARticular cartilage studies of the Knee), was tested for validity (criterion validity) and reliability (inter- and intraobserver). A 19-item, 3-domain MARK score was developed. The 100-point scale score demonstrated face validity (focus group of 8 orthopaedic surgeons) and criterion validity (strong correlation to Cochrane Quality Assessment score and Modified Coleman Methodology Score). Interobserver reliability for the overall score was good (intraclass correlation coefficient [ICC], 0.842), and for all individual items of the MARK score, acceptable to perfect (ICC, 0.70-1.000). Intraobserver reliability ICC assessed over a 3-week interval was strong for 2 reviewers (≥0.90). The MARK score is a valid and reliable knee articular cartilage condition-specific study methodological quality instrument. This condition-specific questionnaire may be used to evaluate the quality of studies reporting outcomes of articular cartilage surgery in the knee.
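The interobserver statistic reported above is an intraclass correlation coefficient. A minimal sketch of the standard two-way random, single-measure, absolute-agreement form (Shrout and Fleiss ICC(2,1)) is shown below; the reviewer scores are hypothetical and this is a generic reliability calculation, not a reproduction of the authors' analysis.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, single measure, absolute agreement.

    ratings: array (n_subjects, n_raters).
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    msr = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between subjects
    msc = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # between raters
    sse = ((x - grand) ** 2).sum() - (n - 1) * msr - (k - 1) * msc
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical 0-100 methodological quality scores from three reviewers for five studies
scores = [[72, 75, 70], [55, 58, 52], [88, 90, 85], [40, 44, 41], [66, 63, 68]]
print(round(icc_2_1(scores), 3))
```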
Manufacture of a human mesenchymal stem cell population using an automated cell culture platform.
Thomas, Robert James; Chandra, Amit; Liu, Yang; Hourd, Paul C; Conway, Paul P; Williams, David J
2007-09-01
Tissue engineering and regenerative medicine are rapidly developing fields that use cells or cell-based constructs as therapeutic products for a wide range of clinical applications. Efforts to commercialise these therapies are driving a need for capable, scalable manufacturing technologies to ensure therapies are able to meet regulatory requirements and are economically viable at industrial-scale production. We report the first automated expansion of a human bone marrow derived mesenchymal stem cell population (hMSCs) using a fully automated cell culture platform. Differences in cell population growth profile, attributed to key methodological differences, were observed between the automated protocol and a benchmark manual protocol. However, qualitatively similar cell output, assessed by cell morphology and the expression of typical hMSC markers, was obtained from both systems. Furthermore, the critical importance of minor process variation, e.g. the effect of cell seeding density on characteristics such as population growth kinetics and cell phenotype, was observed irrespective of protocol type. This work highlights the importance of careful process design in therapeutic cell manufacture and demonstrates the potential of automated culture for future optimisation and scale-up studies required for the translation of regenerative medicine products from the laboratory to the clinic.
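One simple way to compare population growth profiles between protocols is to compute the population doubling time over a passage. The sketch below uses hypothetical cell counts and passage duration purely for illustration.

```python
import math

def doubling_time(n0, n1, hours):
    """Population doubling time (hours) from counts at the start and end of a passage."""
    doublings = math.log2(n1 / n0)
    return hours / doublings

# Hypothetical counts for an automated and a manual flask over a 96 h passage
print(f"automated: {doubling_time(2.0e5, 1.1e6, 96):.1f} h per doubling")
print(f"manual:    {doubling_time(2.0e5, 1.5e6, 96):.1f} h per doubling")
```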
A Methodology and a Web Platform for the Collaborative Development of Context-Aware Systems
Martín, David; López-de-Ipiña, Diego; Alzua-Sorzabal, Aurkene; Lamsfus, Carlos; Torres-Manzanera, Emilio
2013-01-01
Information and services personalization is essential for an optimal user experience. Systems have to be able to acquire data about the user's context, process them in order to identify the user's situation and, finally, adapt the functionality of the system to that situation, but the development of context-aware systems is complex. Data coming from distributed and heterogeneous sources have to be acquired, processed and managed. Several programming frameworks have been proposed in order to simplify the development of context-aware systems. These frameworks offer high-level application programming interfaces aimed at programmers, which complicates the involvement of domain experts in the development life-cycle. The participation of users who do not have programming skills but are experts in the application domain can speed up and improve the development process of these kinds of systems. Apart from that, there is a lack of methodologies to guide the development process. This article presents, as its main contributions, the implementation and evaluation of a web platform and a methodology for the collaborative development of context-aware systems by programmers and domain experts. PMID:23666131
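A minimal sketch of the acquire-identify-adapt loop described above. The context attributes, situations and adaptations are illustrative only and do not reflect the platform's actual domain model or rule language.

```python
RULES = [
    # (condition over the context dict, identified situation, adaptation)
    (lambda c: c["location"] == "museum" and c["hour"] >= 18, "after-hours visit",
     "show closing times and nearby open attractions"),
    (lambda c: c["weather"] == "rain" and c["activity"] == "walking tour", "bad weather",
     "suggest indoor points of interest"),
]

def adapt(context):
    """Map an acquired context to a situation and a functional adaptation."""
    for condition, situation, adaptation in RULES:
        if condition(context):
            return situation, adaptation
    return "default", "show standard content"

print(adapt({"location": "museum", "hour": 19, "weather": "clear", "activity": "visit"}))
print(adapt({"location": "old town", "hour": 11, "weather": "rain", "activity": "walking tour"}))
```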
A methodology for rapid vehicle scaling and configuration space exploration
NASA Astrophysics Data System (ADS)
Balaba, Davis
2009-12-01
The Configuration-space Exploration and Scaling Methodology (CESM) entails the representation of component or sub-system geometries as matrices of points in 3D space. These typically large matrices are reduced using minimal convex sets or convex hulls. This reduction leads to significant gains in collision detection speed at minimal approximation expense. (The Gilbert-Johnson-Keerthi algorithm [79] is used for collision detection purposes in this methodology.) Once the components are laid out, their collective convex hull (hereafter referred to as the super-hull) is used to approximate the inner mold line of the minimum enclosing envelope of the vehicle concept. A sectional slicing algorithm is used to extract the sectional dimensions of this envelope. An offset is added to these dimensions to obtain the sectional fuselage dimensions. Once the lift and control surfaces are added, vehicle-level objective functions can be evaluated and compared to other designs. The size of the design space, coupled with the fact that some key constraints such as the number of collisions are discontinuous, dictates that a domain-spanning optimization routine be used. Also, as this is a conceptual design tool, the goal is to provide the designer with a diverse baseline geometry space from which to choose. For these reasons, a domain-spanning algorithm with counter-measures against speciation and genetic drift is the recommended optimization approach. The Non-dominated Sorting Genetic Algorithm (NSGA-II) [60] is shown to work well for the proof-of-concept study. There are two major reasons why the need to evaluate higher-fidelity, custom geometric scaling laws became a part of this body of work. First, historical-data-based regressions become implicitly unreliable when the vehicle concept in question is designed around a disruptive technology. Second, it was shown that simpler approaches such as photographic scaling can result in highly suboptimal concepts even for very small scaling factors. Yet good scaling information is critical to the success of any conceptual design process. In the CESM methodology, it is assumed that the new technology has matured enough to permit the prediction of the scaling behavior of the various subsystems in response to requirement changes. Updated subsystem geometry data are generated by applying the new requirement settings to the affected subsystems. All collisions are then eliminated using the NSGA-II algorithm. This is done while minimizing the adverse impact on the vehicle packing density. Once all collisions are eliminated, the vehicle geometry is reconstructed and system-level data such as fuselage volume can be harvested. This process is repeated for all requirement settings. Dimensional analysis and regression can be carried out using these data and all other pertinent metrics in the manner described by Mendez [124] and Segel [173]. The dominant parameters for each response show up in the dimensionally consistent groups that form the independent variables. More importantly, the impact of changes in any of these variables on system-level dependent variables can be easily and rapidly evaluated. In this way, the conceptual design process can be accelerated without sacrificing analysis accuracy. Scaling laws for take-off gross weight and fuselage volume as functions of fuel cell specific power and power density for a notional General Aviation vehicle are derived for the proof of concept.
CESM enables the designer to maintain design freedom by portably carrying multiple designs deeper into the design process. Also, since CESM is a bottom-up approach, all proposed baseline concepts are implicitly volumetrically feasible. System-level geometry parameters become fall-outs as opposed to inputs. This is a critical attribute as, without the benefit of experience, a designer would be hard-pressed to set the appropriate ranges for such parameters for a vehicle built around a disruptive technology. Furthermore, scaling laws generated from custom data for each concept are subject to less design noise than, say, regression-based approaches. Through these laws, key physics-based characteristics of vehicle subsystems such as energy density can be mapped onto key system-level metrics such as fuselage volume or take-off gross weight. These laws can then substitute for some historical-data-based analyses, thereby improving the fidelity of the analyses and reducing design time. (Abstract shortened by UMI.)
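A minimal sketch of two of the geometric steps described above: reducing each component point cloud to its convex-hull vertices and slicing the resulting super-hull into stations to recover approximate sectional envelope dimensions. The components, the slab-based slicing and the skin offset are illustrative simplifications; the dissertation's GJK collision detection and NSGA-II layout optimization are not reproduced here.

```python
import numpy as np
from scipy.spatial import ConvexHull

def hull_vertices(points):
    """Reduce a component point cloud to its convex-hull vertices."""
    points = np.asarray(points, dtype=float)
    return points[ConvexHull(points).vertices]

def sectional_envelope(component_points, n_stations=10, offset=0.05):
    """Slice the super-hull of all components along x and return per-station
    half-width/half-height of the enclosing envelope plus a skin offset."""
    super_pts = np.vstack([hull_vertices(p) for p in component_points])
    super_hull_pts = super_pts[ConvexHull(super_pts).vertices]
    x = super_hull_pts[:, 0]
    stations = np.linspace(x.min(), x.max(), n_stations + 1)
    sections = []
    for lo, hi in zip(stations[:-1], stations[1:]):
        slab = super_hull_pts[(x >= lo) & (x <= hi)]
        if len(slab) == 0:
            continue
        half_width = np.abs(slab[:, 1]).max() + offset
        half_height = np.abs(slab[:, 2]).max() + offset
        sections.append((0.5 * (lo + hi), half_width, half_height))
    return sections

# Three hypothetical box-like components laid out along the x axis
rng = np.random.default_rng(2)
components = [rng.random((50, 3)) + [i, 0.0, 0.0] for i in range(3)]
for xc, hw, hh in sectional_envelope(components):
    print(f"x={xc:.2f}  half-width={hw:.2f}  half-height={hh:.2f}")
```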
An analysis of IGBP global land-cover characterization process
Loveland, Thomas R.; Zhu, Zhiliang; Ohlen, Donald O.; Brown, Jesslyn F.; Reed, Bradley C.; Yang, Limin
1999-01-01
The International Geosphere-Biosphere Programme (IGBP) has called for the development of improved global land-cover data for use in increasingly sophisticated global environmental models. To meet this need, the staff of the U.S. Geological Survey and the University of Nebraska-Lincoln developed and applied a global land-cover characterization methodology using 1992-1993 1-km resolution Advanced Very High Resolution Radiometer (AVHRR) and other spatial data. The methodology, based on unsupervised classification with extensive postclassification refinement, yielded a multi-layer database consisting of eight land-cover data sets, descriptive attributes, and source data. An independent IGBP accuracy assessment reports a global accuracy of 73.5 percent, and continental results vary from 63 percent to 83 percent. Although data quality, methodology, interpreter performance, and logistics affected the results, significant problems were associated with the relationship between AVHRR data and fine-scale, spectrally similar land-cover patterns in complex natural or disturbed landscapes.
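A minimal sketch of the unsupervised-classification step at the core of such a workflow, using k-means on per-pixel seasonal NDVI profiles. The NDVI profiles are synthetic stand-ins for 1-km AVHRR composites and the cluster count is arbitrary; the extensive postclassification refinement described above is not represented.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
n_pixels, n_months = 5000, 12
seasonal = np.sin(np.linspace(0, 2 * np.pi, n_months))
profiles = np.vstack([
    0.2 + 0.05 * rng.standard_normal((n_pixels // 2, n_months)),                     # sparse cover
    0.5 + 0.3 * seasonal + 0.05 * rng.standard_normal((n_pixels // 2, n_months)),    # seasonal vegetation
])

labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(profiles)
for k in range(4):
    members = profiles[labels == k]
    print(f"cluster {k}: {len(members):5d} pixels, mean peak NDVI {members.max(axis=1).mean():.2f}")
```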
A toolbox to explore the mechanics of living embryonic tissues
Campàs, Otger
2016-01-01
The sculpting of embryonic tissues and organs into their functional morphologies involves the spatial and temporal regulation of mechanics at cell and tissue scales. Decades of in vitro work, complemented by some in vivo studies, have shown the relevance of mechanical cues in the control of cell behaviors that are central to developmental processes, but the lack of methodologies enabling precise, quantitative measurements of mechanical cues in vivo have hindered our understanding of the role of mechanics in embryonic development. Several methodologies are starting to enable quantitative studies of mechanics in vivo and in situ, opening new avenues to explore how mechanics contributes to shaping embryonic tissues and how it affects cell behavior within developing embryos. Here we review the present methodologies to study the role of mechanics in living embryonic tissues, considering their strengths and drawbacks as well as the conditions in which they are most suitable. PMID:27061360
He, Guizhen; Zhang, Lei; Lu, Yonglong
2009-09-01
Large-scale public infrastructure projects have featured in China's modernization course since the early 1980s. During the early stages of China's rapid economic development, public attention focused on the economic and social impact of high-profile construction projects. In recent years, however, we have seen a shift in public concern toward the environmental and ecological effects of such projects, and today governments are required to provide valid environmental impact assessments prior to allowing large-scale construction. The official requirement for the monitoring of environmental conditions has led to an increased number of debates in recent years regarding the effectiveness of Environmental Impact Assessments (EIAs) and Governmental Environmental Audits (GEAs) as environmental safeguards in instances of large-scale construction. Although EIA and GEA are conducted by different institutions and have different goals and enforcement potential, these two practices can be closely related in terms of methodology. This article cites the construction of the Qinghai-Tibet Railway as an instance in which EIA and GEA offer complementary approaches to environmental impact management. This study concludes that the GEA approach can serve as an effective follow-up to the EIA and establishes that the EIA lays a base for conducting future GEAs. The relationship that emerges through a study of the Railway's construction calls for more deliberate institutional arrangements and cooperation if the two practices are to be used in concert to optimal effect.
NASA Astrophysics Data System (ADS)
Carnell, E. J.; Misselbrook, T. H.; Dore, A. J.; Sutton, M. A.; Dragosits, U.
2017-09-01
The effects of atmospheric nitrogen (N) deposition are evident in terrestrial ecosystems worldwide, with eutrophication and acidification leading to significant changes in species composition. Substantial reductions in N deposition from nitrogen oxides emissions have been achieved in recent decades. By contrast, ammonia (NH3) emissions from agriculture have not decreased substantially and are typically highly spatially variable, making efficient mitigation challenging. One solution is to target NH3 mitigation measures spatially in source landscapes to maximize the benefits for nature conservation. The paper develops an approach to link national scale data and detailed local data to help identify suitable measures for spatial targeting of local sources near designated Special Areas of Conservation (SACs). The methodology combines high-resolution national data on emissions, deposition and source attribution with local data on agricultural management and site conditions. Application of the methodology for the full set of 240 SACs in England found that agriculture contributes ∼45 % of total N deposition. Activities associated with cattle farming represented 54 % of agricultural NH3 emissions within 2 km of the SACs, making them a major contributor to local N deposition, followed by mineral fertiliser application (21 %). Incorporation of local information on agricultural management practices at seven example SACs provided the means to correct outcomes compared with national-scale emission factors. The outcomes show how national scale datasets can provide information on N deposition threats at landscape to national scales, while local-scale information helps to understand the feasibility of mitigation measures, including the impact of detailed spatial targeting on N deposition rates to designated sites.
Arheart, Kristopher L; Sly, David F; Trapido, Edward J; Rodriguez, Richard D; Ellestad, Amy J
2004-11-01
To identify multi-item attitude/belief scales associated with the theoretical foundations of an anti-tobacco counter-marketing campaign and assess their reliability and validity. The data analyzed are from two state-wide, random, cross-sectional telephone surveys [n(S1)=1,079, n(S2)=1,150]. Items forming attitude/belief scales are identified using factor analysis. Reliability is assessed with Cronbach's alpha. Relationships among scales are explored using Pearson correlation. Validity is assessed by testing associations derived from the Centers for Disease Control and Prevention's (CDC) logic model for tobacco control program development and evaluation linking media exposure to attitudes/beliefs, and attitudes/beliefs to smoking-related behaviors. Adjusted odds ratios are employed for these analyses. Three factors emerged: traditional attitudes/beliefs about tobacco and tobacco use, tobacco industry manipulation and anti-tobacco empowerment. Reliability coefficients are in the range of 0.70 and vary little between age groups. The factors are correlated with one another as hypothesized. Associations between media exposure and the attitude/belief scales and between these scales and behaviors are consistent with the CDC logic model. Using reliable, valid multi-item scales is theoretically and methodologically more sound than employing single-item measures of attitudes/beliefs. Methodological, theoretical and practical implications are discussed.
Decision support for redesigning wastewater treatment technologies.
McConville, Jennifer R; Künzle, Rahel; Messmer, Ulrike; Udert, Kai M; Larsen, Tove A
2014-10-21
This paper offers a methodology for structuring the design space for innovative process engineering technology development. The methodology is exemplified in the evaluation of a wide variety of treatment technologies for source-separated domestic wastewater within the scope of the Reinvent the Toilet Challenge. It offers a methodology for narrowing down the decision-making field based on a strict interpretation of treatment objectives for undiluted urine and dry feces and macroenvironmental factors (STEEPLED analysis) which influence decision criteria. Such an evaluation identifies promising paths for technology development, such as focusing on space-saving processes or the need for more innovation in low-cost, energy-efficient urine treatment methods. Critical macroenvironmental factors, such as housing density, transportation infrastructure, and climate conditions, were found to affect technology decisions regarding reactor volume, weight of outputs, energy consumption, atmospheric emissions, investment cost, and net revenue. The analysis also identified a number of qualitative factors that should be carefully weighed when pursuing technology development, such as the availability of O&M resources, health and safety goals, and other ethical issues. Use of this methodology allows for coevolution of innovative technology within context constraints; however, for full-scale technology choices in the field, only very mature technologies can be evaluated.
Low Cost Manufacturing of Composite Cryotanks
NASA Technical Reports Server (NTRS)
Meredith, Brent; Palm, Tod; Deo, Ravi; Munafo, Paul M. (Technical Monitor)
2002-01-01
This viewgraph presentation reviews research and development of cryotank manufacturing conducted by Northrop Grumman. The objectives of the research and development included the development and validation of manufacturing processes and technology for fabrication of large-scale cryogenic tanks, the establishment of a scale-up and facilitization plan for full-scale cryotanks, the development of non-autoclave composite manufacturing processes, the fabrication of subscale tank joints for element tests, the performance of manufacturing risk reduction trials for the subscale tank, and the development of full-scale tank manufacturing concepts.
Advances and trends in computational structural mechanics
NASA Technical Reports Server (NTRS)
Noor, A. K.
1986-01-01
Recent developments in computational structural mechanics are reviewed with reference to computational needs for future structures technology, advances in computational models for material behavior, discrete element technology, assessment and control of numerical simulations of structural response, hybrid analysis, and techniques for large-scale optimization. Research areas in computational structural mechanics which have high potential for meeting future technological needs are identified. These include prediction and analysis of the failure of structural components made of new materials, development of computational strategies and solution methodologies for large-scale structural calculations, and assessment of reliability and adaptive improvement of response predictions.
Wind Tunnel to Atmospheric Mapping for Static Aeroelastic Scaling
NASA Technical Reports Server (NTRS)
Heeg, Jennifer; Spain, Charles V.; Rivera, J. A.
2004-01-01
Wind Tunnel to Atmospheric Mapping (WAM) is a methodology for scaling and testing a static aeroelastic wind tunnel model. The WAM procedure employs scaling laws to define a wind tunnel model and wind tunnel test points such that the static aeroelastic flight test data and wind tunnel data will be correlated throughout the test envelopes. This methodology extends the notion that a single test condition - a combination of Mach number and dynamic pressure - can be matched by wind tunnel data. The primary requirements for effecting this extension are matching flight Mach numbers, maintaining a constant dynamic pressure scale factor and setting the dynamic pressure scale factor in accordance with the stiffness scale factor. The scaling is enabled by capabilities of the NASA Langley Transonic Dynamics Tunnel (TDT) and by relaxation of scaling requirements present in the dynamic problem that are not critical to the static aeroelastic problem. The methodology is exercised in two example scaling problems: an arbitrarily scaled wing and a practical application to the scaling of the Active Aeroelastic Wing flight vehicle for testing in the TDT.
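A minimal sketch of the scale-factor arithmetic implied by such a scheme, under one common static aeroelastic similarity assumption (matching the nondimensional stiffness EI / (q L^4) between flight article and model); the length scale, dynamic pressure scale and flight stiffness used below are hypothetical and this is not necessarily the exact WAM formulation.

```python
# Static aeroelastic similarity sketch: q_scale tied to stiffness and length scales
length_scale = 1.0 / 6.0    # model span / flight span (assumed)
q_scale = 0.8               # tunnel q / flight q, held constant across test points

stiffness_scale = q_scale * length_scale ** 4    # (EI)_model / (EI)_flight

flight_EI = 2.5e6           # N*m^2, hypothetical wing bending stiffness
model_EI = flight_EI * stiffness_scale
print(f"required model bending stiffness: {model_EI:.1f} N*m^2")

# Mach number is matched directly, so each flight point maps to a tunnel point
flight_points = [(0.6, 9.0e3), (0.8, 14.0e3), (0.9, 18.0e3)]   # (Mach, q in Pa)
tunnel_points = [(m, q_scale * q) for m, q in flight_points]
print(tunnel_points)
```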
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heath, Garvin A.
The overall objective of the Research Partnership to Secure Energy for America (RPSEA)-funded research project is to develop independent estimates of methane emissions using top-down and bottom-up measurement approaches and then to compare the estimates, including consideration of uncertainty. Such approaches will be applied at two scales: basin and facility. At facility scale, multiple methods will be used to measure methane emissions of the whole facility (controlled dual tracer and single tracer releases, aircraft-based mass balance and Gaussian back-trajectory), which are considered top-down approaches. The bottom-up approach will sum emissions from identified point sources measured using appropriate source-level measurement techniques (e.g., high-flow meters). At basin scale, the top-down estimate will come from boundary layer airborne measurements upwind and downwind of the basin, using a regional mass balance model plus approaches to separate atmospheric methane emissions attributed to the oil and gas sector. The bottom-up estimate will result from statistical modeling (also known as scaling up) of measurements made at selected facilities, with gaps filled through measurements and other estimates based on other studies. The relative comparison of the bottom-up and top-down estimates made at both scales will help improve understanding of the accuracy of the tested measurement and modeling approaches. The subject of this CRADA is NREL's contribution to the overall project. This project resulted from winning a competitive solicitation no. RPSEA RFP2012UN001, proposal no. 12122-95, which is the basis for the overall project. This Joint Work Statement (JWS) details the contributions of NREL and Colorado School of Mines (CSM) in performance of the CRADA effort.
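To illustrate the comparison at facility scale, here is a minimal sketch of a top-down box-model mass-balance estimate set against a bottom-up sum of source-level measurements. Every number (wind speed, plume geometry, enhancement, point-source rates) is hypothetical and the single-box treatment is a deliberate simplification of the approaches named above.

```python
import numpy as np

# Top-down: flux through a downwind plane spanned by the plume
wind_speed = 4.0          # m/s, mean transect wind
plume_width = 300.0       # m, crosswind extent of the downwind enhancement
mixing_height = 150.0     # m, assumed well-mixed layer depth
delta_ch4 = 60e-9         # mol CH4 per mol air enhancement (downwind - upwind)
air_molar_density = 41.6  # mol/m^3 near the surface
M_CH4 = 16.04e-3          # kg/mol

top_down = (wind_speed * plume_width * mixing_height
            * delta_ch4 * air_molar_density * M_CH4)            # kg/s

# Bottom-up: sum of individually measured point sources (kg/h converted to kg/s)
point_sources = np.array([0.8, 2.1, 0.4, 1.3, 0.9]) / 3600.0
bottom_up = point_sources.sum()

print(f"top-down  ~ {top_down * 3600:.1f} kg/h")
print(f"bottom-up ~ {bottom_up * 3600:.1f} kg/h")
print(f"ratio top-down / bottom-up = {top_down / bottom_up:.2f}")
```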
Multi-risk assessment at a national level in Georgia
NASA Astrophysics Data System (ADS)
Tsereteli, Nino; Varazanashvili, Otar; Amiranashvili, Avtandil; Tsereteli, Emili; Elizbarashvili, Elizbar; Saluqvadze, Manana; Dolodze, Jemal
2013-04-01
The work presented here was initiated by the national GNSF project "Reducing natural disasters multiple risk: a positive factor for Georgia development" and two international projects: NATO SFP 983038 "Seismic hazard and risk assessment for Southern Caucasus-Eastern Turkey Energy Corridors" and EMME "Earthquake Model for Middle East Region". A methodology for the estimation of "general" vulnerability, hazards and multiple risk from natural hazards (namely earthquakes, landslides, snow avalanches, flash floods, mudflows, drought, hurricanes, frost and hail) was developed for Georgia. Detailed electronic databases of natural disasters were created. These databases contain the parameters of the hazardous phenomena that caused natural disasters. The magnitude and intensity scales of the mentioned disasters are reviewed, and new magnitude and intensity scales are suggested for disasters for which the corresponding formalization has not yet been performed. The associated economic losses were evaluated and presented in monetary terms for these hazards. Based on the hazard inventory, an approach was developed that allowed the calculation of an overall vulnerability value for each individual hazard type, using Gross Domestic Product per unit area (applied to population) as the indicator for the elements at risk exposed. The correlation between estimated economic losses, physical exposure and magnitude for each of the six types of hazards has been investigated in detail using multiple linear regression analysis. Economic losses for all past events and historical vulnerability were estimated. Finally, the spatial distribution of general vulnerability was assessed, the expected maximum economic loss was calculated, and a multi-risk map was set up.
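A minimal sketch of the multiple linear regression step described above, regressing log economic loss on log physical exposure and event magnitude for one hazard type. The synthetic event data and coefficients are illustrative only and do not reproduce the study's results.

```python
import numpy as np

rng = np.random.default_rng(7)
n_events = 40
log_exposure = rng.uniform(2.0, 6.0, n_events)   # log10 of GDP-based exposure (synthetic)
magnitude = rng.uniform(4.0, 7.0, n_events)      # hazard magnitude/intensity (synthetic)
log_loss = 0.8 * log_exposure + 0.6 * magnitude - 3.0 + 0.3 * rng.standard_normal(n_events)

# Ordinary least squares fit: log_loss ~ intercept + b1*log_exposure + b2*magnitude
X = np.column_stack([np.ones(n_events), log_exposure, magnitude])
coef, residuals, *_ = np.linalg.lstsq(X, log_loss, rcond=None)
print(f"intercept={coef[0]:.2f}, exposure coef={coef[1]:.2f}, magnitude coef={coef[2]:.2f}")
```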
Park, Hyojung; Shin, Sunhwa
2015-12-01
The purpose of this study was to develop and test a semantic differential scale of sexual attitudes for older people in Korea. The scale was based on items derived from a literature review and focus group interviews. A methodological study was used to test the reliability and validity of the instrument. A total of 368 older men and women were recruited to complete the semantic differential scale. Fifteen pairs of adjective ratings were extracted through factor analysis. Total variance explained was 63.40%. To test for construct validity, group comparisons were implemented. The total score of sexual attitudes showed significant differences depending on gender and availability of sexual activity. Cronbach's alpha coefficient for internal consistency was 0.96. The findings of this study demonstrate that the semantic differential scale of sexual attitude is a reliable and valid instrument. © 2015 Wiley Publishing Asia Pty Ltd.
Representativeness-based sampling network design for the State of Alaska
Forrest M. Hoffman; Jitendra Kumar; Richard T. Mills; William W. Hargrove
2013-01-01
Resource and logistical constraints limit the frequency and extent of environmental observations, particularly in the Arctic, necessitating the development of a systematic sampling strategy to maximize coverage and objectively represent environmental variability at desired scales. A quantitative methodology for stratifying sampling domains, informing site selection,...
Status of the Flooding Fragility Testing Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, C. L.; Savage, B.; Bhandari, B.
2016-06-01
This report provides an update on research addressing nuclear power plant component reliability under flooding conditions. The research includes use of the Component Flooding Evaluation Laboratory (CFEL), where individual components and component subassemblies will be tested to failure under various flooding conditions. The resulting component reliability data can then be incorporated with risk simulation strategies to provide a more thorough representation of overall plant risk. The CFEL development strategy consists of four interleaved phases. Phase 1 addresses design and application of CFEL with water rise and water spray capabilities, allowing testing of passive and active components including fully electrified components. Phase 2 addresses research into wave generation techniques, followed by the design and addition of the wave generation capability to CFEL. Phase 3 addresses methodology development activities, including small-scale component testing, development of full-scale component testing protocol, and simulation techniques including Smoothed Particle Hydrodynamics (SPH)-based computer codes. Phase 4 involves full-scale component testing, including work on full-scale component testing in a surrogate CFEL testing apparatus.