An approach to quantitative sustainability assessment in the early stages of process design.
Tugnoli, Alessandro; Santarelli, Francesco; Cozzani, Valerio
2008-06-15
A procedure was developed for the quantitative assessment of key performance indicators suitable for the sustainability analysis of alternative processes, mainly addressing the early stages of process design. The methodology was based on the calculation of a set of normalized impact indices allowing a direct comparison of the additional burden of each process alternative on a selected reference area. Innovative reference criteria were developed to compare and aggregate the impact indicators on the basis of the site-specific impact burden and sustainability policy. An aggregation procedure also allows the calculation of overall sustainability performance indicators and of an "impact fingerprint" of each process alternative. The final aim of the method is to support the decision making process during process development, providing a straightforward assessment of the expected sustainability performances. The application of the methodology to case studies concerning alternative waste disposal processes allowed a preliminary screening of the expected critical sustainability impacts of each process. The methodology was shown to provide useful results to address sustainability issues in the early stages of process design.
A prototype software methodology for the rapid evaluation of biomanufacturing process options.
Chhatre, Sunil; Francis, Richard; O'Donovan, Kieran; Titchener-Hooker, Nigel J; Newcombe, Anthony R; Keshavarz-Moore, Eli
2007-10-01
A three-layered simulation methodology is described that rapidly evaluates biomanufacturing process options. In each layer, inferior options are screened out, while more promising candidates are evaluated further in the subsequent, more refined layer, which uses more rigorous models that require more data from time-consuming experimentation. Screening ensures laboratory studies are focused only on options showing the greatest potential. To simplify the screening, outputs of production level, cost and time are combined into a single value using multi-attribute-decision-making techniques. The methodology was illustrated by evaluating alternatives to an FDA (U.S. Food and Drug Administration)-approved process manufacturing rattlesnake antivenom. Currently, antivenom antibodies are recovered from ovine serum by precipitation/centrifugation and proteolyzed before chromatographic purification. Alternatives included increasing the feed volume, replacing centrifugation with microfiltration and replacing precipitation/centrifugation with a Protein G column. The best alternative used a higher feed volume and a Protein G step. By rapidly evaluating the attractiveness of options, the methodology facilitates efficient and cost-effective process development.
Remedial Action Assessment System: A computer-based methodology for conducting feasibility studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, M.K.; Buelt, J.L.; Stottlemyre, J.A.
1991-02-01
Because of the complexity and number of potential waste sites facing the US Department of Energy (DOE) for potential cleanup, DOE is supporting the development of a computer-based methodology to streamline the remedial investigation/feasibility study process. The Remedial Action Assessment System (RAAS), can be used for screening, linking, and evaluating established technology processes in support of conducting feasibility studies. It is also intended to do the same in support of corrective measures studies. The user interface employs menus, windows, help features, and graphical information while RAAS is in operation. Object-oriented programming is used to link unit processes into sets ofmore » compatible processes that form appropriate remedial alternatives. Once the remedial alternatives are formed, the RAAS methodology can evaluate them in terms of effectiveness, implementability, and cost. RAAS will access a user-selected risk assessment code to determine the reduction of risk after remedial action by each recommended alternative. The methodology will also help determine the implementability of the remedial alternatives at a site and access cost estimating tools to provide estimates of capital, operating, and maintenance costs. This paper presents the characteristics of two RAAS prototypes currently being developed. These include the RAAS Technology Information System, which accesses graphical, tabular and textual information about technologies, and the main RAAS methodology, which screens, links, and evaluates remedial technologies. 4 refs., 3 figs., 1 tab.« less
A methodology for the comparative evaluation of alternative bioseparation technologies.
Tran, Richard; Zhou, Yuhong; Lacki, Karol M; Titchener-Hooker, Nigel J
2008-01-01
Advances in upstream technologies and growing commercial demand have led to cell culture processes of ever larger volumes and expressing at higher product titers. This has increased the burden on downstream processing. Concerns regarding the capacity limitations of packed-bed chromatography have led process engineers to begin investigating new bioseparation techniques that may be considered as "alternatives" to chromatography, and which could potentially offer higher processing capacities but at a lower cost. With the wide range of alternatives, which are currently available, each with their own strengths and inherent limitations, coupled with the time pressures associated with process development, the challenge for process engineers is to determine which technologies are most worth investigating. This article presents a methodology based on a multiattribute decision making (MADM) analysis approach, utilizing both quantitative and qualitative data, which can be used to determine the "industrial attractiveness" of bioseparation technologies, accounting for trade-offs between their strengths and weaknesses. By including packed-bed chromatography in the analysis as a reference point, it was possible to determine the alternatives, which show the most promise for use in large-scale manufacturing processes. The results of this analysis show that although the majority of alternative techniques offer certain advantages over conventional packed-bed chromatography, their attractiveness overall means that currently none of these technologies may be considered as viable alternatives to chromatography. The methodology introduced in this study may be used to gain significant quantitative insight as to the key areas in which improvements are required for each technique, and thus may be used as a tool to aid in further technological development.
Risk Assessment Methodology for Hazardous Waste Management (1998)
A methodology is described for systematically assessing and comparing the risks to human health and the environment of hazardous waste management alternatives. The methodology selects and links appropriate models and techniques for performing the process.
Alternative mRNA polyadenylation in eukaryotes: an effective regulator of gene expression
Lutz, Carol S.; Moreira, Alexandra
2010-01-01
Alternative RNA processing mechanisms, including alternative splicing and alternative polyadenylation, are increasingly recognized as important regulators of gene expression. This article will focus on what has recently been described about alternative polyadenylation in development, differentiation, and disease in higher eukaryotes. We will also describe how the evolving global methodologies for examining the cellular transcriptome, both experimental and bioinformatic, are revealing new details about the complex nature of alternative 3′ end formation, as well as interactions with other RNA-mediated and RNA processing mechanisms. PMID:21278855
Conceptual Chemical Process Design for Sustainability.
This chapter examines the sustainable design of chemical processes, with a focus on conceptual design, hierarchical and short-cut methods, and analyses of process sustainability for alternatives. The chapter describes a methodology for incorporating process sustainability analyse...
Angelis, Aris; Kanavos, Panos
2016-05-01
In recent years, multiple criteria decision analysis (MCDA) has emerged as a likely alternative to address shortcomings in health technology assessment (HTA) by offering a more holistic perspective to value assessment and acting as an alternative priority setting tool. In this paper, we argue that MCDA needs to subscribe to robust methodological processes related to the selection of objectives, criteria and attributes in order to be meaningful in the context of healthcare decision making and fulfil its role in value-based assessment (VBA). We propose a methodological process, based on multi-attribute value theory (MAVT) methods comprising five distinct phases, outline the stages involved in each phase and discuss their relevance in the HTA process. Importantly, criteria and attributes need to satisfy a set of desired properties, otherwise the outcome of the analysis can produce spurious results and misleading recommendations. Assuming the methodological process we propose is adhered to, the application of MCDA presents three very distinct advantages to decision makers in the context of HTA and VBA: first, it acts as an instrument for eliciting preferences on the performance of alternative options across a wider set of explicit criteria, leading to a more complete assessment of value; second, it allows the elicitation of preferences across the criteria themselves to reflect differences in their relative importance; and, third, the entire process of preference elicitation can be informed by direct stakeholder engagement, and can therefore reflect their own preferences. All features are fully transparent and facilitate decision making.
ERIC Educational Resources Information Center
Perkinson, Henry
1978-01-01
Describes the theories of Karl Popper regarding scientific knowledge and scientific methodology; tells how the Popper-Darwinian theory of growth of knowledge offers an alternative nonauthoritarian conception of the educational process, and thus an alternative conception of the functions of the teacher and the school. (GT)
A hierarchical modeling methodology for the definition and selection of requirements
NASA Astrophysics Data System (ADS)
Dufresne, Stephane
This dissertation describes the development of a requirements analysis methodology that takes into account the concept of operations and the hierarchical decomposition of aerospace systems. At the core of the methodology, the Analytic Network Process (ANP) is used to ensure the traceability between the qualitative and quantitative information present in the hierarchical model. The proposed methodology is implemented to the requirements definition of a hurricane tracker Unmanned Aerial Vehicle. Three research objectives are identified in this work; (1) improve the requirements mapping process by matching the stakeholder expectations with the concept of operations, systems and available resources; (2) reduce the epistemic uncertainty surrounding the requirements and requirements mapping; and (3) improve the requirements down-selection process by taking into account the level of importance of the criteria and the available resources. Several challenges are associated with the identification and definition of requirements. The complexity of the system implies that a large number of requirements are needed to define the systems. These requirements are defined early in the conceptual design, where the level of knowledge is relatively low and the level of uncertainty is large. The proposed methodology intends to increase the level of knowledge and reduce the level of uncertainty by guiding the design team through a structured process. To address these challenges, a new methodology is created to flow-down the requirements from the stakeholder expectations to the systems alternatives. A taxonomy of requirements is created to classify the information gathered during the problem definition. Subsequently, the operational and systems functions and measures of effectiveness are integrated to a hierarchical model to allow the traceability of the information. Monte Carlo methods are used to evaluate the variations of the hierarchical model elements and consequently reduce the epistemic uncertainty. The proposed methodology is applied to the design of a hurricane tracker Unmanned Aerial Vehicles to demonstrate the origin and impact of requirements on the concept of operations and systems alternatives. This research demonstrates that the hierarchical modeling methodology provides a traceable flow-down of the requirements from the problem definition to the systems alternatives phases of conceptual design.
Emergy Analysis for the Sustainable Utilization of Biosolids ...
This contribution describes the application of an emergy-based methodology for comparing two management alternatives of biosolids produced in a wastewater treatment plant. The current management practice of using biosolids as soil fertilizers was evaluated and compared to another alternative, the recovery of energy from the biosolid gasification process. This emergy assessment and comparison approach identifies more sustainable processes which achieve economic and social benefits with a minimal environmental impact. In addition, emergy-based sustainability indicators and the GREENSCOPE methodology were used to compare the two biosolid management alternatives. According to the sustainability assessment results, the energy production from biosolid gasification is energetically profitable, economically viable, and environmentally suitable. Furthermore, it was found that the current use of biosolids as soil fertilizer does not generate any considerable environmental stress, has the potential to achieve more economic benefits, and a post-processing of biosolids prior to its use as soil fertilizer improves its sustainability performance. In conclusion, this emergy analysis provides a sustainability assessment of both alternatives of biosolid management and helps decision-makers to identify opportunities for improvement during the current process of biosolid management. This work aims to identify the best option for the use and management of biosolids generated in a wa
Comparing Alternatives For Replacing Harmful Chemicals
NASA Technical Reports Server (NTRS)
Cruit, W.; Schutzenhofer, S.; Goldberg, B.; Everhart, K.
1995-01-01
Methodology developed to provide guidance for replacement of industrial chemicals that must be phased out by law because they are toxic and/or affect environment adversely. Chemicals and processes ranked numerically. Applies mostly to chemicals contributing to depletion of ozone in upper atmosphere; some other harmful chemicals included. Quality function deployment matrix format provides convenient way to compare alternative processes and chemicals. Overall rating at bottom of each process-and-chemical column indicates relative advantage.
The Use of Multi-Criteria Evaluation and Network Analysis in the Area Development Planning Process
2013-03-01
layouts. The alternative layout scoring process, base in multi-criteria evaluation, returns a quantitative score for each alternative layout and a...The purpose of this research was to develop improvements to the area development planning process. These plans are used to improve operations within...an installation sub-section by altering the physical layout of facilities. One methodology was developed based on apply network analysis concepts to
Jacobs, Molly M.; Malloy, Timothy F.; Tickner, Joel A.; Edwards, Sally
2015-01-01
Background Given increasing pressures for hazardous chemical replacement, there is growing interest in alternatives assessment to avoid substituting a toxic chemical with another of equal or greater concern. Alternatives assessment is a process for identifying, comparing, and selecting safer alternatives to chemicals of concern (including those used in materials, processes, or technologies) on the basis of their hazards, performance, and economic viability. Objectives The purposes of this substantive review of alternatives assessment frameworks are to identify consistencies and differences in methods and to outline needs for research and collaboration to advance science policy practice. Methods This review compares methods used in six core components of these frameworks: hazard assessment, exposure characterization, life-cycle impacts, technical feasibility evaluation, economic feasibility assessment, and decision making. Alternatives assessment frameworks published from 1990 to 2014 were included. Results Twenty frameworks were reviewed. The frameworks were consistent in terms of general process steps, but some differences were identified in the end points addressed. Methodological gaps were identified in the exposure characterization, life-cycle assessment, and decision–analysis components. Methods for addressing data gaps remain an issue. Discussion Greater consistency in methods and evaluation metrics is needed but with sufficient flexibility to allow the process to be adapted to different decision contexts. Conclusion Although alternatives assessment is becoming an important science policy field, there is a need for increased cross-disciplinary collaboration to refine methodologies in support of the informed substitution and design of safer chemicals, materials, and products. Case studies can provide concrete lessons to improve alternatives assessment. Citation Jacobs MM, Malloy TF, Tickner JA, Edwards S. 2016. Alternatives assessment frameworks: research needs for the informed substitution of hazardous chemicals. Environ Health Perspect 124:265–280; http://dx.doi.org/10.1289/ehp.1409581 PMID:26339778
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-03
... determine endpoints; questionnaire design and analyses; and presentation of survey results. To date, FDA has..., the workshop will invest considerable time in identifying best methodological practices for conducting... sample, sample size, question design, process, and endpoints. Panel 2 will focus on alternatives to...
ERIC Educational Resources Information Center
Pustejovsky, James E.; Runyon, Christopher
2014-01-01
Direct observation recording procedures produce reductive summary measurements of an underlying stream of behavior. Previous methodological studies of these recording procedures have employed simulation methods for generating random behavior streams, many of which amount to special cases of a statistical model known as the alternating renewal…
Characterizing Postural Sway during Quiet Stance Based on the Intermittent Control Hypothesis
NASA Astrophysics Data System (ADS)
Nomura, Taishin; Nakamura, Toru; Fukada, Kei; Sakoda, Saburo
2007-07-01
This article illustrates a signal processing methodology for the time series of postural sway and accompanied electromyographs from the lower limb muscles during quiet stance. It was shown that the proposed methodology was capable of identifying the underlying postural control mechanisms. A preliminary application of the methodology provided evidence that supports the intermittent control hypothesis alternative to the conventional stiffness control hypothesis during human quiet upright stance.
ERIC Educational Resources Information Center
Duhon-Haynes, Gwendolyn; And Others
This paper examines alternative certification programs in terms of entrance requirements, supervision and mentoring, and post-certification professional support. A good alternative program uses rigorous screening processes to ensure the selection of qualified teacher interns; provides high-quality preservice training in methodology, classroom…
Multiple reaction monitoring (MRM) of plasma proteins in cardiovascular proteomics.
Dardé, Verónica M; Barderas, Maria G; Vivanco, Fernando
2013-01-01
Different methodologies have been used through years to discover new potential biomarkers related with cardiovascular risk. The conventional proteomic strategy involves a discovery phase that requires the use of mass spectrometry (MS) and a validation phase, usually on an alternative platform such as immunoassays that can be further implemented in clinical practice. This approach is suitable for a single biomarker, but when large panels of biomarkers must be validated, the process becomes inefficient and costly. Therefore, it is essential to find an alternative methodology to perform the biomarker discovery, validation, and -quantification. The skills provided by quantitative MS turn it into an extremely attractive alternative to antibody-based technologies. Although it has been traditionally used for quantification of small molecules in clinical chemistry, MRM is now emerging as an alternative to traditional immunoassays for candidate protein biomarker validation.
Training for Environmental Impact Assessment (E.I.A.).
ERIC Educational Resources Information Center
Vougias, S.
1988-01-01
Deals with the methodology and practices for Environmental Impact Assessment (EIA). Describes the EIA process, prediction process, alternative assessment methods, training needs, major activities, training provision and material, main deficiencies and the precautions, and real world training examples. (Author/YP)
Recovery and purification process development for monoclonal antibody production
Ma, Junfen; Winter, Charles; Bayer, Robert
2010-01-01
Hundreds of therapeutic monoclonal antibodies (mAbs) are currently in development, and many companies have multiple antibodies in their pipelines. Current methodology used in recovery processes for these molecules are reviewed here. Basic unit operations such as harvest, Protein A affinity chromatography and additional polishing steps are surveyed. Alternative processes such as flocculation, precipitation and membrane chromatography are discussed. We also cover platform approaches to purification methods development, use of high throughput screening methods, and offer a view on future developments in purification methodology as applied to mAbs. PMID:20647768
[The grounded theory as a methodological alternative for nursing research].
dos Santos, Sérgio Ribeiro; da Nóbrega, Maria Miriam
2002-01-01
This study presents a method of interpretative and systematic research with appliance to the development of studies in nursing called "the grounded theory", whose theoretical support is the symbolic interactionism. The purpose of the paper is to describe the grounded theory as an alternative methodology for the construction of knowledge in nursing. The study highlights four topics: the basic principle, the basic concepts, the trajectory of the method and the process of analysis of the data. We conclude that the systematization of data and its interpretation, based on social actors' experience, constitute strong subsidies to generate theories through this research tool.
ERIC Educational Resources Information Center
Streff, Robert James
2016-01-01
Studies have shown that not all students are assessed effectively using standard testing formats. However, it is unclear what alternative methodology would be useful to determine whether students have acquired the skills necessary for today's global market. This research study's purpose was to understand the processes instructors use when choosing…
ERIC Educational Resources Information Center
Wong, Chee Leong; Chu, Hye-Eun; Yap, Kueh Chin
2016-01-01
Currently, there is no agreement among scientists and science educators on whether heat should be defined as a "process of energy transfer" or "form of energy." For example, students may conceive of heat as "molecular kinetic energy," but the interpretation of this alternative conception is dependent on educational…
Experimental Learning Enhancing Improvisation Skills
ERIC Educational Resources Information Center
Pereira Christopoulos, Tania; Wilner, Adriana; Trindade Bestetti, Maria Luisa
2016-01-01
Purpose: This study aims to present improvisation training and experimentation as an alternative method to deal with unexpected events in which structured processes do not seem to work. Design/Methodology/Approach: Based on the literature of sensemaking and improvisation, the study designs a framework and process model of experimental learning…
Alternative Fuels Data Center: Vehicle Cost Calculator Assumptions and
Center: Vehicle Cost Calculator Assumptions and Methodology on Facebook Tweet about Alternative Fuels Data Center: Vehicle Cost Calculator Assumptions and Methodology on Twitter Bookmark Alternative Fuels Data Center: Vehicle Cost Calculator Assumptions and Methodology on Google Bookmark Alternative Fuels
Locally optimal extracellular stimulation for chaotic desynchronization of neural populations.
Wilson, Dan; Moehlis, Jeff
2014-10-01
We use optimal control theory to design a methodology to find locally optimal stimuli for desynchronization of a model of neurons with extracellular stimulation. This methodology yields stimuli which lead to positive Lyapunov exponents, and hence desynchronizes a neural population. We analyze this methodology in the presence of interneuron coupling to make predictions about the strength of stimulation required to overcome synchronizing effects of coupling. This methodology suggests a powerful alternative to pulsatile stimuli for deep brain stimulation as it uses less energy than pulsatile stimuli, and could eliminate the time consuming tuning process.
A Methodology for Robust Comparative Life Cycle Assessments Incorporating Uncertainty.
Gregory, Jeremy R; Noshadravan, Arash; Olivetti, Elsa A; Kirchain, Randolph E
2016-06-21
We propose a methodology for conducting robust comparative life cycle assessments (LCA) by leveraging uncertainty. The method evaluates a broad range of the possible scenario space in a probabilistic fashion while simultaneously considering uncertainty in input data. The method is intended to ascertain which scenarios have a definitive environmentally preferable choice among the alternatives being compared and the significance of the differences given uncertainty in the parameters, which parameters have the most influence on this difference, and how we can identify the resolvable scenarios (where one alternative in the comparison has a clearly lower environmental impact). This is accomplished via an aggregated probabilistic scenario-aware analysis, followed by an assessment of which scenarios have resolvable alternatives. Decision-tree partitioning algorithms are used to isolate meaningful scenario groups. In instances where the alternatives cannot be resolved for scenarios of interest, influential parameters are identified using sensitivity analysis. If those parameters can be refined, the process can be iterated using the refined parameters. We also present definitions of uncertainty quantities that have not been applied in the field of LCA and approaches for characterizing uncertainty in those quantities. We then demonstrate the methodology through a case study of pavements.
An industrial ecology approach to municipal solid waste ...
Municipal solid waste (MSW) can be viewed as a feedstock for industrial ecology inspired conversions of wastes to valuable products and energy. The industrial ecology principle of symbiotic processes using waste streams for creating value-added products is applied to MSW, with examples suggested for various residual streams. A methodology is presented to consider individual waste-to-energy or waste-to-product system synergies, evaluating the economic and environmental issues associated with each system. Steps included in the methodology include identifying waste streams, specific waste components of interest, and conversion technologies, plus steps for determining the economic and environmental effects of using wastes and changes due to transport, administrative handling, and processing. In addition to presenting the methodology, technologies for various MSW input streams are categorized as commercialized or demonstrated to provide organizations that are considering processes for MSW with summarized information. The organization can also follow the methodology to analyze interesting processes. Presents information useful for analyzing the sustainability of alternatives for the management of municipal solid waste.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1988-12-01
This document contains twelve papers on various aspects of low-level radioactive waste management. Topics of this volume include: performance assessment methodology; remedial action alternatives; site selection and site characterization procedures; intruder scenarios; sensitivity analysis procedures; mathematical models for mixed waste environmental transport; and risk assessment methodology. Individual papers were processed separately for the database. (TEM)
Using Processing Instruction for the Acquisition of English Present Perfect of Filipinos
ERIC Educational Resources Information Center
Erfe, Jonathan P.; Lintao, Rachelle B.
2012-01-01
This is an experimental study on the relative effects of Van Patten's Processing Instruction (PI) (1996, 2002), a "psycholinguistically-motivated" intervention in teaching second-language (L2) grammar, on young-adult Filipino learners of English. A growing body of research on this methodological alternative, which establishes…
Schaafsma, Joanna D; van der Graaf, Yolanda; Rinkel, Gabriel J E; Buskens, Erik
2009-12-01
The lack of a standard methodology in diagnostic research impedes adequate evaluation before implementation of constantly developing diagnostic techniques. We discuss the methodology of diagnostic research and underscore the relevance of decision analysis in the process of evaluation of diagnostic tests. Overview and conceptual discussion. Diagnostic research requires a stepwise approach comprising assessment of test characteristics followed by evaluation of added value, clinical outcome, and cost-effectiveness. These multiple goals are generally incompatible with a randomized design. Decision-analytic models provide an important alternative through integration of the best available evidence. Thus, critical assessment of clinical value and efficient use of resources can be achieved. Decision-analytic models should be considered part of the standard methodology in diagnostic research. They can serve as a valid alternative to diagnostic randomized clinical trials (RCTs).
1982-02-23
segregate the computer and storage from the outside world 2. Administrative security to control access to secure computer facilities 3. Network security to...Classification Alternative A- 8 NETWORK KG GENSER DSSCS AMPE TERMINALS TP No. 022-4668-A Figure A-2. Dedicated Switching Architecture Alternative A- 9...communications protocol with the network and GENSER message transmission to the - I-S/A AMPE processor. 7. DSSCS TPU - Handles communications protocol with
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruiz-Padillo, Alejandro, E-mail: aruizp@correo.ugr.es; Civil Engineering Department, University of Granada, Av. Fuentenueva s/n, 18071 Granada; Ruiz, Diego P., E-mail: druiz@ugr.es
Road traffic noise is one of the most significant environmental impacts generated by transport systems. To this regard, the recent implementation of the European Environmental Noise Directive by Public Administrations of the European Union member countries has led to various noise action plans (NAPs) for reducing the noise exposure of EU inhabitants. Every country or administration is responsible for applying criteria based on their own experience or expert knowledge, but there is no regulated process for the prioritization of technical measures within these plans. This paper proposes a multi-criteria decision methodology for the selection of suitable alternatives against traffic noisemore » in each of the road stretches included in the NAPs. The methodology first defines the main criteria and alternatives to be considered. Secondly, it determines the relative weights for the criteria and sub-criteria using the fuzzy extended analytical hierarchy process as applied to the results from an expert panel, thereby allowing expert knowledge to be captured in an automated way. A final step comprises the use of discrete multi-criteria analysis methods such as weighted sum, ELECTRE and TOPSIS, to rank the alternatives by suitability. To illustrate an application of the proposed methodology, this paper describes its implementation in a complex real case study: the selection of optimal technical solutions against traffic noise in the top priority road stretch included in the revision of the NAP of the regional road network in the province of Almeria (Spain).« less
Bechara, Rami; Gomez, Adrien; Saint-Antonin, Valérie; Schweitzer, Jean-Marc; Maréchal, François
2016-08-01
The application of methodologies for the optimal design of integrated processes has seen increased interest in literature. This article builds on previous works and applies a systematic methodology to an integrated first and second generation ethanol production plant with power cogeneration. The methodology breaks into process simulation, heat integration, thermo-economic evaluation, exergy efficiency vs. capital costs, multi-variable, evolutionary optimization, and process selection via profitability maximization. Optimization generated Pareto solutions with exergy efficiency ranging between 39.2% and 44.4% and capital costs from 210M$ to 390M$. The Net Present Value was positive for only two scenarios and for low efficiency, low hydrolysis points. The minimum cellulosic ethanol selling price was sought to obtain a maximum NPV of zero for high efficiency, high hydrolysis alternatives. The obtained optimal configuration presented maximum exergy efficiency, hydrolyzed bagasse fraction, capital costs and ethanol production rate, and minimum cooling water consumption and power production rate. Copyright © 2016 Elsevier Ltd. All rights reserved.
Alternative Fuels Data Center: Vehicle Cost Calculator Widget Assumptions
Data Center: Vehicle Cost Calculator Widget Assumptions and Methodology on Facebook Tweet about Alternative Fuels Data Center: Vehicle Cost Calculator Widget Assumptions and Methodology on Twitter Bookmark Alternative Fuels Data Center: Vehicle Cost Calculator Widget Assumptions and Methodology on Google Bookmark
Power processing methodology. [computerized design of spacecraft electric power systems
NASA Technical Reports Server (NTRS)
Fegley, K. A.; Hansen, I. G.; Hayden, J. H.
1974-01-01
Discussion of the interim results of a program to investigate the feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems. The object of the total program is to develop a flexible engineering tool which will allow the power processor designer to effectively and rapidly assess and analyze the tradeoffs available by providing, in one comprehensive program, a mathematical model, an analysis of expected performance, simulation, and a comparative evaluation with alternative designs. This requires an understanding of electrical power source characteristics and the effects of load control, protection, and total system interaction.
Some Findings Concerning Requirements in Agile Methodologies
NASA Astrophysics Data System (ADS)
Rodríguez, Pilar; Yagüe, Agustín; Alarcón, Pedro P.; Garbajosa, Juan
Agile methods have appeared as an attractive alternative to conventional methodologies. These methods try to reduce the time to market and, indirectly, the cost of the product through flexible development and deep customer involvement. The processes related to requirements have been extensively studied in literature, in most cases in the frame of conventional methods. However, conclusions of conventional methodologies could not be necessarily valid for Agile; in some issues, conventional and Agile processes are radically different. As recent surveys report, inadequate project requirements is one of the most conflictive issues in agile approaches and better understanding about this is needed. This paper describes some findings concerning requirements activities in a project developed under an agile methodology. The project intended to evolve an existing product and, therefore, some background information was available. The major difficulties encountered were related to non-functional needs and management of requirements dependencies.
Griesinger, Claudius; Desprez, Bertrand; Coecke, Sandra; Casey, Warren; Zuang, Valérie
This chapter explores the concepts, processes, tools and challenges relating to the validation of alternative methods for toxicity and safety testing. In general terms, validation is the process of assessing the appropriateness and usefulness of a tool for its intended purpose. Validation is routinely used in various contexts in science, technology, the manufacturing and services sectors. It serves to assess the fitness-for-purpose of devices, systems, software up to entire methodologies. In the area of toxicity testing, validation plays an indispensable role: "alternative approaches" are increasingly replacing animal models as predictive tools and it needs to be demonstrated that these novel methods are fit for purpose. Alternative approaches include in vitro test methods, non-testing approaches such as predictive computer models up to entire testing and assessment strategies composed of method suites, data sources and decision-aiding tools. Data generated with alternative approaches are ultimately used for decision-making on public health and the protection of the environment. It is therefore essential that the underlying methods and methodologies are thoroughly characterised, assessed and transparently documented through validation studies involving impartial actors. Importantly, validation serves as a filter to ensure that only test methods able to produce data that help to address legislative requirements (e.g. EU's REACH legislation) are accepted as official testing tools and, owing to the globalisation of markets, recognised on international level (e.g. through inclusion in OECD test guidelines). Since validation creates a credible and transparent evidence base on test methods, it provides a quality stamp, supporting companies developing and marketing alternative methods and creating considerable business opportunities. Validation of alternative methods is conducted through scientific studies assessing two key hypotheses, reliability and relevance of the test method for a given purpose. Relevance encapsulates the scientific basis of the test method, its capacity to predict adverse effects in the "target system" (i.e. human health or the environment) as well as its applicability for the intended purpose. In this chapter we focus on the validation of non-animal in vitro alternative testing methods and review the concepts, challenges, processes and tools fundamental to the validation of in vitro methods intended for hazard testing of chemicals. We explore major challenges and peculiarities of validation in this area. Based on the notion that validation per se is a scientific endeavour that needs to adhere to key scientific principles, namely objectivity and appropriate choice of methodology, we examine basic aspects of study design and management, and provide illustrations of statistical approaches to describe predictive performance of validated test methods as well as their reliability.
Molinos-Senante, María; Hernández-Sancho, Francesc; Sala-Garrido, Ramón
2012-01-01
The concept of sustainability involves the integration of economic, environmental, and social aspects and this also applies in the field of wastewater treatment. Economic feasibility studies are a key tool for selecting the most appropriate option from a set of technological proposals. Moreover, these studies are needed to assess the viability of transferring new technologies from pilot-scale to full-scale. In traditional economic feasibility studies, the benefits that have no market price, such as environmental benefits, are not considered and are therefore underestimated. To overcome this limitation, we propose a new methodology to assess the economic viability of wastewater treatment technologies that considers internal and external impacts. The estimation of the costs is based on the use of cost functions. To quantify the environmental benefits from wastewater treatment, the distance function methodology is proposed to estimate the shadow price of each pollutant removed in the wastewater treatment. The application of this methodological approach by decision makers enables the calculation of the true costs and benefits associated with each alternative technology. The proposed methodology is presented as a useful tool to support decision making.
Qualification Procedures for VHSIC/VLSI
1990-12-01
alternative approach for qualification of complex microcircuits. To address the technical issues related to a process oriented qualification approach, the...methodology of microcircuit process control to promote the United States to a position of supplying the nighest quality and most reliable...available resources . o Coordinate document reviews with weekly and monthly status reviews on progress. o Summarize results and collate into four basic
Code of Federal Regulations, 2012 CFR
2012-07-01
... rate, type of control devices, process parameters (e.g., maximum heat input), and non-process... control systems (if applicable) and explain why the conditions are worst-case. (c) Number of test runs... located at the outlet of the control device and prior to any releases to the atmosphere. (e) Collection of...
Code of Federal Regulations, 2011 CFR
2011-07-01
... rate, type of control devices, process parameters (e.g., maximum heat input), and non-process... control systems (if applicable) and explain why the conditions are worst-case. (c) Number of test runs... located at the outlet of the control device and prior to any releases to the atmosphere. (e) Collection of...
Code of Federal Regulations, 2010 CFR
2010-07-01
... rate, type of control devices, process parameters (e.g., maximum heat input), and non-process... control systems (if applicable) and explain why the conditions are worst-case. (c) Number of test runs... located at the outlet of the control device and prior to any releases to the atmosphere. (e) Collection of...
NASA Technical Reports Server (NTRS)
McDougal, Kristopher J.
2008-01-01
More and more test programs are requiring high frequency measurements. Marshall Space Flight Center s Cold Flow Test Facility has an interest in acquiring such data. The acquisition of this data requires special hardware and capabilities. This document provides a structured trade study approach for determining which additional capabilities of a VXI-based data acquisition system should be utilized to meet the test facility objectives. The paper is focused on the trade study approach detailing and demonstrating the methodology. A case is presented in which a trade study was initially performed to provide a recommendation for the data system capabilities. Implementation details of the recommended alternative are briefly provided as well as the system s performance during a subsequent test program. The paper then addresses revisiting the trade study with modified alternatives and attributes to address issues that arose during the subsequent test program. Although the model does not identify a single best alternative for all sensitivities, the trade study process does provide a much better understanding. This better understanding makes it possible to confidently recommend Alternative 3 as the preferred alternative.
System architectures for telerobotic research
NASA Technical Reports Server (NTRS)
Harrison, F. Wallace
1989-01-01
Several activities are performed related to the definition and creation of telerobotic systems. The effort and investment required to create architectures for these complex systems can be enormous; however, the magnitude of process can be reduced if structured design techniques are applied. A number of informal methodologies supporting certain aspects of the design process are available. More recently, prototypes of integrated tools supporting all phases of system design from requirements analysis to code generation and hardware layout have begun to appear. Activities related to system architecture of telerobots are described, including current activities which are designed to provide a methodology for the comparison and quantitative analysis of alternative system architectures.
Application of analytic hierarchy process in a waste treatment technology assessment in Mexico.
Taboada-González, Paul; Aguilar-Virgen, Quetzalli; Ojeda-Benítez, Sara; Cruz-Sotelo, Samantha
2014-09-01
The high per capita generation of solid waste and the environmental problems in major rural communities of Ensenada, Baja California, have prompted authorities to seek alternatives for waste treatment. In the absence of a selection methodology, three technologies of waste treatment with energy recovery (an anaerobic digester, a downdraft gasifier, and a plasma gasifier) were evaluated, taking the broader social, political, economic, and environmental issues into considerations. Using the scientific literature as a baseline, interviews with experts, decision makers and the community, and waste stream studies were used to construct a hierarchy that was evaluated by the analytic hierarchy process. In terms of the criteria, judgments, and assumptions made in the model, the anaerobic digester was found to have the highest rating and should consequently be selected as the waste treatment technology for this area. The study results showed low sensitivity, so alternative scenarios were not considered. The methodology developed in this study may be useful for other governments who wish to assess technologies to select waste treatment.
NASA Technical Reports Server (NTRS)
Baird, J.
1967-01-01
This supplement to Task lB-Large Solid Rocket Motor Case Fabrication Methods supplies additional supporting cost data and discusses in detail the methodology that was applied to the task. For the case elements studied, the cost was found to be directly proportional to the Process Complexity Factor (PCF). The PCF was obtained for each element by identifying unit processes that are common to the elements and their alternative manufacturing routes, by assigning a weight to each unit process, and by summing the weighted counts. In three instances of actual manufacture, the actual cost per pound equaled the cost estimate based on PCF per pound, but this supplement, recognizes that the methodology is of limited, rather than general, application.
Systematic review of the methodological and reporting quality of case series in surgery.
Agha, R A; Fowler, A J; Lee, S-Y; Gundogan, B; Whitehurst, K; Sagoo, H K; Jeong, K J L; Altman, D G; Orgill, D P
2016-09-01
Case series are an important and common study type. No guideline exists for reporting case series and there is evidence of key data being missed from such reports. The first step in the process of developing a methodologically sound reporting guideline is a systematic review of literature relevant to the reporting deficiencies of case series. A systematic review of methodological and reporting quality in surgical case series was performed. The electronic search strategy was developed by an information specialist and included MEDLINE, Embase, Cochrane Methods Register, Science Citation Index and Conference Proceedings Citation index, from the start of indexing to 5 November 2014. Independent screening, eligibility assessments and data extraction were performed. Included articles were then analysed for five areas of deficiency: failure to use standardized definitions, missing or selective data (including the omission of whole cases or important variables), transparency or incomplete reporting, whether alternative study designs were considered, and other issues. Database searching identified 2205 records. Through the process of screening and eligibility assessments, 92 articles met inclusion criteria. Frequencies of methodological and reporting issues identified were: failure to use standardized definitions (57 per cent), missing or selective data (66 per cent), transparency or incomplete reporting (70 per cent), whether alternative study designs were considered (11 per cent) and other issues (52 per cent). The methodological and reporting quality of surgical case series needs improvement. The data indicate that evidence-based guidelines for the conduct and reporting of case series may be useful. © 2016 BJS Society Ltd Published by John Wiley & Sons Ltd.
A data mining method to facilitate SAR transfer.
Wassermann, Anne Mai; Bajorath, Jürgen
2011-08-22
A challenging practical problem in medicinal chemistry is the transfer of SAR information from one chemical series to another. Currently, there are no computational methods available to rationalize or support this process. Herein, we present a data mining approach that enables the identification of alternative analog series with different core structures, corresponding substitution patterns, and comparable potency progression. Scaffolds can be exchanged between these series and new analogs suggested that incorporate preferred R-groups. The methodology can be applied to search for alternative analog series if one series is known or, alternatively, to systematically assess SAR transfer potential in compound databases.
Chan, T M Simon; Teram, Eli; Shaw, Ian
2017-01-01
Despite growing consideration of the needs of research participants in studies related to sensitive issues, discussions of alternative ways to design sensitive research are scarce. Structured as an exchange between two researchers who used different approaches in their studies with childhood sexual abuse survivors, in this article, we seek to advance understanding of methodological and ethical issues in designing sensitive research. The first perspective, which is termed protective, promotes the gradual progression of participants from a treatment phase into a research phase, with the ongoing presence of a researcher and a social worker in both phases. In the second perspective, which is termed minimalist, we argue for clear boundaries between research and treatment processes, limiting the responsibility of researchers to ensuring that professional support is available to participants who experience emotional difficulties. Following rebuttals, lessons are drawn for ethical balancing between methodological rigor and the needs of participants. © The Author(s) 2015.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Litchfield, J.W.; Watts, R.L.; Gurwell, W.E.
A materials assessment methodology for identifying specific critical material requirements that could hinder the implementation of solar energy has been developed and demonstrated. The methodology involves an initial screening process, followed by a more detailed materials assessment. The detailed assessment considers such materials concerns and constraints as: process and production constraints, reserve and resource limitations, lack of alternative supply sources, geopolitical problems, environmental and energy concerns, time constraints, and economic constraints. Data for 55 bulk and 53 raw materials are currently available on the data base. These materials are required in the example photovoltaic systems. One photovoltaic system and thirteenmore » photovoltaic cells, ten solar heating and cooling systems, and two agricultural and industrial process heat systems have been characterized to define their engineering and bulk material requirements.« less
Additive Manufacturing of Functional Elements on Sheet Metal
NASA Astrophysics Data System (ADS)
Schaub, Adam; Ahuja, Bhrigu; Butzhammer, Lorenz; Osterziel, Johannes; Schmidt, Michael; Merklein, Marion
Laser Beam Melting (LBM) process with its advantages of high design flexibility and free form manufacturing methodology is often applied limitedly due to its low productivity and unsuitability for mass production compared to conventional manufacturing processes. In order to overcome these limitations, a hybrid manufacturing methodology is developed combining the additive manufacturing process of laser beam melting with sheet forming processes. With an interest towards aerospace and medical industry, the material in focus is Ti-6Al-4V. Although Ti-6Al-4V is a commercially established material and its application for LBM process has been extensively investigated, the combination of LBM of Ti-6Al-4V with sheet metal still needs to be researched. Process dynamics such as high temperature gradients and thermally induced stresses lead to complex stress states at the interaction zone between the sheet and LBM structure. Within the presented paper mechanical characterization of hybrid parts will be performed by shear testing. The association of shear strength with process parameters is further investigated by analyzing the internal structure of the hybrid geometry at varying energy inputs during the LBM process. In order to compare the hybrid manufacturing methodology with conventional fabrication, the conventional methodologies subtractive machining and state of the art Laser Beam Melting is evaluated within this work. These processes will be analyzed for their mechanical characteristics and productivity by determining the build time and raw material consumption for each case. The paper is concluded by presenting the characteristics of the hybrid manufacturing methodology compared to alternative manufacturing technologies.
Doing Research That Makes a Difference
ERIC Educational Resources Information Center
Bensimon, Estela Mara; Polkinghorne, Donald E.; Bauman, Georgia L.; Vallejo, Edlyn
2004-01-01
This article describes an alternative methodology for conducting research that is intended to bring about institutional change. This process involves developing deeper awareness among faculty members, administrators, or counselors, of a problem that exists in their local context. In some instances these individuals may be unaware that the problem…
Häuser, Winfried; Dobos, Gustav; Langhorst, Jost
2015-01-01
Objectives. This systematic overview of reviews aimed to summarize evidence and methodological quality from systematic reviews of complementary and alternative medicine (CAM) for the fibromyalgia syndrome (FMS). Methods. The PubMed/MEDLINE, Cochrane Library, and Scopus databases were screened from their inception to Sept 2013 to identify systematic reviews and meta-analyses of CAM interventions for FMS. Methodological quality of reviews was rated using the AMSTAR instrument. Results. Altogether 25 systematic reviews were found; they investigated the evidence of CAM in general, exercised-based CAM therapies, manipulative therapies, Mind/Body therapies, acupuncture, hydrotherapy, phytotherapy, and homeopathy. Methodological quality of reviews ranged from lowest to highest possible quality. Consistently positive results were found for tai chi, yoga, meditation and mindfulness-based interventions, hypnosis or guided imagery, electromyogram (EMG) biofeedback, and balneotherapy/hydrotherapy. Inconsistent results concerned qigong, acupuncture, chiropractic interventions, electroencephalogram (EEG) biofeedback, and nutritional supplements. Inconclusive results were found for homeopathy and phytotherapy. Major methodological flaws included missing details on data extraction process, included or excluded studies, study details, and adaption of conclusions based on quality assessment. Conclusions. Despite a growing body of scientific evidence of CAM therapies for the management of FMS systematic reviews still show methodological flaws limiting definite conclusions about their efficacy and safety. PMID:26246841
Lauche, Romy; Cramer, Holger; Häuser, Winfried; Dobos, Gustav; Langhorst, Jost
2015-01-01
Objectives. This systematic overview of reviews aimed to summarize evidence and methodological quality from systematic reviews of complementary and alternative medicine (CAM) for the fibromyalgia syndrome (FMS). Methods. The PubMed/MEDLINE, Cochrane Library, and Scopus databases were screened from their inception to Sept 2013 to identify systematic reviews and meta-analyses of CAM interventions for FMS. Methodological quality of reviews was rated using the AMSTAR instrument. Results. Altogether 25 systematic reviews were found; they investigated the evidence of CAM in general, exercised-based CAM therapies, manipulative therapies, Mind/Body therapies, acupuncture, hydrotherapy, phytotherapy, and homeopathy. Methodological quality of reviews ranged from lowest to highest possible quality. Consistently positive results were found for tai chi, yoga, meditation and mindfulness-based interventions, hypnosis or guided imagery, electromyogram (EMG) biofeedback, and balneotherapy/hydrotherapy. Inconsistent results concerned qigong, acupuncture, chiropractic interventions, electroencephalogram (EEG) biofeedback, and nutritional supplements. Inconclusive results were found for homeopathy and phytotherapy. Major methodological flaws included missing details on data extraction process, included or excluded studies, study details, and adaption of conclusions based on quality assessment. Conclusions. Despite a growing body of scientific evidence of CAM therapies for the management of FMS systematic reviews still show methodological flaws limiting definite conclusions about their efficacy and safety.
NASA Technical Reports Server (NTRS)
Howard, R. A.; North, D. W.; Pezier, J. P.
1975-01-01
A new methodology is proposed for integrating planetary quarantine objectives into space exploration planning. This methodology is designed to remedy the major weaknesses inherent in the current formulation of planetary quarantine requirements. Application of the methodology is illustrated by a tutorial analysis of a proposed Jupiter Orbiter mission. The proposed methodology reformulates planetary quarantine planning as a sequential decision problem. Rather than concentrating on a nominal plan, all decision alternatives and possible consequences are laid out in a decision tree. Probabilities and values are associated with the outcomes, including the outcome of contamination. The process of allocating probabilities, which could not be made perfectly unambiguous and systematic, is replaced by decomposition and optimization techniques based on principles of dynamic programming. Thus, the new methodology provides logical integration of all available information and allows selection of the best strategy consistent with quarantine and other space exploration goals.
1982-02-01
385. Punj, Girish N. and Richard Staelin, (1978), "The Choice Process for Graduate Business Schools," Journal of Marketing Research , 15, (November 1978...Test Market Evaluation of New Packaged Goods: A Model and Measurement Methodology," Journal of Marketing Research , 15, (May) 171-191. Urban, Glen
Adding Asymmetrically Dominated Alternatives: Violations of Regularity & the Similarity Hypothesis.
1981-07-01
34The Choice Process for Graduate Business Schools," Journal of Marketing Research , 15, (November, 1978) 588-598. Reibstein, David (1978), "The...Market Evaluation of New Packaged Goods: A Model and Measurement Methodology," Journal of Marketing Research , 15, (May), 171-191. Simon, H. A. (1957
Gilligan's Moral Orientation Hypothesis: Strategies of Justification and Practical Deliberation.
ERIC Educational Resources Information Center
Keefer, Matthew Wilks
Previous studies failed to determine whether Gilligan's (1982) justice and care perspectives represent two distinct orientations of moral reasoning. Using methods developed in research on reasoning and discourse processes, a study used a discursive framework to validate an alternate methodology for the investigation of moral orientation reasoning.…
Churilov, Leonid; Liu, Daniel; Ma, Henry; Christensen, Soren; Nagakane, Yoshinari; Campbell, Bruce; Parsons, Mark W; Levi, Christopher R; Davis, Stephen M; Donnan, Geoffrey A
2013-04-01
The appropriateness of a software platform for rapid MRI assessment of the amount of salvageable brain tissue after stroke is critical for both the validity of the Extending the Time for Thrombolysis in Emergency Neurological Deficits (EXTEND) Clinical Trial of stroke thrombolysis beyond 4.5 hours and for stroke patient care outcomes. The objective of this research is to develop and implement a methodology for selecting the acute stroke imaging software platform most appropriate for the setting of a multi-centre clinical trial. A multi-disciplinary decision making panel formulated the set of preferentially independent evaluation attributes. Alternative Multi-Attribute Value Measurement methods were used to identify the best imaging software platform followed by sensitivity analysis to ensure the validity and robustness of the proposed solution. Four alternative imaging software platforms were identified. RApid processing of PerfusIon and Diffusion (RAPID) software was selected as the most appropriate for the needs of the EXTEND trial. A theoretically grounded generic multi-attribute selection methodology for imaging software was developed and implemented. The developed methodology assured both a high quality decision outcome and a rational and transparent decision process. This development contributes to stroke literature in the area of comprehensive evaluation of MRI clinical software. At the time of evaluation, RAPID software presented the most appropriate imaging software platform for use in the EXTEND clinical trial. The proposed multi-attribute imaging software evaluation methodology is based on sound theoretical foundations of multiple criteria decision analysis and can be successfully used for choosing the most appropriate imaging software while ensuring both robust decision process and outcomes. © 2012 The Authors. International Journal of Stroke © 2012 World Stroke Organization.
NASA Astrophysics Data System (ADS)
McJannet, D. L.; Cook, F. J.; McGloin, R. P.; McGowan, H. A.; Burn, S.
2011-05-01
The use of scintillometers to determine sensible and latent heat flux is becoming increasingly common because of their ability to quantify convective fluxes over distances of hundreds of meters to several kilometers. The majority of investigations using scintillometry have focused on processes above land surfaces, but here we propose a new methodology for obtaining sensible and latent heat fluxes from a scintillometer deployed over open water. This methodology has been tested by comparison with eddy covariance measurements and through comparison with alternative scintillometer calculation approaches that are commonly used in the literature. The methodology is based on linearization of the Bowen ratio, which is a common assumption in models such as Penman's model and its derivatives. Comparison of latent heat flux estimates from the eddy covariance system and the scintillometer showed excellent agreement across a range of weather conditions and flux rates, giving a high level of confidence in scintillometry-derived latent heat fluxes. The proposed approach produced better estimates than other scintillometry calculation methods because of the reliance of alternative methods on measurements of water temperature or water body heat storage, which are both notoriously hard to quantify. The proposed methodology requires less instrumentation than alternative scintillometer calculation approaches, and the spatial scales of required measurements are arguably more compatible. In addition to scintillometer measurements of the structure parameter of the refractive index of air, the only measurements required are atmospheric pressure, air temperature, humidity, and wind speed at one height over the water body.
Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants
NASA Astrophysics Data System (ADS)
Kulbjakina, A. V.; Dolotovskij, I. V.
2018-01-01
The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are the major consumers of energy resources (ER) for their own needs. To reduce ER, including fuel consumption, and to develop rational fuel system structure are complex and relevant scientific tasks that can only be done using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for the synthesis of the most efficient alternative of the fuel system using mathematical models and the set of performance criteria have been developed on the main stages of the study. The results from the introduction of specific engineering solutions to develop their own energy supply sources for RH processing facilities have been provided.
Guerra, J G; Rubiano, J G; Winter, G; Guerra, A G; Alonso, H; Arnedo, M A; Tejera, A; Gil, J M; Rodríguez, R; Martel, P; Bolivar, J P
2015-11-01
The determination in a sample of the activity concentration of a specific radionuclide by gamma spectrometry needs to know the full energy peak efficiency (FEPE) for the energy of interest. The difficulties related to the experimental calibration make it advisable to have alternative methods for FEPE determination, such as the simulation of the transport of photons in the crystal by the Monte Carlo method, which requires an accurate knowledge of the characteristics and geometry of the detector. The characterization process is mainly carried out by Canberra Industries Inc. using proprietary techniques and methodologies developed by that company. It is a costly procedure (due to shipping and to the cost of the process itself) and for some research laboratories an alternative in situ procedure can be very useful. The main goal of this paper is to find an alternative to this costly characterization process, by establishing a method for optimizing the parameters of characterizing the detector, through a computational procedure which could be reproduced at a standard research lab. This method consists in the determination of the detector geometric parameters by using Monte Carlo simulation in parallel with an optimization process, based on evolutionary algorithms, starting from a set of reference FEPEs determined experimentally or computationally. The proposed method has proven to be effective and simple to implement. It provides a set of characterization parameters which it has been successfully validated for different source-detector geometries, and also for a wide range of environmental samples and certified materials. Copyright © 2015 Elsevier Ltd. All rights reserved.
Radiological Characterization Methodology of INEEL Stored RH-TRU Waste from ANL-E
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rajiv N. Bhatt
2003-02-01
An Acceptable Knowledge (AK)-based radiological characterization methodology is being developed for RH TRU waste generated from ANL-E hot cell operations performed on fuel elements irradiated in the EBR-II reactor. The methodology relies on AK for composition of the fresh fuel elements, their irradiation history, and the waste generation and collection processes. Radiological characterization of the waste involves the estimates of the quantities of significant fission products and transuranic isotopes in the waste. Methods based on reactor and physics principles are used to achieve these estimates. Because of the availability of AK and the robustness of the calculation methods, the AK-basedmore » characterization methodology offers a superior alternative to traditional waste assay techniques. Using this methodology, it is shown that the radiological parameters of a test batch of ANL-E waste is well within the proposed WIPP Waste Acceptance Criteria limits.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuan, P.; Bhatt, R.N.
2003-01-14
An Acceptable Knowledge (AK)-based radiological characterization methodology is being developed for RH TRU waste generated from ANL-E hot cell operations performed on fuel elements irradiated in the EBR-II reactor. The methodology relies on AK for composition of the fresh fuel elements, their irradiation history, and the waste generation and collection processes. Radiological characterization of the waste involves the estimates of the quantities of significant fission products and transuranic isotopes in the waste. Methods based on reactor and physics principles are used to achieve these estimates. Because of the availability of AK and the robustness of the calculation methods, the AK-basedmore » characterization methodology offers a superior alternative to traditional waste assay techniques. Using the methodology, it is shown that the radiological parameters of a test batch of ANL-E waste is well within the proposed WIPP Waste Acceptance Criteria limits.« less
Second Life as a Support Element for Learning Electronic Related Subjects: A Real Case
ERIC Educational Resources Information Center
Beltran Sierra, Luis M.; Gutierrez, Ronald S.; Garzon-Castro, Claudia L.
2012-01-01
Looking for more active and motivating methodological alternatives from the students' perspective, which promote analysis and investigation abilities that make the student a more participative agent and some learning processes are facilitated, a practical study was conducted in the University of La Sabana (Chia, Colombia), in Computing Engineering…
Propensity Score Estimation with Data Mining Techniques: Alternatives to Logistic Regression
ERIC Educational Resources Information Center
Keller, Bryan S. B.; Kim, Jee-Seon; Steiner, Peter M.
2013-01-01
Propensity score analysis (PSA) is a methodological technique which may correct for selection bias in a quasi-experiment by modeling the selection process using observed covariates. Because logistic regression is well understood by researchers in a variety of fields and easy to implement in a number of popular software packages, it has…
USDA-ARS?s Scientific Manuscript database
Alternative egg production methods are becoming more popular with US consumers. As the drive to expand the retail shell egg market to accommodate consumer shifts proceeds, a need arises for additional information to ensure processing methodologies result in safe eggs from all egg sources. A study ...
The Role of Writing Pedagogy in Vocabulary Improvement
ERIC Educational Resources Information Center
Mirhassani, Seyyed Akbar; Samar, Reza Ghafar; Fattahipoor, Majid
2006-01-01
To improve and activate the vocabulary of EFL learners, an alternative to common advice in trying to use them in speech can be invited. As two quite different methodologies in writing pedagogy are process and product writing, it is of concern to find which holds more promise for the vocabulary improvement. Product writing pedagogy encompasses…
Multicriteria methodological approach to manage urban air pollution
NASA Astrophysics Data System (ADS)
Vlachokostas, Ch.; Achillas, Ch.; Moussiopoulos, N.; Banias, G.
2011-08-01
Managing urban air pollution necessitates a feasible and efficient abatement strategy which is characterised as a defined set of specific control measures. In practice, hard budget constraints are present in any decision-making process and therefore available alternatives need to be hierarchised in a fast but still reliable manner. Moreover, realistic strategies require adequate information on the available control measures, taking also into account the area's special characteristics. The selection of the most applicable bundle of measures rests in achieving stakeholders' consensus, while taking into consideration mutually conflicting views and criteria. A preliminary qualitative comparison of alternative control measures would be most handy for decision-makers, forming the grounds for an in-depth analysis of the most promising ones. This paper presents an easy-to-follow multicriteria methodological approach in order to include and synthesise multi-disciplinary knowledge from various stakeholders so as to result into a priority list of abatement options, achieve consensus and secure the adoption of the resulting optimal solution. The approach relies on the active involvement of public authorities and local stakeholders in order to incorporate their environmental, economic and social preferences. The methodological scheme is implemented for the case of Thessaloniki, Greece, an area considered among the most polluted cities within Europe, especially with respect to airborne particles. Intense police control, natural gas penetration in buildings and metro construction equally result into the most "promising" alternatives in order to control air pollution in the GTA. The three optimal alternatives belong to different thematic areas, namely road transport, thermal heating and infrastructure. Thus, it is obvious that efforts should spread throughout all thematic areas. Natural gas penetration in industrial units, intense monitoring of environmental standards and regular maintenance of heavy oil burners are ranked as 4th, 5th and 6th optimal alternatives, respectively.
Modeling of electrohydrodynamic drying process using response surface methodology
Dalvand, Mohammad Jafar; Mohtasebi, Seyed Saeid; Rafiee, Shahin
2014-01-01
Energy consumption index is one of the most important criteria for judging about new, and emerging drying technologies. One of such novel and promising alternative of drying process is called electrohydrodynamic (EHD) drying. In this work, a solar energy was used to maintain required energy of EHD drying process. Moreover, response surface methodology (RSM) was used to build a predictive model in order to investigate the combined effects of independent variables such as applied voltage, field strength, number of discharge electrode (needle), and air velocity on moisture ratio, energy efficiency, and energy consumption as responses of EHD drying process. Three-levels and four-factor Box–Behnken design was employed to evaluate the effects of independent variables on system responses. A stepwise approach was followed to build up a model that can map the entire response surface. The interior relationships between parameters were well defined by RSM. PMID:24936289
Helal-Neto, Edward; Cabezas, Santiago Sánchez; Sancenón, Félix; Martínez-Máñez, Ramón; Santos-Oliveira, Ralph
2018-05-10
The use of monoclonal antibodies (Mab) in the current medicine is increasing. Antibody-drug conjugates (ADCs) represents an increasingly and important modality for treating several types of cancer. In this area, the use of Mab associated with nanoparticles is a valuable strategy. However, the methodology used to calculate the Mab entrapment, efficiency and content is extremely expensive. In this study we developed and tested a novel very simple one-step methodology to calculate monoclonal antibody entrapment in mesoporous silica (with magnetic core) nanoparticles using the radiolabeling process as primary methodology. The magnetic core mesoporous silica were successfully developed and characterised. The PXRD analysis at high angles confirmed the presence of magnetic cores in the structures and transmission electron microscopy allowed to determine structures size (58.9 ± 8.1 nm). From the isotherm curve, a specific surface area of 872 m 2 /g was estimated along with a pore volume of 0.85 cm 3 /g and an average pore diameter of 3.15 nm. The radiolabeling process to proceed the indirect determination were well-done. Trastuzumab were successfully labeled (>97%) with Tc-99m generating a clear suspension. Besides, almost all the Tc-99m used (labeling the trastuzumab) remained trapped in the surface of the mesoporous silica for a period as long as 8 h. The indirect methodology demonstrated a high entrapment in magnetic core mesoporous silica surface of Tc-99m-traztuzumab. The results confirmed the potential use from the indirect entrapment efficiency methodology using the radiolabeling process, as a one-step, easy and cheap methodology. Copyright © 2018 Elsevier B.V. All rights reserved.
Methodologies and systems for heterogeneous concurrent computing
NASA Technical Reports Server (NTRS)
Sunderam, V. S.
1994-01-01
Heterogeneous concurrent computing is gaining increasing acceptance as an alternative or complementary paradigm to multiprocessor-based parallel processing as well as to conventional supercomputing. While algorithmic and programming aspects of heterogeneous concurrent computing are similar to their parallel processing counterparts, system issues, partitioning and scheduling, and performance aspects are significantly different. In this paper, we discuss critical design and implementation issues in heterogeneous concurrent computing, and describe techniques for enhancing its effectiveness. In particular, we highlight the system level infrastructures that are required, aspects of parallel algorithm development that most affect performance, system capabilities and limitations, and tools and methodologies for effective computing in heterogeneous networked environments. We also present recent developments and experiences in the context of the PVM system and comment on ongoing and future work.
Netlist Oriented Sensitivity Evaluation (NOSE)
2017-03-01
developing methodologies to assess sensitivities of alternative chip design netlist implementations. The research is somewhat foundational in that such...Netlist-Oriented Sensitivity Evaluation (NOSE) project was to develop methodologies to assess sensitivities of alternative chip design netlist...analysis to devise a methodology for scoring the sensitivity of circuit nodes in a netlist and thus providing the raw data for any meaningful
DOE Office of Scientific and Technical Information (OSTI.GOV)
Purcupile, J.C.
The purpose of this study is to apply the methodologies developed in the Energy Conservation in Coal Conversion August, 1977 Progress Report - Contract No. EY77S024196 - to an energy efficient, near-term coal conversion process design, and to develop additional, general techniques for studying energy conservation and utilization in coal conversion processes. The process selected for study was the Ralph M. Parsons Company of Pasadena, California ''Oil/Gas Complex, Conceptual Design/Economic Analysis'' as described in R and D Report No. 114 - Interim Report No. 4, published March, 1977, ERDA Contract No. E(49-18)-1975. Thirteen papers representing possible alternative methods of energymore » conservation or waste heat utilization have been entered individually into EDB and ERA. (LTN)« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pombet, Denis; Desnoyers, Yvon; Charters, Grant
2013-07-01
The TruPro{sup R} process enables to collect a significant number of samples to characterize radiological materials. This innovative and alternative technique is experimented for the ANDRA quality-control inspection of cemented packages. It proves to be quicker and more prolific than the current methodology. Using classical statistics and geo-statistics approaches, the physical and radiological characteristics of two hulls containing immobilized wastes (sludges or concentrates) in a hydraulic binder are assessed in this paper. The waste homogeneity is also evaluated in comparison to ANDRA criterion. Sensibility to sample size (support effect), presence of extreme values, acceptable deviation rate and minimum number ofmore » data are discussed. The final objectives are to check the homogeneity of the two characterized radwaste packages and also to validate and reinforce this alternative characterization methodology. (authors)« less
The Future Impact of Wind on BPA Power System Load Following and Regulation Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, Yuri V.; Lu, Shuai; McManus, Bart
Wind power is growing in a very fast pace as an alternative generating resource. As the ratio of wind power over total system capacity increases, the impact of wind on various system aspects becomes significant. This paper presents a methodology to study the future impact of wind on BPA power system load following and regulation requirements. Existing methodologies for similar analysis include dispatch model simulation and standard deviation evaluation on load and wind data. The methodology proposed in this paper uses historical data and stochastic processes to simulate the load balancing processes in the BPA power system. It mimics themore » actual power system operations therefore the results are close to reality yet the study based on this methodology is convenient to perform. The capacity, ramp rate and ramp duration characteristics are extracted from the simulation results. System load following and regulation capacity requirements are calculated accordingly. The ramp rate and ramp duration data obtained from the analysis can be used to evaluate generator response or maneuverability requirement and regulating units’ energy requirement, respectively.« less
NASA Electronic Publishing System: Cost/benefit Methodology
NASA Technical Reports Server (NTRS)
Tuey, Richard C.
1994-01-01
The NASA Scientific and Technical Information Office was assigned the responsibility to examine the benefits of the utilization of electronic printing and duplicating systems throughout NASA Installations and Headquarters. The subject of this report is the documentation of the methodology used in justifying the acquisition of the most cost beneficial solution for the printing and duplicating requirements of a duplicating facility that is contemplating the acquisition of an electronic printing and duplicating system. Four alternatives are presented with each alternative costed out with its associated benefits. The methodology goes a step further than just a cost benefit analysis through its comparison of risks associated with each alternative, sensitivity to number of impressions and productivity gains on the selected alternative and finally the return on investment for the selected alternative. The report can be used in conjunction with the two earlier reports, NASA-TM-106242 and TM-106510 in guiding others in determining the cost effective duplicating alternative.
Measuring attitudes towards the dying process: A systematic review of tools.
Groebe, Bernadette; Strupp, Julia; Eisenmann, Yvonne; Schmidt, Holger; Schlomann, Anna; Rietz, Christian; Voltz, Raymond
2018-04-01
At the end of life, anxious attitudes concerning the dying process are common in patients in Palliative Care. Measurement tools can identify vulnerabilities, resources and the need for subsequent treatment to relieve suffering and support well-being. To systematically review available tools measuring attitudes towards dying, their operationalization, the method of measurement and the methodological quality including generalizability to different contexts. Systematic review according to the PRISMA Statement. Methodological quality of tools assessed by standardized review criteria. MEDLINE, PsycINFO, PsyndexTests and the Health and Psychosocial Instruments were searched from their inception to April 2017. A total of 94 identified studies reported the development and/or validation of 44 tools. Of these, 37 were questionnaires and 7 alternative measurement methods (e.g. projective measures). In 34 of 37 questionnaires, the emotional evaluation (e.g. anxiety) towards dying is measured. Dying is operationalized in general items ( n = 20), in several specific aspects of dying ( n = 34) and as dying of others ( n = 14). Methodological quality of tools was reported inconsistently. Nine tools reported good internal consistency. Of 37 tools, 4 were validated in a clinical sample (e.g. terminal cancer; Huntington disease), indicating questionable generalizability to clinical contexts for most tools. Many tools exist to measure attitudes towards the dying process using different endpoints. This overview can serve as decision framework on which tool to apply in which contexts. For clinical application, only few tools were available. Further validation of existing tools and potential alternative methods in various populations is needed.
A review of the solar array manufacturing industry costing standards
NASA Technical Reports Server (NTRS)
1977-01-01
The solar array manufacturing industry costing standards model is designed to compare the cost of producing solar arrays using alternative manufacturing processes. Constructive criticism of the methodology used is intended to enhance its implementation as a practical design tool. Three main elements of the procedure include workbook format and presentation, theoretical model validity and standard financial parameters.
Research and development of an electrochemical biocide reactor
NASA Technical Reports Server (NTRS)
See, G. G.; Bodo, C. A.; Glennon, J. P.
1975-01-01
An alternate disinfecting process to chemical agents, heat, or radiation in an aqueous media has been studied. The process is called an electrochemical biocide and employs cyclic, low-level voltages at chemically inert electrodes to pass alternating current through water and, in the process, to destroy microorganisms. The paper describes experimental hardware, methodology, and results with a tracer microorganism (Escherichia coli). The results presented show the effects on microorganism kill of operating parameters, including current density (15 to 55 mA/sq cm (14 to 51 ASF)), waveform of applied electrical signal (square, triangular, sine), frequency of applied electrical signal (0.5 to 1.5 Hz), process water flow rate (100 to 600 cc/min (1.6 to 9.5 gph)), and reactor resident time (0 to 4 min). Comparisons are made between the disinfecting property of the electrochemical biocide and chlorine, bromine, and iodine.
Process-based Cost Estimation for Ramjet/Scramjet Engines
NASA Technical Reports Server (NTRS)
Singh, Brijendra; Torres, Felix; Nesman, Miles; Reynolds, John
2003-01-01
Process-based cost estimation plays a key role in effecting cultural change that integrates distributed science, technology and engineering teams to rapidly create innovative and affordable products. Working together, NASA Glenn Research Center and Boeing Canoga Park have developed a methodology of process-based cost estimation bridging the methodologies of high-level parametric models and detailed bottoms-up estimation. The NASA GRC/Boeing CP process-based cost model provides a probabilistic structure of layered cost drivers. High-level inputs characterize mission requirements, system performance, and relevant economic factors. Design alternatives are extracted from a standard, product-specific work breakdown structure to pre-load lower-level cost driver inputs and generate the cost-risk analysis. As product design progresses and matures the lower level more detailed cost drivers can be re-accessed and the projected variation of input values narrowed, thereby generating a progressively more accurate estimate of cost-risk. Incorporated into the process-based cost model are techniques for decision analysis, specifically, the analytic hierarchy process (AHP) and functional utility analysis. Design alternatives may then be evaluated not just on cost-risk, but also user defined performance and schedule criteria. This implementation of full-trade study support contributes significantly to the realization of the integrated development environment. The process-based cost estimation model generates development and manufacturing cost estimates. The development team plans to expand the manufacturing process base from approximately 80 manufacturing processes to over 250 processes. Operation and support cost modeling is also envisioned. Process-based estimation considers the materials, resources, and processes in establishing cost-risk and rather depending on weight as an input, actually estimates weight along with cost and schedule.
Injector element characterization methodology
NASA Technical Reports Server (NTRS)
Cox, George B., Jr.
1988-01-01
Characterization of liquid rocket engine injector elements is an important part of the development process for rocket engine combustion devices. Modern nonintrusive instrumentation for flow velocity and spray droplet size measurement, and automated, computer-controlled test facilities allow rapid, low-cost evaluation of injector element performance and behavior. Application of these methods in rocket engine development, paralleling their use in gas turbine engine development, will reduce rocket engine development cost and risk. The Alternate Turbopump (ATP) Hot Gas Systems (HGS) preburner injector elements were characterized using such methods, and the methodology and some of the results obtained will be shown.
An Analysis Methodology for the Gamma-ray Large Area Space Telescope
NASA Technical Reports Server (NTRS)
Morris, Robin D.; Cohen-Tanugi, Johann
2004-01-01
The Large Area Telescope (LAT) instrument on the Gamma Ray Large Area Space Telescope (GLAST) has been designed to detect high-energy gamma rays and determine their direction of incidence and energy. We propose a reconstruction algorithm based on recent advances in statistical methodology. This method, alternative to the standard event analysis inherited from high energy collider physics experiments, incorporates more accurately the physical processes occurring in the detector, and makes full use of the statistical information available. It could thus provide a better estimate of the direction and energy of the primary photon.
NASA Astrophysics Data System (ADS)
Huang, Xiao
2006-04-01
Today's and especially tomorrow's competitive launch vehicle design environment requires the development of a dedicated generic Space Access Vehicle (SAV) design methodology. A total of 115 industrial, research, and academic aircraft, helicopter, missile, and launch vehicle design synthesis methodologies have been evaluated. As the survey indicates, each synthesis methodology tends to focus on a specific flight vehicle configuration, thus precluding the key capability to systematically compare flight vehicle design alternatives. The aim of the research investigation is to provide decision-making bodies and the practicing engineer a design process and tool box for robust modeling and simulation of flight vehicles where the ultimate performance characteristics may hinge on numerical subtleties. This will enable the designer of a SAV for the first time to consistently compare different classes of SAV configurations on an impartial basis. This dissertation presents the development steps required towards a generic (configuration independent) hands-on flight vehicle conceptual design synthesis methodology. This process is developed such that it can be applied to any flight vehicle class if desired. In the present context, the methodology has been put into operation for the conceptual design of a tourist Space Access Vehicle. The case study illustrates elements of the design methodology & algorithm for the class of Horizontal Takeoff and Horizontal Landing (HTHL) SAVs. The HTHL SAV design application clearly outlines how the conceptual design process can be centrally organized, executed and documented with focus on design transparency, physical understanding and the capability to reproduce results. This approach offers the project lead and creative design team a management process and tool which iteratively refines the individual design logic chosen, leading to mature design methods and algorithms. As illustrated, the HTHL SAV hands-on design methodology offers growth potential in that the same methodology can be continually updated and extended to other SAV configuration concepts, such as the Vertical Takeoff and Vertical Landing (VTVL) SAV class. Having developed, validated and calibrated the methodology for HTHL designs in the 'hands-on' mode, the report provides an outlook how the methodology will be integrated into a prototype computerized design synthesis software AVDS-PrADOSAV in a follow-on step.
Single point aerosol sampling: evaluation of mixing and probe performance in a nuclear stack.
Rodgers, J C; Fairchild, C I; Wood, G O; Ortiz, C A; Muyshondt, A; McFarland, A R
1996-01-01
Alternative reference methodologies have been developed for sampling of radionuclides from stacks and ducts, which differ from the methods previously required by the United States Environmental Protection Agency. These alternative reference methodologies have recently been approved by the U.S. EPA for use in lieu of the current standard techniques. The standard EPA methods are prescriptive in selection of sampling locations and in design of sampling probes whereas the alternative reference methodologies are performance driven. Tests were conducted in a stack at Los Alamos National Laboratory to demonstrate the efficacy of some aspects of the alternative reference methodologies. Coefficients of variation of velocity, tracer gas, and aerosol particle profiles were determined at three sampling locations. Results showed that numerical criteria placed upon the coefficients of variation by the alternative reference methodologies were met at sampling stations located 9 and 14 stack diameters from the flow entrance, but not at a location that was 1.5 diameters downstream from the inlet. Experiments were conducted to characterize the transmission of 10 microns aerodynamic diameter liquid aerosol particles through three types of sampling probes. The transmission ratio (ratio of aerosol concentration at the probe exit plane to the concentration in the free stream) was 107% for a 113 L min-1 (4-cfm) anisokinetic shrouded probe, but only 20% for an isokinetic probe that follows the existing EPA standard requirements. A specially designed isokinetic probe showed a transmission ratio of 63%. The shrouded probe performance would conform to the alternative reference methodologies criteria; however, the isokinetic probes would not.
Effect of alternative glycosylation on insulin receptor processing.
Hwang, J B; Frost, S C
1999-08-06
The mature insulin receptor is a cell surface heterotetrameric glycoprotein composed of two alpha- and two beta-subunits. In 3T3-L1 adipocytes as in other cell types, the receptor is synthesized as a single polypeptide consisting of uncleaved alpha- and beta-subunits, migrating as a 190-kDa glycoprotein. To examine the importance of N-linked glycosylation on insulin receptor processing, we have used glucose deprivation as a tool to alter protein glycosylation. Western blot analysis shows that glucose deprivation led to a time-dependent accumulation of an alternative proreceptor of 170 kDa in a subcellular fraction consistent with endoplasmic reticulum localization. Co-precipitation assays provide evidence that the alternative proreceptor bound GRP78, an endoplasmic reticulum molecular chaperone. N-Glycosidase F treatment shows that the alternative proreceptor contained N-linked oligosaccharides. Yet, endoglycosidase H insensitivity indicates an aberrant oligosaccharide structure. Using pulse-chase methodology, we show that the synthetic rate was similar between the normal and alternative proreceptor. However, the normal proreceptor was processed into alpha- and beta-subunits (t((1)/(2)) = 1.3 +/- 0.6 h), while the alternative proreceptor was degraded (t((1)/(2)) = 5.1 +/- 0.6 h). Upon refeeding cells that were initially deprived of glucose, the alternative proreceptor was processed to a higher molecular weight form and gained sensitivity to endoglycosidase H. This "intermediate" form of the proreceptor was also degraded, although a small fraction escaped degradation, resulting in cleavage to the alpha- and beta-subunits. These data provide evidence for the first time that glucose deprivation leads to the accumulation of an alternative proreceptor, which can be post-translationally glycosylated with the readdition of glucose inducing both accelerated degradation and maturation.
Moeinaddini, Mazaher; Khorasani, Nematollah; Danehkar, Afshin; Darvishsefat, Ali Asghar; Zienalyan, Mehdi
2010-05-01
Selection of landfill site is a complex process and needs many diverse criteria. The purpose of this paper is to evaluate the suitability of the studied site as landfill for MSW in Karaj. Using weighted linear combination (WLC) method and spatial cluster analysis (SCA), suitable sites for allocation of landfill for a 20-year period were identified. For analyzing spatial auto-correlation of the land suitability map layer (LSML), Maron's I was used. Finally, using the analytical hierarchy process (AHP), the most preferred alternative for the landfill siting was identified. Main advantages of AHP are: relative ease of handling multiple criteria, easy to understand and effective handling of both qualitative and quantitative data. As a result, 6% of the study area is suitable for landfill siting and third alternative was identified as the most preferred for siting MSW landfill by AHP. The ranking of alternatives were obtained only by applying the WLC approach showed different results from the AHP. The WLC should be used only for the identification of alternatives and the AHP is used for prioritization. We suggest the employed procedure for other similar regions. Copyright (c) 2010 Elsevier Ltd. All rights reserved.
A multicriteria-based methodology for site prioritisation in sediment management.
Alvarez-Guerra, Manuel; Viguri, Javier R; Voulvoulis, Nikolaos
2009-08-01
Decision-making for sediment management is a complex task that incorporates the selections of areas for remediation and the assessment of options for any mitigation required. The application of Multicriteria Analysis (MCA) to rank different areas, according to their need for sediment management, provides a great opportunity for prioritisation, a first step in an integrated methodology that finally aims to assess and select suitable alternatives for managing the identified priority sites. This paper develops a methodology that starts with the delimitation of management units within areas of study, followed by the application of MCA methods that allows ranking of these management units, according to their need for remediation. This proposed process considers not only scientific evidence on sediment quality, but also other relevant aspects such as social and economic criteria associated with such decisions. This methodology is illustrated with its application to the case study area of the Bay of Santander, in northern Spain, highlighting some of the implications of utilising different MCA methods in the process. It also uses site-specific data to assess the subjectivity in the decision-making process, mainly reflected through the assignment of the criteria weights and uncertainties in the criteria scores. Analysis of the sensitivity of the results to these factors is used as a way to assess the stability and robustness of the ranking as a first step of the sediment management decision-making process.
Probability genotype imputation method and integrated weighted lasso for QTL identification.
Demetrashvili, Nino; Van den Heuvel, Edwin R; Wit, Ernst C
2013-12-30
Many QTL studies have two common features: (1) often there is missing marker information, (2) among many markers involved in the biological process only a few are causal. In statistics, the second issue falls under the headings "sparsity" and "causal inference". The goal of this work is to develop a two-step statistical methodology for QTL mapping for markers with binary genotypes. The first step introduces a novel imputation method for missing genotypes. Outcomes of the proposed imputation method are probabilities which serve as weights to the second step, namely in weighted lasso. The sparse phenotype inference is employed to select a set of predictive markers for the trait of interest. Simulation studies validate the proposed methodology under a wide range of realistic settings. Furthermore, the methodology outperforms alternative imputation and variable selection methods in such studies. The methodology was applied to an Arabidopsis experiment, containing 69 markers for 165 recombinant inbred lines of a F8 generation. The results confirm previously identified regions, however several new markers are also found. On the basis of the inferred ROC behavior these markers show good potential for being real, especially for the germination trait Gmax. Our imputation method shows higher accuracy in terms of sensitivity and specificity compared to alternative imputation method. Also, the proposed weighted lasso outperforms commonly practiced multiple regression as well as the traditional lasso and adaptive lasso with three weighting schemes. This means that under realistic missing data settings this methodology can be used for QTL identification.
Solvent-free and catalyst-free chemistry: A benign pathway to sustainability
In the past decade, alternative benign organic methodologies have become an imperative part of organic syntheses and novel chemical reactions. The various new and innovative sustainable organic reactions and methodologies using no solvents or catalysts and employing alternative ...
Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique.
Guzys, Diana; Dickson-Swift, Virginia; Kenny, Amanda; Threlkeld, Guinever
2015-01-01
In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method.
Gadamerian philosophical hermeneutics as a useful methodological framework for the Delphi technique
Guzys, Diana; Dickson-Swift, Virginia; Kenny, Amanda; Threlkeld, Guinever
2015-01-01
In this article we aim to demonstrate how Gadamerian philosophical hermeneutics may provide a sound methodological framework for researchers using the Delphi Technique (Delphi) in studies exploring health and well-being. Reporting of the use of Delphi in health and well-being research is increasing, but less attention has been given to covering its methodological underpinnings. In Delphi, a structured anonymous conversation between participants is facilitated, via an iterative survey process. Participants are specifically selected for their knowledge and experience with the topic of interest. The purpose of structuring conversation in this manner is to cultivate collective opinion and highlight areas of disagreement, using a process that minimizes the influence of group dynamics. The underlying premise is that the opinion of a collective is more useful than that of an individual. In designing our study into health literacy, Delphi aligned well with our research focus and would enable us to capture collective views. However, we were interested in the methodology that would inform our study. As researchers, we believe that methodology provides the framework and principles for a study and is integral to research integrity. In assessing the suitability of Delphi for our research purpose, we found little information about underpinning methodology. The absence of a universally recognized or consistent methodology associated with Delphi was highlighted through a scoping review we undertook to assist us in our methodological thinking. This led us to consider alternative methodologies, which might be congruent with the key principles of Delphi. We identified Gadamerian philosophical hermeneutics as a methodology that could provide a supportive framework and principles. We suggest that this methodology may be useful in health and well-being studies utilizing the Delphi method. PMID:25948132
Cogeneration Technology Alternatives Study (CTAS). Volume 2: Analytical approach
NASA Technical Reports Server (NTRS)
Gerlaugh, H. E.; Hall, E. W.; Brown, D. H.; Priestley, R. R.; Knightly, W. F.
1980-01-01
The use of various advanced energy conversion systems were compared with each other and with current technology systems for their savings in fuel energy, costs, and emissions in individual plants and on a national level. The ground rules established by NASA and assumptions made by the General Electric Company in performing this cogeneration technology alternatives study are presented. The analytical methodology employed is described in detail and is illustrated with numerical examples together with a description of the computer program used in calculating over 7000 energy conversion system-industrial process applications. For Vol. 1, see 80N24797.
SOCIOECONOMIC ANALYSIS OF HAZARDOUS WASTE MANAGEMENT ALTERNATIVES: METHOLOLOGY AND DEMONSTRATION
A methodology for analyzing economic and social effects of alternatives in hazardous waste management is presented and demonstrated. The approach includes the use of environmental threat scenarios and evaluation of effects on and responses by parties-at-interest. The methodology ...
ERIC Educational Resources Information Center
Alorda, B.; Suenaga, K.; Pons, P.
2011-01-01
This paper reports on the design, implementation and assessment of a new approach course structure based on the combination of three cooperative methodologies. The main goal is to reduce the percentage of non-passed students focusing the learning process on students by offering different alternatives and motivational activities based on working in…
USDA-ARS?s Scientific Manuscript database
There is a desired by US consumers for eggs produced by hens in alternative production systems. As the retail shell egg market offers these products to accommodate consumer demands, additional information is needed to ensure processing methodologies result in safe eggs from all egg sources. A stud...
Berne, Rosalyn W; Raviv, Daniel
2004-04-01
This paper introduces the Eight Dimensional Methodology for Innovative Thinking (the Eight Dimensional Methodology), for innovative problem solving, as a unified approach to case analysis that builds on comprehensive problem solving knowledge from industry, business, marketing, math, science, engineering, technology, arts, and daily life. It is designed to stimulate innovation by quickly generating unique "out of the box" unexpected and high quality solutions. It gives new insights and thinking strategies to solve everyday problems faced in the workplace, by helping decision makers to see otherwise obscure alternatives and solutions. Daniel Raviv, the engineer who developed the Eight Dimensional Methodology, and paper co-author, technology ethicist Rosalyn Berne, suggest that this tool can be especially useful in identifying solutions and alternatives for particular problems of engineering, and for the ethical challenges which arise with them. First, the Eight Dimensional Methodology helps to elucidate how what may appear to be a basic engineering problem also has ethical dimensions. In addition, it offers to the engineer a methodology for penetrating and seeing new dimensions of those problems. To demonstrate the effectiveness of the Eight Dimensional Methodology as an analytical tool for thinking about ethical challenges to engineering, the paper presents the case of the construction of the Large Binocular Telescope (LBT) on Mount Graham in Arizona. Analysis of the case offers to decision makers the use of the Eight Dimensional Methodology in considering alternative solutions for how they can proceed in their goals of exploring space. It then follows that same process through the second stage of exploring the ethics of each of those different solutions. The LBT project pools resources from an international partnership of universities and research institutes for the construction and maintenance of a highly sophisticated, powerful new telescope. It will soon mark the erection of the world's largest and most powerful optical telescope, designed to see fine detail otherwise visible only from space. It also represents a controversial engineering project that is being undertaken on land considered to be sacred by the local, native Apache people. As presented, the case features the University of Virginia, and its challenges in consideration of whether and how to join the LBT project consortium.
Scrutinizing UML Activity Diagrams
NASA Astrophysics Data System (ADS)
Al-Fedaghi, Sabah
Building an information system involves two processes: conceptual modeling of the “real world domain” and designing the software system. Object-oriented methods and languages (e.g., UML) are typically used for describing the software system. For the system analysis process that produces the conceptual description, object-oriented techniques or semantics extensions are utilized. Specifically, UML activity diagrams are the “flow charts” of object-oriented conceptualization tools. This chapter proposes an alternative to UML activity diagrams through the development of a conceptual modeling methodology based on the notion of flow.
A Goal Seeking Strategy for Constructing Systems from Alternative Components
NASA Technical Reports Server (NTRS)
Valentine, Mark E.
1999-01-01
This paper describes a methodology to efficiently construct feasible systems then modify feasible systems to meet successive goals by selecting from alternative components, a problem recognized to be n-p complete. The methodology provides a means to catalog and model alternative components. A presented system modeling Structure is robust enough to model a wide variety of systems and provides a means to compare and evaluate alternative systems. These models act as input to a methodology for selecting alternative components to construct feasible systems and modify feasible systems to meet design goals and objectives. The presented algorithm's ability to find a restricted solution, as defined by a unique set of requirements, is demonstrated against an exhaustive search of a sample of proposed shuttle modifications. The utility of the algorithm is demonstrated by comparing results from the algorithm with results from three NASA shuttle evolution studies using their value systems and assumptions.
NASA Astrophysics Data System (ADS)
Sheate, William R.; Partidário, Maria Rosário Do; Byron, Helen; Bina, Olivia; Dagg, Suzan
2008-02-01
BioScene (scenarios for reconciling biodiversity conservation with declining agriculture use in mountain areas in Europe) was a three-year project (2002 2005) funded by the European Union’s Fifth Framework Programme, and aimed to investigate the implications of agricultural restructuring and decline for biodiversity conservation in the mountain areas of Europe. The research took a case study approach to the analysis of the biodiversity processes and outcomes of different scenarios of agri-environmental change in six countries (France, Greece, Norway, Slovakia, Switzerland, and the United Kingdom) covering the major biogeographical regions of Europe. The project was coordinated by Imperial College London, and each study area had a multidisciplinary team including ecologists and social and economic experts, which sought a comprehensive understanding of the drivers for change and their implications for sustainability. A key component was the sustainability assessment (SA) of the alternative scenarios. This article discusses the development and application of the SA methodology developed for BioScene. While the methodology was objectives-led, it was also strongly grounded in baseline ecological and socio-economic data. This article also describes the engagement of stakeholder panels in each study area and the use of causal chain analysis for understanding the likely implications for land use and biodiversity of strategic drivers of change under alternative scenarios for agriculture and rural policy and for biodiversity management. Finally, this article draws conclusions for the application of SA more widely, its use with scenarios, and the benefits of stakeholder engagement in the SA process.
The report defines a simplified methodology that can be used by indoor air quality (IAQ) diagnosticians, architects/engineers, building owners/operators, and the scientific community for preliminary comparison of the cost-effectiveness of alternative IAQ control measures for any ...
Huertas, César S; Carrascosa, L G; Bonnal, S; Valcárcel, J; Lechuga, L M
2016-04-15
Alternative splicing of mRNA precursors enables cells to generate different protein outputs from the same gene depending on their developmental or homeostatic status. Its deregulation is strongly linked to disease onset and progression. Current methodologies for monitoring alternative splicing demand elaborate procedures and often present difficulties in discerning between closely related isoforms, e.g. due to cross-hybridization during their detection. Herein, we report a general methodology using a Surface Plasmon Resonance (SPR) biosensor for label-free monitoring of alternative splicing events in real-time, without any cDNA synthesis or PCR amplification requirements. We applied this methodology to RNA isolated from HeLa cells for the quantification of alternatively spliced isoforms of the Fas gene, involved in cancer progression through regulation of programmed cell death. We demonstrate that our methodology is isoform-specific, with virtually no cross-hybridization, achieving limits of detection (LODs) in the picoMolar (pM) range. Similar results were obtained for the detection of the BCL-X gene mRNA isoforms. The results were independently validated by RT-qPCR, with excellent concordance in the determination of isoform ratios. The simplicity and robustness of this biosensor technology can greatly facilitate the exploration of alternative splicing biomarkers in disease diagnosis and therapy. Copyright © 2015 Elsevier B.V. All rights reserved.
Investigation of Laser Welding of Ti Alloys for Cognitive Process Parameters Selection.
Caiazzo, Fabrizia; Caggiano, Alessandra
2018-04-20
Laser welding of titanium alloys is attracting increasing interest as an alternative to traditional joining techniques for industrial applications, with particular reference to the aerospace sector, where welded assemblies allow for the reduction of the buy-to-fly ratio, compared to other traditional mechanical joining techniques. In this research work, an investigation on laser welding of Ti⁻6Al⁻4V alloy plates is carried out through an experimental testing campaign, under different process conditions, in order to perform a characterization of the produced weld bead geometry, with the final aim of developing a cognitive methodology able to support decision-making about the selection of the suitable laser welding process parameters. The methodology is based on the employment of artificial neural networks able to identify correlations between the laser welding process parameters, with particular reference to the laser power, welding speed and defocusing distance, and the weld bead geometric features, on the basis of the collected experimental data.
Investigation of Laser Welding of Ti Alloys for Cognitive Process Parameters Selection
2018-01-01
Laser welding of titanium alloys is attracting increasing interest as an alternative to traditional joining techniques for industrial applications, with particular reference to the aerospace sector, where welded assemblies allow for the reduction of the buy-to-fly ratio, compared to other traditional mechanical joining techniques. In this research work, an investigation on laser welding of Ti–6Al–4V alloy plates is carried out through an experimental testing campaign, under different process conditions, in order to perform a characterization of the produced weld bead geometry, with the final aim of developing a cognitive methodology able to support decision-making about the selection of the suitable laser welding process parameters. The methodology is based on the employment of artificial neural networks able to identify correlations between the laser welding process parameters, with particular reference to the laser power, welding speed and defocusing distance, and the weld bead geometric features, on the basis of the collected experimental data. PMID:29677114
Eukaryotic DNA Replication Fork.
Burgers, Peter M J; Kunkel, Thomas A
2017-06-20
This review focuses on the biogenesis and composition of the eukaryotic DNA replication fork, with an emphasis on the enzymes that synthesize DNA and repair discontinuities on the lagging strand of the replication fork. Physical and genetic methodologies aimed at understanding these processes are discussed. The preponderance of evidence supports a model in which DNA polymerase ε (Pol ε) carries out the bulk of leading strand DNA synthesis at an undisturbed replication fork. DNA polymerases α and δ carry out the initiation of Okazaki fragment synthesis and its elongation and maturation, respectively. This review also discusses alternative proposals, including cellular processes during which alternative forks may be utilized, and new biochemical studies with purified proteins that are aimed at reconstituting leading and lagging strand DNA synthesis separately and as an integrated replication fork.
45 CFR 153.330 - State alternate risk adjustment methodology.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 1 2012-10-01 2012-10-01 false State alternate risk adjustment methodology. 153.330 Section 153.330 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE...
45 CFR 153.330 - State alternate risk adjustment methodology.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 1 2014-10-01 2014-10-01 false State alternate risk adjustment methodology. 153.330 Section 153.330 Public Welfare Department of Health and Human Services REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE...
45 CFR 153.330 - State alternate risk adjustment methodology.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 1 2013-10-01 2013-10-01 false State alternate risk adjustment methodology. 153.330 Section 153.330 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES REQUIREMENTS RELATING TO HEALTH CARE ACCESS STANDARDS RELATED TO REINSURANCE, RISK CORRIDORS, AND RISK ADJUSTMENT UNDER THE...
Assessment of methodologies for analysis of the dungeness B accidental aircraft crash risk.
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaChance, Jeffrey L.; Hansen, Clifford W.
2010-09-01
The Health and Safety Executive (HSE) has requested Sandia National Laboratories (SNL) to review the aircraft crash methodology for nuclear facilities that are being used in the United Kingdom (UK). The scope of the work included a review of one method utilized in the UK for assessing the potential for accidental airplane crashes into nuclear facilities (Task 1) and a comparison of the UK methodology against similar International Atomic Energy Agency (IAEA), United States (US) Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) methods (Task 2). Based on the conclusions from Tasks 1 and 2, an additionalmore » Task 3 would provide an assessment of a site-specific crash frequency for the Dungeness B facility using one of the other methodologies. This report documents the results of Task 2. The comparison of the different methods was performed for the three primary contributors to aircraft crash risk at the Dungeness B site: airfield related crashes, crashes below airways, and background crashes. The methods and data specified in each methodology were compared for each of these risk contributors, differences in the methodologies were identified, and the importance of these differences was qualitatively and quantitatively assessed. The bases for each of the methods and the data used were considered in this assessment process. A comparison of the treatment of the consequences of the aircraft crashes was not included in this assessment because the frequency of crashes into critical structures is currently low based on the existing Dungeness B assessment. Although the comparison found substantial differences between the UK and the three alternative methodologies (IAEA, NRC, and DOE) this assessment concludes that use of any of these alternative methodologies would not change the conclusions reached for the Dungeness B site. Performance of Task 3 is thus not recommended.« less
Martin, Todd M
2017-05-01
The goal of alternatives assessment (AA) is to facilitate a comparison of alternatives to a chemical of concern, resulting in the identification of safer alternatives. A two stage methodology for comparing chemical alternatives was developed. In the first stage, alternatives are compared using a variety of human health effects, ecotoxicity, and physicochemical properties. Hazard profiles are completed using a variety of online sources and quantitative structure activity relationship models. In the second stage, alternatives are evaluated utilizing an exposure/risk assessment over the entire life cycle. Exposure values are calculated using screening-level near-field and far-field exposure models. The second stage allows one to more accurately compare potential exposure to each alternative and consider additional factors that may not be obvious from separate binned persistence, bioaccumulation, and toxicity scores. The methodology was utilized to compare phosphate-based alternatives for decabromodiphenyl ether (decaBDE) in electronics applications.
ERIC Educational Resources Information Center
Greenberg, Julie; Walsh, Kate; McKee, Arthur
2014-01-01
The "NCTQ Teacher Prep Review" evaluates the quality of programs that provide preservice preparation of public school teachers. As part of the "Review," this appendix reports on a pilot study of new standards for assessing the quality of alternative certification programs. Background and methodology for alternative…
This NODA requests public comment on two alternative allocation methodologies for existing units, on the unit-level allocations calculated using those alternative methodologies, on the data supporting the calculations, and on any resulting implications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Columbia River System Operation Review
1995-11-01
This Appendix J of the Final Environmental Impact Statement for the Columbia River System discusses impacts on the recreational activities in the region. Major sections include the following: scope and processes; recreation in the Columbia River Basin today - by type, location, participation, user characteristics, factors which affect usage, and managing agencies; recreation analysis procedures and methodology; and alternatives and their impacts.
ERIC Educational Resources Information Center
Bissels, Gerhard
2008-01-01
Purpose: The purpose of this paper is to describe the selection process and criteria that led to the implementation of the Koha 3.0 library management system (LMS) at the Complementary and Alternative Medicine Library and Information Service (CAMLIS), Royal London Homoeopathic Hospital. Design/methodology/approach: The paper is a report based on…
The goal of alternatives assessment (AA) is to facilitate a comparison of alternatives to a chemical of concern, resulting in the identification of safer alternatives. A two-stage methodology for comparing chemical alternatives was developed. In the first stage, alternatives are ...
Research methods in complementary and alternative medicine: an integrative review.
de Almeida Andrade, Fabiana; Schlechta Portella, Caio Fabio
2018-01-01
The scientific literature presents a modest amount of evidence in the use of complementary and alternative medicine (CAM). On the other hand, in practice, relevant results are common. The debates among CAM practitioners about the quality and execution of scientific research are important. Therefore, the aim of this review is to gather, synthesize and describe the differentiated methodological models that encompass the complexity of therapeutic interventions. The process of bringing evidence-based medicine into clinical practice in CAM is essential for the growth and strengthening of complementary medicines worldwide. Copyright © 2017 Shanghai Changhai Hospital. Published by Elsevier B.V. All rights reserved.
Valuation Diagramming and Accounting of Transactive Energy Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makhmalbaf, Atefe; Hammerstrom, Donald J.; Huang, Qiuhua
Transactive energy (TE) systems support both economic and technical objectives of a power system including efficiency and reliability. TE systems utilize value-driven mechanisms to coordinate and balance responsive supply and demand in the power system. Economic performance of TE systems cannot be assessed without estimating their value. Estimating the potential value of transactive energy systems requires a systematic valuation methodology that can capture value exchanges among different stakeholders (i.e., actors) and ultimately estimate impact of one TE design and compare it against another one. Such a methodology can help decision makers choose the alternative that results in preferred outcomes. Thismore » paper presents a valuation methodology developed to assess value of TE systems. A TE use-case example is discussed, and metrics identified in the valuation process are quantified using a TE simulation program.« less
30 CFR 206.173 - How do I calculate the alternative methodology for dual accounting?
Code of Federal Regulations, 2010 CFR
2010-07-01
... measured at facility measurement points whose quality exceeds 1,000 Btu/cf are subject to dual accounting... for dual accounting? 206.173 Section 206.173 Mineral Resources MINERALS MANAGEMENT SERVICE, DEPARTMENT... the alternative methodology for dual accounting? (a) Electing a dual accounting method. (1) If you are...
ERIC Educational Resources Information Center
Hopson, Laura M.; Steiker, Lori K. H.
2008-01-01
The purpose of this article is to set forth an innovative methodological protocol for culturally grounding interventions with high-risk youths in alternative schools. This study used mixed methods to evaluate original and adapted versions of a culturally grounded substance abuse prevention program. The qualitative and quantitative methods…
Bernstad Saraiva, A; Souza, R G; Valle, R A B
2017-10-01
The environmental impacts from three management alternatives for organic fraction of municipal solid waste were compared using lifecycle assessment methodology. The alternatives (sanitary landfill, selective collection of organic waste for anaerobic digestion and anaerobic digestion after post-separation of organic waste) were modelled applying an attributional as well as consequential approach, in parallel with the aim of identifying if and how these approaches can affect results and conclusions. The marginal processes identified in the consequential modelling were in general associated with higher environmental impacts than average processes modelled with an attributional approach. As all investigated waste management alternatives result in net-substitution of energy and in some cases also materials, the consequential modelling resulted in lower absolute environmental impacts in five of the seven environmental impact categories assessed in the study. In three of these, the chosen modelling approach can alter the hierarchy between compared waste management alternatives. This indicates a risk of underestimating potential benefits from efficient energy recovery from waste when applying attributional modelling in contexts in which electricity provision historically has been dominated by technologies presenting rather low environmental impacts, but where projections point at increasing impacts from electricity provision in coming years. Thus, in the present case study, the chosen approach affects both absolute and relative results from the comparison. However, results were largely related to the processes identified as affected by investigated changes, and not merely the chosen modelling approach. The processes actually affected by future choices between different waste management alternatives are intrinsically uncertain. The study demonstrates the benefits of applying different assumptions regarding the processes affected by investigated choices - both for provision of energy and materials substituted by waste management processes in consequential LCA modelling, in order to present outcomes that are relevant as decision support within the waste management sector. Copyright © 2017 Elsevier Ltd. All rights reserved.
Armstrong, Patrick Ian; Vogel, David L
2010-04-01
The current article replies to comments made by Lent, Sheu, and Brown (2010) and Lubinski (2010) regarding the study "Interpreting the Interest-Efficacy Association From a RIASEC Perspective" (Armstrong & Vogel, 2009). The comments made by Lent et al. and Lubinski highlight a number of important theoretical and methodological issues, including the process of defining and differentiating between constructs, the assumptions underlying Holland's (1959, 1997) RIASEC (Realistic, Investigative, Artistic, Social, Enterprising, and Conventional types) model and interrelations among constructs specified in social cognitive career theory (SCCT), the importance of incremental validity for evaluating constructs, and methodological considerations when quantifying interest-efficacy correlations and for comparing models using multivariate statistical methods. On the basis of these comments and previous research on the SCCT and Holland models, we highlight the importance of considering multiple theoretical perspectives in vocational research and practice. Alternative structural models are outlined for examining the role of interests, self-efficacy, learning experiences, outcome expectations, personality, and cognitive abilities in the career choice and development process. PsycINFO Database Record (c) 2010 APA, all rights reserved.
Hanine, Mohamed; Boutkhoum, Omar; Tikniouine, Abdessadek; Agouti, Tarik
2016-01-01
Actually, a set of ETL software (Extract, Transform and Load) is available to constitute a major investment market. Each ETL uses its own techniques for extracting, transforming and loading data into data warehouse, which makes the task of evaluating ETL software very difficult. However, choosing the right software of ETL is critical to the success or failure of any Business Intelligence project. As there are many impacting factors in the selection of ETL software, the same process is considered as a complex multi-criteria decision making (MCDM) problem. In this study, an application of decision-making methodology that employs the two well-known MCDM techniques, namely Analytic Hierarchy Process (AHP) and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) methods is designed. In this respect, the aim of using AHP is to analyze the structure of the ETL software selection problem and obtain weights of the selected criteria. Then, TOPSIS technique is used to calculate the alternatives' ratings. An example is given to illustrate the proposed methodology. Finally, a software prototype for demonstrating both methods is implemented.
Improta, Giovanni; Russo, Mario Alessandro; Triassi, Maria; Converso, Giuseppe; Murino, Teresa; Santillo, Liberatina Carmela
2018-05-01
Health technology assessments (HTAs) are often difficult to conduct because of the decisive procedures of the HTA algorithm, which are often complex and not easy to apply. Thus, their use is not always convenient or possible for the assessment of technical requests requiring a multidisciplinary approach. This paper aims to address this issue through a multi-criteria analysis focusing on the analytic hierarchy process (AHP). This methodology allows the decision maker to analyse and evaluate different alternatives and monitor their impact on different actors during the decision-making process. However, the multi-criteria analysis is implemented through a simulation model to overcome the limitations of the AHP methodology. Simulations help decision-makers to make an appropriate decision and avoid unnecessary and costly attempts. Finally, a decision problem regarding the evaluation of two health technologies, namely, the evaluation of two biological prostheses for incisional infected hernias, will be analysed to assess the effectiveness of the model. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Cabrera-Barona, Pablo; Ghorbanzadeh, Omid
2018-01-16
Deprivation indices are useful measures to study health inequalities. Different techniques are commonly applied to construct deprivation indices, including multi-criteria decision methods such as the analytical hierarchy process (AHP). The multi-criteria deprivation index for the city of Quito is an index in which indicators are weighted by applying the AHP. In this research, a variation of this index is introduced that is calculated using interval AHP methodology. Both indices are compared by applying logistic generalized linear models and multilevel models, considering self-reported health as the dependent variable and deprivation and self-reported quality of life as the independent variables. The obtained results show that the multi-criteria deprivation index for the city of Quito is a meaningful measure to assess neighborhood effects on self-reported health and that the alternative deprivation index using the interval AHP methodology more thoroughly represents the local knowledge of experts and stakeholders. These differences could support decision makers in improving health planning and in tackling health inequalities in more deprived areas.
Cabrera-Barona, Pablo
2018-01-01
Deprivation indices are useful measures to study health inequalities. Different techniques are commonly applied to construct deprivation indices, including multi-criteria decision methods such as the analytical hierarchy process (AHP). The multi-criteria deprivation index for the city of Quito is an index in which indicators are weighted by applying the AHP. In this research, a variation of this index is introduced that is calculated using interval AHP methodology. Both indices are compared by applying logistic generalized linear models and multilevel models, considering self-reported health as the dependent variable and deprivation and self-reported quality of life as the independent variables. The obtained results show that the multi-criteria deprivation index for the city of Quito is a meaningful measure to assess neighborhood effects on self-reported health and that the alternative deprivation index using the interval AHP methodology more thoroughly represents the local knowledge of experts and stakeholders. These differences could support decision makers in improving health planning and in tackling health inequalities in more deprived areas. PMID:29337915
NASA Astrophysics Data System (ADS)
Adhikari, Pashupati Raj
Materials selection processes have been the most important aspects in product design and development. Knowledge-based system (KBS) and some of the methodologies used in the materials selection for the design of aircraft cabin metallic structures are discussed. Overall aircraft weight reduction means substantially less fuel consumption. Part of the solution to this problem is to find a way to reduce overall weight of metallic structures inside the cabin. Among various methodologies of materials selection using Multi Criterion Decision Making (MCDM) techniques, a few of them are demonstrated with examples and the results are compared with those obtained using Ashby's approach in materials selection. Pre-defined constraint values, mainly mechanical properties, are employed as relevant attributes in the process. Aluminum alloys with high strength-to-weight ratio have been second-to-none in most of the aircraft parts manufacturing. Magnesium alloys that are much lighter in weight as alternatives to the Al-alloys currently in use in the structures are tested using the methodologies and ranked results are compared. Each material attribute considered in the design are categorized as benefit and non-benefit attribute. Using Ashby's approach, material indices that are required to be maximized for an optimum performance are determined, and materials are ranked based on the average of consolidated indices ranking. Ranking results are compared for any disparity among the methodologies.
Krauter, Paula; Edwards, Donna; Yang, Lynn; Tucker, Mark
2011-09-01
Decontamination and recovery of a facility or outdoor area after a wide-area biological incident involving a highly persistent agent (eg, Bacillus anthracis spores) is a complex process that requires extensive information and significant resources, which are likely to be limited, particularly if multiple facilities or areas are affected. This article proposes a systematic methodology for evaluating information to select the decontamination or alternative treatments that optimize use of resources if decontamination is required for the facility or area. The methodology covers a wide range of approaches, including volumetric and surface decontamination, monitored natural attenuation, and seal and abandon strategies. A proposed trade-off analysis can help decision makers understand the relative appropriateness, efficacy, and labor, skill, and cost requirements of the various decontamination methods for the particular facility or area needing treatment--whether alone or as part of a larger decontamination effort. Because the state of decontamination knowledge and technology continues to evolve rapidly, the methodology presented here is designed to accommodate new strategies and materials and changing information.
Non-chromate Passivation for LHE ZnNi
2017-03-01
control of coatings and processes. Development of an alternative methodology that is simple, repeatable, non -destructive, and capable of scanning across...FINAL REPORT Non -chromate Passivation for LHE ZnNi SERDP Project WP-2527 JANUARY 2017 Matt O’Keefe Missouri S&T...valid OMB control number. PLEASE DO NOT RETURN YOUR FORM TO THE ABOVE ADDRESS. 1. REPORT DATE (DD-MM-YYYY) 2. REPORT TYPE 3. DATES COVERED (From
NEMS Freight Transportation Module Improvement Study
2015-01-01
The U.S. Energy Information Administration (EIA) contracted with IHS Global, Inc. (IHS) to analyze the relationship between the value of industrial output, physical output, and freight movement in the United States for use in updating analytic assumptions and modeling structure within the National Energy Modeling System (NEMS) freight transportation module, including forecasting methodologies and processes to identify possible alternative approaches that would improve multi-modal freight flow and fuel consumption estimation.
Martínez-López, J. Israel; Mojica, Mauricio; Rodríguez, Ciro A.; Siller, Héctor R.
2016-01-01
Despite the copious amount of research on the design and operation of micromixers, there are few works regarding manufacture technology aimed at implementation beyond academic environments. This work evaluates the viability of xurography as a rapid fabrication tool for the development of ultra-low cost microfluidic technology for extreme Point-of-Care (POC) micromixing devices. By eschewing photolithographic processes and the bulkiness of pumping and enclosure systems for rapid fabrication and passively driven operation, xurography is introduced as a manufacturing alternative for asymmetric split and recombine (ASAR) micromixers. A T-micromixer design was used as a reference to assess the effects of different cutting conditions and materials on the geometric features of the resulting microdevices. Inspection by stereographic and confocal microscopy showed that it is possible to manufacture devices with less than 8% absolute dimensional error. Implementation of the manufacturing methodology in modified circular shape- based SAR microdevices (balanced and unbalanced configurations) showed that, despite the precision limitations of the xurographic process, it is possible to implement this methodology to produce functional micromixing devices. Mixing efficiency was evaluated numerically and experimentally at the outlet of the microdevices with performances up to 40%. Overall, the assessment encourages further research of xurography for the development of POC micromixers. PMID:27196904
Martínez-López, J Israel; Mojica, Mauricio; Rodríguez, Ciro A; Siller, Héctor R
2016-05-16
Despite the copious amount of research on the design and operation of micromixers, there are few works regarding manufacture technology aimed at implementation beyond academic environments. This work evaluates the viability of xurography as a rapid fabrication tool for the development of ultra-low cost microfluidic technology for extreme Point-of-Care (POC) micromixing devices. By eschewing photolithographic processes and the bulkiness of pumping and enclosure systems for rapid fabrication and passively driven operation, xurography is introduced as a manufacturing alternative for asymmetric split and recombine (ASAR) micromixers. A T-micromixer design was used as a reference to assess the effects of different cutting conditions and materials on the geometric features of the resulting microdevices. Inspection by stereographic and confocal microscopy showed that it is possible to manufacture devices with less than 8% absolute dimensional error. Implementation of the manufacturing methodology in modified circular shape- based SAR microdevices (balanced and unbalanced configurations) showed that, despite the precision limitations of the xurographic process, it is possible to implement this methodology to produce functional micromixing devices. Mixing efficiency was evaluated numerically and experimentally at the outlet of the microdevices with performances up to 40%. Overall, the assessment encourages further research of xurography for the development of POC micromixers.
Design Optimization of Gas Generator Hybrid Propulsion Boosters
NASA Technical Reports Server (NTRS)
Weldon, Vincent; Phillips, Dwight; Fink, Larry
1990-01-01
A methodology used in support of a study for NASA/MSFC to optimize the design of gas generator hybrid propulsion booster for uprating the National Space Transportation System (NSTS) is presented. The objective was to compare alternative configurations for this booster approach, optimizing each candidate concept on different bases, in order to develop data for a trade table on which a final decision was based. The methodology is capable of processing a large number of independent and dependent variables, adjusting the overall subsystems characteristics to arrive at a best compromise integrated design to meet various specific optimization criteria subject to selected constraints. For each system considered, a detailed weight statement was generated along with preliminary cost and reliability estimates.
Qureshi, Adnan I; Gilani, Sarwat; Adil, Malik M; Majidi, Shahram; Hassan, Ameer E; Miley, Jefferson T; Rodriguez, Gustavo J
2014-01-01
Background Telephone consent and two physician consents based on medical necessity are alternate strategies for time sensitive medical decisions but are not uniformly accepted for clinical practice or recruitment into clinical trials. We determined the rate of and associated outcomes with alternate consenting strategies in consecutive acute ischemic stroke patients receiving emergent endovascular treatment. Methods We divided patients into those treated based on in-person consent and those based on alternate strategies. We identified clinical and procedural differences and differences in hospital outcomes: symptomatic ICH and favorable outcome (defined by modified Rankin Scale of 0–2 at discharge) based on consenting methodology. Results Of a total of 159 patients treated, 119 were treated based on in-person consent (by the patient in 27 and legally authorized representative in 92 procedures). Another 40 patients were treated using alternate strategies (20 telephone consents and 20 two physician consents based on medical necessity). There was no difference in the mean ages and proportion of men among the two groups based on consenting methodology. There was a significantly greater time interval incurred between CT scan and initiation of endovascular procedure in those in whom in-person consent was obtained (117 ± 65 min versus 101 ± 45 min, p = 0.01). There was no significant difference in rates of ICH (9% versus 8%, p = 0.9), or favorable outcome at discharge (28% versus 30%, p = 0.8). Conclusions Consent through alternate strategies does not adversely affect procedural characteristics or outcome of patients and may be more time efficient than in-person consenting process. PMID:25132906
Multi objective decision making in hybrid energy system design
NASA Astrophysics Data System (ADS)
Merino, Gabriel Guillermo
The design of grid-connected photovoltaic wind generator system supplying a farmstead in Nebraska has been undertaken in this dissertation. The design process took into account competing criteria that motivate the use of different sources of energy for electric generation. The criteria considered were 'Financial', 'Environmental', and 'User/System compatibility'. A distance based multi-objective decision making methodology was developed to rank design alternatives. The method is based upon a precedence order imposed upon the design objectives and a distance metric describing the performance of each alternative. This methodology advances previous work by combining ambiguous information about the alternatives with a decision-maker imposed precedence order in the objectives. Design alternatives, defined by the photovoltaic array and wind generator installed capacities, were analyzed using the multi-objective decision making approach. The performance of the design alternatives was determined by simulating the system using hourly data for an electric load for a farmstead and hourly averages of solar irradiation, temperature and wind speed from eight wind-solar energy monitoring sites in Nebraska. The spatial variability of the solar energy resource within the region was assessed by determining semivariogram models to krige hourly and daily solar radiation data. No significant difference was found in the predicted performance of the system when using kriged solar radiation data, with the models generated vs. using actual data. The spatial variability of the combined wind and solar energy resources was included in the design analysis by using fuzzy numbers and arithmetic. The best alternative was dependent upon the precedence order assumed for the main criteria. Alternatives with no PV array or wind generator dominated when the 'Financial' criteria preceded the others. In contrast, alternatives with a nil component of PV array but a high wind generator component, dominated when the 'Environment' objective or the 'User/System compatibility' objectives were more important than the 'Financial' objectives and they also dominated when the three criteria were considered equally important.
Martin, Todd M.
2017-01-01
The goal of alternatives assessment (AA) is to facilitate a comparison of alternatives to a chemical of concern, resulting in the identification of safer alternatives. A two stage methodology for comparing chemical alternatives was developed. In the first stage, alternatives are compared using a variety of human health effects, ecotoxicity, and physicochemical properties. Hazard profiles are completed using a variety of online sources and quantitative structure activity relationship models. In the second stage, alternatives are evaluated utilizing an exposure/risk assessment over the entire life cycle. Exposure values are calculated using screening-level near-field and far-field exposure models. The second stage allows one to more accurately compare potential exposure to each alternative and consider additional factors that may not be obvious from separate binned persistence, bioaccumulation, and toxicity scores. The methodology was utilized to compare phosphate-based alternatives for decabromodiphenyl ether (decaBDE) in electronics applications. PMID:29333139
ERIC Educational Resources Information Center
Roman, Elliott M.
The Alternative Learning Methodologies through Academics Project (Project ALMA) was an Elementary and Secondary Education Act Title VII-funded project in its fourth year of operation in two high schools in Queens and the Bronx (New York). The program served 436 Spanish-speaking students, most of whom were of limited English proficiency.…
Emerging and recurrent issues in drug development.
Anello, C
This paper reviews several emerging and recurrent issues relating to the drug development process. These emerging issues include changes to the FDA regulatory environment, internationalization of drug development, advances in computer technology and visualization tools, and efforts to incorporate meta-analysis methodology. Recurrent issues include: renewed interest in statistical methods for handling subgroups in the design and analysis of clinical trials; renewed interest in alternatives to the 'intention-to-treat' analysis in the presence of non-compliance in randomized clinical trials; renewed interest in methodology to address the multiplicities resulting from a variety of sources inherent in the drug development process, and renewed interest in methods to assure data integrity. These emerging and recurrent issues provide a continuing challenge to the international community of statisticians involved in drug development. Moreover, the involvement of statisticians with different perspectives continues to enrich the field and contributes to improvement in the public health.
NASA Astrophysics Data System (ADS)
Kouloumentas, Christos
2011-09-01
The concept of the all-fiberized multi-wavelength regenerator is analyzed, and the design methodology for operation at 40 Gb/s is presented. The specific methodology has been applied in the past for the experimental proof-of-principle of the technique, but it has never been reported in detail. The regenerator is based on a strong dispersion map that is implemented using alternating dispersion compensating fibers (DCF) and single-mode fibers (SMF), and minimizes the nonlinear interaction between the wavelength-division multiplexing (WDM) channels. The optimized regenerator design with + 0.86 ps/nm/km average dispersion of the nonlinear fiber section is further investigated. The specific design is capable of simultaneously processing five WDM channels with 800 GHz channel spacing and providing Q-factor improvement higher than 1 dB for each channel. The cascadeability of the regenerator is also indicated using a 6-node metropolitan network simulation model.
Spatio-Temporal Process Variability in Watershed Scale Wetland Restoration Planning
NASA Astrophysics Data System (ADS)
Evenson, G. R.
2012-12-01
Watershed scale restoration decision making processes are increasingly informed by quantitative methodologies providing site-specific restoration recommendations - sometimes referred to as "systematic planning." The more advanced of these methodologies are characterized by a coupling of search algorithms and ecological models to discover restoration plans that optimize environmental outcomes. Yet while these methods have exhibited clear utility as decision support toolsets, they may be critiqued for flawed evaluations of spatio-temporally variable processes fundamental to watershed scale restoration. Hydrologic and non-hydrologic mediated process connectivity along with post-restoration habitat dynamics, for example, are commonly ignored yet known to appreciably affect restoration outcomes. This talk will present a methodology to evaluate such spatio-temporally complex processes in the production of watershed scale wetland restoration plans. Using the Tuscarawas Watershed in Eastern Ohio as a case study, a genetic algorithm will be coupled with the Soil and Water Assessment Tool (SWAT) to reveal optimal wetland restoration plans as measured by their capacity to maximize nutrient reductions. Then, a so-called "graphical" representation of the optimization problem will be implemented in-parallel to promote hydrologic and non-hydrologic mediated connectivity amongst existing wetlands and sites selected for restoration. Further, various search algorithm mechanisms will be discussed as a means of accounting for temporal complexities such as post-restoration habitat dynamics. Finally, generalized patterns of restoration plan optimality will be discussed as an alternative and possibly superior decision support toolset given the complexity and stochastic nature of spatio-temporal process variability.
Api, A M; Belsito, D; Bruze, M; Cadby, P; Calow, P; Dagli, M L; Dekant, W; Ellis, G; Fryer, A D; Fukayama, M; Griem, P; Hickey, C; Kromidas, L; Lalko, J F; Liebler, D C; Miyachi, Y; Politano, V T; Renskers, K; Ritacco, G; Salvito, D; Schultz, T W; Sipes, I G; Smith, B; Vitale, D; Wilcox, D K
2015-08-01
The Research Institute for Fragrance Materials, Inc. (RIFM) has been engaged in the generation and evaluation of safety data for fragrance materials since its inception over 45 years ago. Over time, RIFM's approach to gathering data, estimating exposure and assessing safety has evolved as the tools for risk assessment evolved. This publication is designed to update the RIFM safety assessment process, which follows a series of decision trees, reflecting advances in approaches in risk assessment and new and classical toxicological methodologies employed by RIFM over the past ten years. These changes include incorporating 1) new scientific information including a framework for choosing structural analogs, 2) consideration of the Threshold of Toxicological Concern (TTC), 3) the Quantitative Risk Assessment (QRA) for dermal sensitization, 4) the respiratory route of exposure, 5) aggregate exposure assessment methodology, 6) the latest methodology and approaches to risk assessments, 7) the latest alternatives to animal testing methodology and 8) environmental risk assessment. The assessment begins with a thorough analysis of existing data followed by in silico analysis, identification of 'read across' analogs, generation of additional data through in vitro testing as well as consideration of the TTC approach. If necessary, risk management may be considered. Copyright © 2014 Elsevier Ltd. All rights reserved.
ARCHITECT: The architecture-based technology evaluation and capability tradeoff method
NASA Astrophysics Data System (ADS)
Griendling, Kelly A.
The use of architectures for the design, development, and documentation of system-of-systems engineering has become a common practice in recent years. This practice became mandatory in the defense industry in 2004 when the Department of Defense Architecture Framework (DoDAF) Promulgation Memo mandated that all Department of Defense (DoD) architectures must be DoDAF compliant. Despite this mandate, there has been significant confusion and a lack of consistency in the creation and the use of the architecture products. Products are typically created as static documents used for communication and documentation purposes that are difficult to change and do not support engineering design activities and acquisition decision making. At the same time, acquisition guidance has been recently reformed to move from the bottom-up approach of the Requirements Generation System (RGS) to the top-down approach mandated by the Joint Capabilities Integration and Devel- opment System (JCIDS), which requires the use of DoDAF to support acquisition. Defense agencies have had difficulty adjusting to this new policy, and are struggling to determine how to meet new acquisition requirements. This research has developed the Architecture-based Technology Evaluation and Capability Tradeoff (ARCHITECT) Methodology to respond to these challenges and address concerns raised about the defense acquisition process, particularly the time required to implement parts of the process, the need to evaluate solutions across capability and mission areas, and the need to use a rigorous, traceable, repeatable method that utilizes modeling and simulation to better substantiate early-phase acquisition decisions. The objective is to create a capability-based systems engineering methodology for the early phases of design and acquisition (specifically Pre-Milestone A activities) which improves agility in defense acquisition by (1) streamlining the development of key elements of JCIDS and DoDAF, (2) moving the creation of DoDAF products forward in the defense acquisition process, and (3) using DoDAF products for more than documentation by integrating them into the problem definition and analysis of alternatives phases and applying executable architecting. This research proposes and demonstrates the plausibility of a prescriptive methodology for developing executable DoDAF products which will explicitly support decision-making in the early phases of JCIDS. A set of criteria by which CBAs should be judged is proposed, and the methodology is developed with these criteria in mind. The methodology integrates existing tools and techniques for systems engineering and system of systems engineering with several new modeling and simulation tools and techniques developed as part of this research to fill gaps noted in prior CBAs. A suppression of enemy air defenses (SEAD) mission is used to demonstrate the ap- plication of ARCHITECT and to show the plausibility of the approach. For the SEAD study, metrics are derived and a gap analysis is performed. The study then identifies and quantitatively compares system and operational architecture alternatives for performing SEAD. A series of down-selections is performed to identify promising architectures, and these promising solutions are subject to further analysis where the impacts of force structure and network structure are examined. While the numerical results of the SEAD study are notional and could not be applied to an actual SEAD CBA, the example served to highlight many of the salient features of the methodology. The SEAD study presented enabled pre-Milestone A tradeoffs to be performed quantitatively across a large number of architectural alternatives in a traceable and repeatable manner. The alternatives considered included variations on operations, systems, organizational responsibilities (through the assignment of systems to tasks), network (or collaboration) structure, interoperability level, and force structure. All of the information used in the study is preserved in the environment, which is dynamic and allows for on-the-fly analysis. The assumptions used were consistent, which was assured through the use of single file documenting all inputs, which was shared across all models. Furthermore, a model was made of the ARCHITECT methodology itself, and was used to demonstrate that even if the steps took twice as long to perform as they did in the case of the SEAD example, the methodology still provides the ability to conduct CBA analyses in less time than prior CBAs to date. Overall, it is shown that the ARCHITECT methodology results in an improvement over current CBAs in the criteria developed here.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitworth, J.; Pearson, M.; Feldman, A.
2006-07-01
The Offsite Source Recovery (OSR) Project at Los Alamos National Laboratory is now shipping transuranic (TRU) waste containers to the Waste Isolation Pilot Plant (WIPP) in New Mexico for disposal. Sealed source waste disposal has become possible in part because OSR personnel were able to obtain Environmental Protection Agency (EPA) and DOE-CBFO approval for an alternative radiological characterization procedure relying on acceptable knowledge (AK) and modeling, rather than on non-destructive assay (NDA) of each container. This is the first successful qualification of an 'alternate methodology' under the radiological characterization requirements of the WIPP Waste Acceptance Criteria (WAC) by any TRUmore » waste generator site. This paper describes the approach OSR uses to radiologically characterize its sealed source waste and the process by which it obtained certification of this approach. (authors)« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schultz-Fellenz, Emily S.
A portion of LANL’s FY15 SPE objectives includes initial ground-based or ground-proximal investigations at the SPE Phase 2 site. The area of interest is the U2ez location in Yucca Flat. This collection serves as a baseline for discrimination of surface features and acquisition of topographic signatures prior to any development or pre-shot activities associated with SPE Phase 2. Our team originally intended to perform our field investigations using previously vetted ground-based (GB) LIDAR methodologies. However, the extended proposed time frame of the GB LIDAR data collection, and associated data processing time and delivery date, were unacceptable. After technical consultation andmore » careful literature research, LANL identified an alternative methodology to achieve our technical objectives and fully support critical model parameterization. Very-low-altitude unmanned aerial systems (UAS) photogrammetry appeared to satisfy our objectives in lieu of GB LIDAR. The SPE Phase 2 baseline collection was used as a test of this UAS photogrammetric methodology.« less
NASA Astrophysics Data System (ADS)
Kamalraj, Devaraj; Yuvaraj, Selvaraj; Yoganand, Coimbatore Paramasivam; Jaffer, Syed S.
2018-01-01
Here, we propose a new synthetic methodology for silver nanocluster preparation by using a double stranded-DNA (ds-DNA) template which no one has reported yet. A new calculative method was formulated to determine the size of the nanocluster and their band gaps by using steady state 3D contour fluorescence technique with Brus model. Generally, the structure and size of the nanoclusters determine by using High Resolution Transmission Electron Microscopy (HR-TEM). Before imaging the samples by using HR-TEM, they are introduced to drying process which causes aggregation and forms bigger polycrystalline particles. It takes long time duration and expensive methodology. In this current methodology, we found out the size and band gap of the nanocluster in the liquid form without any polycrystalline aggregation for which 3D contour fluorescence technique was used as an alternative approach to the HR-TEM method.
Making sense of executive sensemaking. A phenomenological case study with methodological criticism.
Parry, Jonathan
2003-01-01
This paper attempts to answer the research question, "how do senior executives in my organisation make sense of their professional life?" Having reviewed the sensemaking literature, in particular that of the pre-eminent author in this field, Karl E. Weick, I adopt a phenomenological, interpretist orientation which relies on an ideographic, inductive generation of theory. I situate myself, both as researcher and chief executive of the organisation studied, in the narrative of sensemaking. Using semi-structured interviews and a combination of grounded theory and template analysis to generate categories, seven themes of sensemaking are tentatively produced which are then compared with Weick's characteristics. The methodological approach is then reflected on, criticised and alternative methodologies are briefly considered. The conclusion reached is that the themes generated by the research may have relevance for sensemaking processes, but that the production of formal theory through social research is problematic.
Joint EPA/NASA/USAF Interagency Depainting Study
NASA Technical Reports Server (NTRS)
Clark-Ingram, M.
2001-01-01
Environmental regulations such as National Emission Standards for Hazardous Air Pollutants (NESHAPs) are drivers for the implementation of environmentally compliant methodologies in the manufacture of aerospace hardware. In 1995, the Environmental Protection Agency (EPA) promulgated the NESHAP for the Aerospace Manufacture and Rework (Aerospace NESHAP) industry. Affected facilities were to be in compliance by September 1998. Several aerospace manufacturing operations are regulated within the Aerospace NESHAP including Depainting operations. The National Aeronautics and Space Administration (NASA), EPA, and United States Air Force (USAF) combined resources to evaluate the performance of nine alternative depainting processes. The seven alternative depainting processes were: (1) Chemical stripping (non-methylene chloride); (2) Carbon Dioxide Blasting; (3) Xenon Flashlamp; (4) Carbon Dioxide Laser Stripping; (5) Plastic Media Blasting; (6) Sodium Bicarbonate Wet Stripping; and (7) Waterjet Blasting and Wheat Starch Blasting. All epoxy primer and polyurethane top coat system was applied to 2024-T3 clad and non-clad aluminum test specimens. Approximately 200 test specimens were evaluated in this study. Each coupon was subjected to three, four, or five complete depainting cycles. This paper discusses the conclusions from the study including the test protocol, test parameters, and achievable strip rates for the alternative depainting processes. Test data includes immersion corrosion testing, sandwich corrosion testing and hydrogen embrittlement testing for the non-methylene chloride chemical strippers. Additionally, the cumulative effect of the alternative depainting processes on the metallurgical integrity of the test substrate is addressed with the results from tensile and fatigue evaluations.
Identification of the Criteria for Decision Making of Cut-Away Peatland Reuse
NASA Astrophysics Data System (ADS)
Padur, Kadi; Ilomets, Mati; Põder, Tõnis
2017-03-01
The total area of abandoned milled peatlands which need to be rehabilitated for sustainable land-use is nearly 10,000 ha in Estonia. According to the agreement between Estonia and the European Union, Estonia has to create suitable conditions for restoration of 2000 ha of abandoned cut-away peatlands by 2023. The decisions on rehabilitation of abandoned milled peatlands have so far relied on a limited knowledgebase with unestablished methodologies, thus the decision making process needs a significant improvement. This study aims to improve the methodology by identifying the criteria for optimal decision making to ensure sustainable land use planning after peat extraction. Therefore relevant environmental, social and economic restrictive and weighted comparison criteria, which assess reuse alternatives suitability for achieving the goal, is developed in cooperation with stakeholders. Restrictive criteria are arranged into a decision tree to help to determine the implementable reuse alternatives in various situations. Weighted comparison criteria are developed in cooperation with stakeholders to rank the reuse alternatives. The comparison criteria are organised hierarchically into a value tree. In the situation, where the selection of a suitable rehabilitation alternative for a specific milled peatland is going to be made, the weighted comparison criteria values need to be identified and the presented approach supports the optimal and transparent decision making. In addition to Estonian context the general results of the study could also be applied to a cut-away peatlands in other regions with need-based site-dependent modifications of criteria values and weights.
Identification of the Criteria for Decision Making of Cut-Away Peatland Reuse.
Padur, Kadi; Ilomets, Mati; Põder, Tõnis
2017-03-01
The total area of abandoned milled peatlands which need to be rehabilitated for sustainable land-use is nearly 10,000 ha in Estonia. According to the agreement between Estonia and the European Union, Estonia has to create suitable conditions for restoration of 2000 ha of abandoned cut-away peatlands by 2023. The decisions on rehabilitation of abandoned milled peatlands have so far relied on a limited knowledgebase with unestablished methodologies, thus the decision making process needs a significant improvement. This study aims to improve the methodology by identifying the criteria for optimal decision making to ensure sustainable land use planning after peat extraction. Therefore relevant environmental, social and economic restrictive and weighted comparison criteria, which assess reuse alternatives suitability for achieving the goal, is developed in cooperation with stakeholders. Restrictive criteria are arranged into a decision tree to help to determine the implementable reuse alternatives in various situations. Weighted comparison criteria are developed in cooperation with stakeholders to rank the reuse alternatives. The comparison criteria are organised hierarchically into a value tree. In the situation, where the selection of a suitable rehabilitation alternative for a specific milled peatland is going to be made, the weighted comparison criteria values need to be identified and the presented approach supports the optimal and transparent decision making. In addition to Estonian context the general results of the study could also be applied to a cut-away peatlands in other regions with need-based site-dependent modifications of criteria values and weights.
Kim, In-Ah; den-Hollander, Elyn; Lee, Hye-Seong
2018-03-01
Descriptive analysis with a trained sensory panel has thus far been the most well defined methodology to characterize various products. However, in practical terms, intensive training in descriptive analysis has been recognized as a serious defect. To overcome this limitation, various novel rapid sensory profiling methodologies have been suggested in the literature. Among these, attribute-based methodologies such as check-all-that-apply (CATA) questions showed results comparable to those of conventional sensory descriptive analysis. Kim, Hopkinson, van Hout, and Lee (2017a, 2017b) have proposed a novel attribute-based methodology termed the two-step rating-based 'double-faced applicability' test with a novel output measure of applicability magnitude (d' A ) for measuring consumers' product usage experience throughout various product usage stages. In this paper, the potential of the two-step rating-based 'double-faced applicability' test with d' A was investigated as an alternative to conventional sensory descriptive analysis in terms of sensory characterization and product discrimination. Twelve commercial spread products were evaluated using both conventional sensory descriptive analysis with a trained sensory panel and two-step rating-based 'double-faced applicability' test with an untrained sensory panel. The results demonstrated that the 'double-faced applicability' test can be used to provide a direct measure of the applicability magnitude of sensory attributes of the samples tested in terms of d' A for sensory characterization of individual samples and multiple sample comparisons. This suggests that when the appropriate list of attributes to be used in the questionnaire is already available, the two-step rating-based 'double-faced applicability' test with d' A can be used as a more efficient alternative to conventional descriptive analysis, without requiring any intensive training process. Copyright © 2017 Elsevier Ltd. All rights reserved.
Transportation systems evaluation methodology development and applications, phase 3
NASA Technical Reports Server (NTRS)
Kuhlthau, A. R.; Jacobson, I. D.; Richards, L. C.
1981-01-01
Transportation systems or proposed changes in current systems are evaluated. Four principal evaluation criteria are incorporated in the process, operating performance characteristics as viewed by potential users, decisions based on the perceived impacts of the system, estimating what is required to reduce the system to practice; and predicting the ability of the concept to attract financial support. A series of matrix multiplications in which the various matrices represent evaluations in a logical sequence of the various discrete steps in a management decision process is used. One or more alternatives are compared with the current situation, and the result provides a numerical rating which determines the desirability of each alternative relative to the norm and to each other. The steps in the decision process are isolated so that contributions of each to the final result are readily analyzed. The ability to protect against bias on the part of the evaluators, and the fact that system parameters which are basically qualitative in nature can be easily included are advantageous.
[Ethical problems in the selection of embryos with therapeutic usefulness].
Collazo Chao, Eliseo
2010-01-01
The first saviour sibling produced entirely in Spain was born in Hospital Virgen del Rocío of Seville in October 2008. The consequente mass media coverage has unleashed multiple requests for similar treatments to be carried on throughout the country. The process, its methodology and efficiency are revised. Their anthropological, ethical and deontological foundations are explored in order to assess their fulfillment. Umbilical cord banking blood is proposed as an alternative.
DOE Office of Scientific and Technical Information (OSTI.GOV)
García-Sánchez, Tania; Gómez-Lázaro, Emilio; Muljadi, E.
An alternative approach to characterise real voltage dips is proposed and evaluated in this study. The proposed methodology is based on voltage-space vector solutions, identifying parameters for ellipses trajectories by using the least-squares algorithm applied on a sliding window along the disturbance. The most likely patterns are then estimated through a clustering process based on the k-means algorithm. The objective is to offer an efficient and easily implemented alternative to characterise faults and visualise the most likely instantaneous phase-voltage evolution during events through their corresponding voltage-space vector trajectories. This novel solution minimises the data to be stored but maintains extensivemore » information about the dips including starting and ending transients. The proposed methodology has been applied satisfactorily to real voltage dips obtained from intensive field-measurement campaigns carried out in a Spanish wind power plant up to a time period of several years. A comparison to traditional minimum root mean square-voltage and time-duration classifications is also included in this study.« less
Dipolar recoupling in solid state NMR by phase alternating pulse sequences
Lin, J.; Bayro, M.; Griffin, R. G.; Khaneja, N.
2009-01-01
We describe some new developments in the methodology of making heteronuclear and homonuclear recoupling experiments in solid state NMR insensitive to rf-inhomogeneity by phase alternating the irradiation on the spin system every rotor period. By incorporating delays of half rotor periods in the pulse sequences, these phase alternating experiments can be made γ encoded. The proposed methodology is conceptually different from the standard methods of making recoupling experiments robust by the use of ramps and adiabatic pulses in the recoupling periods. We show how the concept of phase alternation can be incorporated in the design of homonuclear recoupling experiments that are both insensitive to chemical-shift dispersion and rf-inhomogeneity. PMID:19157931
DOT National Transportation Integrated Search
2013-09-01
Recent advances in multivariate methodology provide an opportunity to further the assessment of service offerings in public transportation for work commuting. We offer methodologies that are alternative to direct rating scale and have advantages in t...
LCA of greywater management within a water circular economy restorative thinking framework.
Dominguez, Sara; Laso, Jara; Margallo, María; Aldaco, Rubén; Rivero, Maria J; Irabien, Ángel; Ortiz, Inmaculada
2018-04-15
Greywater reuse is an attractive option for the sustainable management of water under water scarcity circumstances, within a water circular economy restorative thinking framework. Its successful deployment relies on the availability of low cost and environmentally friendly technologies. The life cycle assessment (LCA) approach provides the appropriate methodological tool for the evaluation of alternative treatments based on environmental decision criteria and, therefore, it is highly useful during the process conceptual design. This methodology should be employed in the early design phase to select those technologies with lower environmental impact. This work reports the comparative LCA of three scenarios for greywater reuse: photocatalysis, photovoltaic solar-driven photocatalysis and membrane biological reactor, in order to help the selection of the most environmentally friendly technology. The study has been focused on the removal of the surfactant sodium dodecylbenzenesulfonate, which is used in the formulation of detergents and personal care products and, thus, widely present in greywater. LCA was applied using the Environmental Sustainability Assessment methodology to obtain two main environmental indicators in order to simplify the decision making process: natural resources and environmental burdens. Energy consumption is the main contributor to both indicators owing to the high energy consumption of the light source for the photocatalytic greywater treatment. In order to reduce its environmental burdens, the most desirable scenario would be the use of solar light for the photocatalytic transformation. However, while the technological challenge of direct use of solar light is approached, the environmental suitability of the photovoltaic solar energy driven photocatalysis technology to greywater reuse has been demonstrated, as it involves the smallest environmental impact among the three studied alternatives. Copyright © 2017 Elsevier B.V. All rights reserved.
Design optimization of gas generator hybrid propulsion boosters
NASA Technical Reports Server (NTRS)
Weldon, Vincent; Phillips, Dwight U.; Fink, Lawrence E.
1990-01-01
A methodology used in support of a contract study for NASA/MSFC to optimize the design of gas generator hybrid propulsion booster for uprating the National Space Transportation System (NSTS) is presented. The objective was to compare alternative configurations for this booster approach, optimizing each candidate concept on different bases, in order to develop data for a trade table on which a final decision was based. The methodology is capable of processing a large number of independent and dependent variables, adjusting the overall subsystems characteristics to arrive at a best compromise integrated design to meet various specified optimization criteria subject to selected constraints. For each system considered, a detailed weight statement was generated along with preliminary cost and reliability estimates.
Advances in bioanalytical techniques to measure steroid hormones in serum.
French, Deborah
2016-06-01
Steroid hormones are measured clinically to determine if a patient has a pathological process occurring in the adrenal gland, or other hormone responsive organs. They are very similar in structure making them analytically challenging to measure. Additionally, these hormones have vast concentration differences in human serum adding to the measurement complexity. GC-MS was the gold standard methodology used to measure steroid hormones clinically, followed by radioimmunoassay, but that was replaced by immunoassay due to ease of use. LC-MS/MS has now become a popular alternative owing to simplified sample preparation than for GC-MS and increased specificity and sensitivity over immunoassay. This review will discuss these methodologies and some new developments that could simplify and improve steroid hormone analysis in serum.
Molenaar, Peter C M
2007-03-01
I am in general agreement with Toomela's (Integrative Psychological and Behavioral Science doi:10.1007/s12124-007-9004-0, 2007) plea for an alternative psychological methodology inspired by his description of the German-Austrian orientation. I will argue, however, that this alternative methodology has to be based on the classical ergodic theorems, using state-of-the-art statistical time series analysis of intra-individual variation as its main tool. Some more specific points made by Toomela will be criticized, while for others a more extreme elaboration along the lines indicated by Toomela is proposed.
Alternative occupied volume integrity (OVI) tests and analyses.
DOT National Transportation Integrated Search
2013-10-01
FRA, supported by the Volpe Center, conducted research on alternative methods of evaluating occupied volume integrity (OVI) in passenger railcars. Guided by this research, an alternative methodology for evaluating OVI that ensures an equivalent or gr...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ekmekcioglu, Mehmet, E-mail: meceng3584@yahoo.co; Kaya, Tolga; Kahraman, Cengiz
The use of fuzzy multiple criteria analysis (MCA) in solid waste management has the advantage of rendering subjective and implicit decision making more objective and analytical, with its ability to accommodate both quantitative and qualitative data. In this paper a modified fuzzy TOPSIS methodology is proposed for the selection of appropriate disposal method and site for municipal solid waste (MSW). Our method is superior to existing methods since it has capability of representing vague qualitative data and presenting all possible results with different degrees of membership. In the first stage of the proposed methodology, a set of criteria of cost,more » reliability, feasibility, pollution and emission levels, waste and energy recovery is optimized to determine the best MSW disposal method. Landfilling, composting, conventional incineration, and refuse-derived fuel (RDF) combustion are the alternatives considered. The weights of the selection criteria are determined by fuzzy pairwise comparison matrices of Analytic Hierarchy Process (AHP). It is found that RDF combustion is the best disposal method alternative for Istanbul. In the second stage, the same methodology is used to determine the optimum RDF combustion plant location using adjacent land use, climate, road access and cost as the criteria. The results of this study illustrate the importance of the weights on the various factors in deciding the optimized location, with the best site located in Catalca. A sensitivity analysis is also conducted to monitor how sensitive our model is to changes in the various criteria weights.« less
An Independent Evaluation of the FMEA/CIL Hazard Analysis Alternative Study
NASA Technical Reports Server (NTRS)
Ray, Paul S.
1996-01-01
The present instruments of safety and reliability risk control for a majority of the National Aeronautics and Space Administration (NASA) programs/projects consist of Failure Mode and Effects Analysis (FMEA), Hazard Analysis (HA), Critical Items List (CIL), and Hazard Report (HR). This extensive analytical approach was introduced in the early 1970's and was implemented for the Space Shuttle Program by NHB 5300.4 (1D-2. Since the Challenger accident in 1986, the process has been expanded considerably and resulted in introduction of similar and/or duplicated activities in the safety/reliability risk analysis. A study initiated in 1995, to search for an alternative to the current FMEA/CIL Hazard Analysis methodology generated a proposed method on April 30, 1996. The objective of this Summer Faculty Study was to participate in and conduct an independent evaluation of the proposed alternative to simplify the present safety and reliability risk control procedure.
Haith-Cooper, Melanie
2003-01-01
The use of problem-based learning (PBL) in Health Professional curricula is becoming more wide spread. Although the way in which the tutor facilitates PBL can have a major impact on students' learning (Andrews and Jones 1996), the literature provides little consistency as to how the tutor can effectively facilitate PBL (Haith-Cooper 2000). It is therefore important to examine the facilitation role to promote effective learning through the use of PBL. This article is the first of two parts exploring a study that was undertaken to investigate tutors' experiences of facilitating PBL. This part focuses on the methodology and the combining of innovative processes with traditional philosophical traditions to develop a systematic educational research methodology. The study was undertaken respecting the philosophy of hermeneutic phenomenology but utilised alternative data collection and analysis technique. Video conferencing and e-mail were used in conjunction with more traditional processes to access a worldwide sample. This paper explores some of the issues that arose when undertaking such a study. The second article then focuses on exploring the findings of the study and their implications for the facilitation of PBL.
The Problem of Multiple Criteria Selection of the Surface Mining Haul Trucks
NASA Astrophysics Data System (ADS)
Bodziony, Przemysław; Kasztelewicz, Zbigniew; Sawicki, Piotr
2016-06-01
Vehicle transport is a dominant type of technological processes in rock mines, and its profit ability is strictly dependent on overall cost of its exploitation, especially on diesel oil consumption. Thus, a rational design of transportation system based on haul trucks should result from thorough analysis of technical and economic issues, including both cost of purchase and its further exploitation, having a crucial impact on the cost of minerals extraction. Moreover, off-highway trucks should be selected with respect to all specific exploitation conditions and even the user's preferences and experience. In this paper a development of universal family of evaluation criteria as well as application of evaluation method for haul truck selection process for a specific exploitation conditions in surface mining have been carried out. The methodology presented in the paper is based on the principles of multiple criteria decision aiding (MCDA) using one of the ranking method, i.e. ELECTRE III. The applied methodology has been allowed for ranking of alternative solution (variants), on the considered set of haul trucks. The result of the research is a universal methodology, and it consequently may be applied in other surface mines with similar exploitation parametres.
MASQOT: a method for cDNA microarray spot quality control
Bylesjö, Max; Eriksson, Daniel; Sjödin, Andreas; Sjöström, Michael; Jansson, Stefan; Antti, Henrik; Trygg, Johan
2005-01-01
Background cDNA microarray technology has emerged as a major player in the parallel detection of biomolecules, but still suffers from fundamental technical problems. Identifying and removing unreliable data is crucial to prevent the risk of receiving illusive analysis results. Visual assessment of spot quality is still a common procedure, despite the time-consuming work of manually inspecting spots in the range of hundreds of thousands or more. Results A novel methodology for cDNA microarray spot quality control is outlined. Multivariate discriminant analysis was used to assess spot quality based on existing and novel descriptors. The presented methodology displays high reproducibility and was found superior in identifying unreliable data compared to other evaluated methodologies. Conclusion The proposed methodology for cDNA microarray spot quality control generates non-discrete values of spot quality which can be utilized as weights in subsequent analysis procedures as well as to discard spots of undesired quality using the suggested threshold values. The MASQOT approach provides a consistent assessment of spot quality and can be considered an alternative to the labor-intensive manual quality assessment process. PMID:16223442
The Use of HFC (CFC Free) Processes at the NASA Stennis Space Center
NASA Technical Reports Server (NTRS)
Ross, Richard H.
1997-01-01
The search for ozone depleting alternative chemicals was heightened when, in 1990, the more than 65 countries that had signed the Montreal Protocol agreed to phase out completely by the year 2000. In 1992, then-president Bush advanced this date for the United States to January l, 1996. In 1991, it was realized that the planned phase out and eventual elimination of ozone depleting chemicals imposed by the Montreal Protocol and the resulting Clean Air Act (CAA) amendments would impact the cleaning and testing of aerospace hardware at the NASA Stennis Space Center. Because of this regulation, the Test & Engineering Sciences Laboratory has been working on solvent conversion studies to replace CFC-113. Aerospace hardware and test equipment used in rocket propulsion systems require extreme cleanliness levels to function and maintain their integrity. Because the cleanliness of aerospace hardware will be affected by the elimination of CFC-113; alternate cleaning technologies, including the use of fluoridated solvents have been studied as potential replacements. Several aqueous processes have been identified for cleaning moderately sized components. However, no known aqueous alternative exists for cleaning and validating T&ME and complex geometry based hardware. This paper discusses the choices and the methodologies that were used to screen potential alternatives to CFC-113.
Stevenson, Fiona A; Gibson, William; Pelletier, Caroline; Chrysikou, Vasiliki; Park, Sophie
2015-05-08
UK-based research conducted within a healthcare setting generally requires approval from the National Research Ethics Service. Research ethics committees are required to assess a vast range of proposals, differing in both their topic and methodology. We argue the methodological benchmarks with which research ethics committees are generally familiar and which form the basis of assessments of quality do not fit with the aims and objectives of many forms of qualitative inquiry and their more iterative goals of describing social processes/mechanisms and making visible the complexities of social practices. We review current debates in the literature related to ethical review and social research, and illustrate the importance of re-visiting the notion of ethics in healthcare research. We present an analysis of two contrasting paradigms of ethics. We argue that the first of these is characteristic of the ways that NHS ethics boards currently tend to operate, and the second is an alternative paradigm, that we have labelled the 'iterative' paradigm, which draws explicitly on methodological issues in qualitative research to produce an alternative vision of ethics. We suggest that there is an urgent need to re-think the ways that ethical issues are conceptualised in NHS ethical procedures. In particular, we argue that embedded in the current paradigm is a restricted notion of 'quality', which frames how ethics are developed and worked through. Specific, pre-defined outcome measures are generally seen as the traditional marker of quality, which means that research questions that focus on processes rather than on 'outcomes' may be regarded as problematic. We show that the alternative 'iterative' paradigm offers a useful starting point for moving beyond these limited views. We conclude that a 'one size fits all' standardisation of ethical procedures and approach to ethical review acts against the production of knowledge about healthcare and dramatically restricts what can be known about the social practices and conditions of healthcare. Our central argument is that assessment of ethical implications is important, but that the current paradigm does not facilitate an adequate understanding of the very issues it aims to invigilate.
Effective Information Systems: What's the Secret?
ERIC Educational Resources Information Center
Kirkham, Sandi
1994-01-01
Argues that false assumptions about user needs implicit in methodologies for building information systems have resulted in inadequate and inflexible systems. Checkland's Soft Systems Methodology is examined as a useful alternative. Its fundamental features are described, and examples of models demonstrate how the methodology can facilitate…
Analysis of Alternatives for Risk Assessment Methodologies and Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nachtigal, Noel M.; Fruetel, Julia A.; Gleason, Nathaniel J.
The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in themore » risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.« less
A Soft Sensor for Bioprocess Control Based on Sequential Filtering of Metabolic Heat Signals
Paulsson, Dan; Gustavsson, Robert; Mandenius, Carl-Fredrik
2014-01-01
Soft sensors are the combination of robust on-line sensor signals with mathematical models for deriving additional process information. Here, we apply this principle to a microbial recombinant protein production process in a bioreactor by exploiting bio-calorimetric methodology. Temperature sensor signals from the cooling system of the bioreactor were used for estimating the metabolic heat of the microbial culture and from that the specific growth rate and active biomass concentration were derived. By applying sequential digital signal filtering, the soft sensor was made more robust for industrial practice with cultures generating low metabolic heat in environments with high noise level. The estimated specific growth rate signal obtained from the three stage sequential filter allowed controlled feeding of substrate during the fed-batch phase of the production process. The biomass and growth rate estimates from the soft sensor were also compared with an alternative sensor probe and a capacitance on-line sensor, for the same variables. The comparison showed similar or better sensitivity and lower variability for the metabolic heat soft sensor suggesting that using permanent temperature sensors of a bioreactor is a realistic and inexpensive alternative for monitoring and control. However, both alternatives are easy to implement in a soft sensor, alone or in parallel. PMID:25264951
A soft sensor for bioprocess control based on sequential filtering of metabolic heat signals.
Paulsson, Dan; Gustavsson, Robert; Mandenius, Carl-Fredrik
2014-09-26
Soft sensors are the combination of robust on-line sensor signals with mathematical models for deriving additional process information. Here, we apply this principle to a microbial recombinant protein production process in a bioreactor by exploiting bio-calorimetric methodology. Temperature sensor signals from the cooling system of the bioreactor were used for estimating the metabolic heat of the microbial culture and from that the specific growth rate and active biomass concentration were derived. By applying sequential digital signal filtering, the soft sensor was made more robust for industrial practice with cultures generating low metabolic heat in environments with high noise level. The estimated specific growth rate signal obtained from the three stage sequential filter allowed controlled feeding of substrate during the fed-batch phase of the production process. The biomass and growth rate estimates from the soft sensor were also compared with an alternative sensor probe and a capacitance on-line sensor, for the same variables. The comparison showed similar or better sensitivity and lower variability for the metabolic heat soft sensor suggesting that using permanent temperature sensors of a bioreactor is a realistic and inexpensive alternative for monitoring and control. However, both alternatives are easy to implement in a soft sensor, alone or in parallel.
NASA Astrophysics Data System (ADS)
Pérez-Aparicio, Elena; Lillo-Bravo, Isidoro; Moreno-Tejera, Sara; Silva-Pérez, Manuel
2017-06-01
Thermal energy for industrial processes can be generated using thermal (ST) or photovoltaic (PV) solar energy. ST energy has traditionally been the most favorable option due to its cost and efficiency. Current costs and efficiencies values make the PV solar energy become an alternative to ST energy as supplier of industrial process heat. The aim of this study is to provide a useful tool to decide in each case which option is economically and environmentally the most suitable alternative. The methodology used to compare ST and PV systems is based on the calculation of the levelized cost of energy (LCOE) and greenhouse gas emissions (GHG) avoided by using renewable technologies instead of conventional sources of energy. In both cases, these calculations depend on costs and efficiencies associated with ST or PV systems and the conversion factor from thermal or electrical energy to GHG. To make these calculations, a series of hypotheses are assumed related to consumer and energy prices, operation, maintenance and replacement costs, lifetime of the system or working temperature of the industrial process. This study applies the methodology at five different sites which have been selected taking into account their radiometric and meteorological characteristics. In the case of ST energy three technologies are taken into account, compound parabolic concentrator (CPC), linear Fresnel collector (LFC) and parabolic trough collector (PTC). The PV option includes two ways of use of generated electricity, an electrical resistance or a combination of an electrical resistance and a heat pump (HP). Current values of costs and efficiencies make ST system remains as the most favorable option. These parameters may vary significantly over time. The evolution of these parameters may convert PV systems into the most favorable option for particular applications.
Ultrasonic sensor based defect detection and characterisation of ceramics.
Kesharaju, Manasa; Nagarajah, Romesh; Zhang, Tonzhua; Crouch, Ian
2014-01-01
Ceramic tiles, used in body armour systems, are currently inspected visually offline using an X-ray technique that is both time consuming and very expensive. The aim of this research is to develop a methodology to detect, locate and classify various manufacturing defects in Reaction Sintered Silicon Carbide (RSSC) ceramic tiles, using an ultrasonic sensing technique. Defects such as free silicon, un-sintered silicon carbide material and conventional porosity are often difficult to detect using conventional X-radiography. An alternative inspection system was developed to detect defects in ceramic components using an Artificial Neural Network (ANN) based signal processing technique. The inspection methodology proposed focuses on pre-processing of signals, de-noising, wavelet decomposition, feature extraction and post-processing of the signals for classification purposes. This research contributes to developing an on-line inspection system that would be far more cost effective than present methods and, moreover, assist manufacturers in checking the location of high density areas, defects and enable real time quality control, including the implementation of accept/reject criteria. Copyright © 2013 Elsevier B.V. All rights reserved.
Optimisation of Critical Infrastructure Protection: The SiVe Project on Airport Security
NASA Astrophysics Data System (ADS)
Breiing, Marcus; Cole, Mara; D'Avanzo, John; Geiger, Gebhard; Goldner, Sascha; Kuhlmann, Andreas; Lorenz, Claudia; Papproth, Alf; Petzel, Erhard; Schwetje, Oliver
This paper outlines the scientific goals, ongoing work and first results of the SiVe research project on critical infrastructure security. The methodology is generic while pilot studies are chosen from airport security. The outline proceeds in three major steps, (1) building a threat scenario, (2) development of simulation models as scenario refinements, and (3) assessment of alternatives. Advanced techniques of systems analysis and simulation are employed to model relevant airport structures and processes as well as offences. Computer experiments are carried out to compare and optimise alternative solutions. The optimality analyses draw on approaches to quantitative risk assessment recently developed in the operational sciences. To exploit the advantages of the various techniques, an integrated simulation workbench is build up in the project.
Roibás, Laura; Loiseau, Eléonore; Hospido, Almudena
2018-07-01
On a previous study, the carbon footprint (CF) of all production and consumption activities of Galicia, an Autonomous Community located in the north-west of Spain, was determined and the results were used to devise strategies aimed at the reduction and mitigation of the greenhouse gas (GHG) emissions. The territorial LCA methodology was used there to perform the calculations. However, that methodology was initially designed to compute the emissions of all types of polluting substances to the environment (several thousands of substances considered in the life cycle inventories), aimed at performing complete LCA studies. This requirement implies the use of specific modelling approaches and databases that in turn raised some difficulties, i.e., need of large amounts of data (which increased gathering times), low temporal, geographical and technological representativeness of the study, lack of data, and presence of double counting issues when trying to combine the sectorial CF results into those of the total economy. In view of these of difficulties, and considering the need to focus only on GHG emissions, it seems important to improve the robustness of the CF computation while proposing a simplified methodology. This study is the result of those efforts to improve the aforementioned methodology. In addition to the territorial LCA approach, several Input-Output (IO) based alternatives have been used here to compute direct and indirect GHG emissions of all Galician production and consumption activities. The results of the different alternatives were compared and evaluated under a multi-criteria approach considering reliability, completeness, temporal and geographical correlation, applicability and consistency. Based on that, an improved and simplified methodology was proposed to determine the CF of the Galician consumption and production activities from a total responsibility perspective. This methodology adequately reflects the current characteristics of the Galician economy, thus increasing the representativeness of the results, and can be applied to any region in which IO tables and environmental vectors are available. This methodology could thus provide useful information in decision making processes to reduce and prevent GHG emissions. Copyright © 2018 Elsevier Ltd. All rights reserved.
Fernández-Santander, Ana
2008-01-01
The informal activities of cooperative learning and short periods of lecturing has been combined and used in the university teaching of biochemistry as part of the first year course of Optics and Optometry in the academic years 2004-2005 and 2005-2006. The lessons were previously elaborated by the teacher and included all that is necessary to understand the topic (text, figures, graphics, diagrams, pictures, etc.). Additionally, a questionnaire was prepared for every chapter. All lessons contained three parts: objectives, approach and development, and the assessment of the topic. Team work, responsibility, and communication skills were some of the abilities developed with this new methodology. Students worked collaboratively in small groups of two or three following the teacher's instructions with short periods of lecturing that clarified misunderstood concepts. Homework was minimized. On comparing this combined methodology with the traditional one (only lecture), students were found to exhibit a higher satisfaction with the new method. They were more involved in the learning process and had a better attitude toward the subject. The use of this new methodology showed a significant increase in the mean score of the students' academic results. The rate of students who failed the subject was significantly inferior in comparison with those who failed in the previous years when only lecturing was applied. This combined methodology helped the teacher to observe the apprenticeship process of students better and to act as a facilitator in the process of building students' knowledge. Copyright © 2008 International Union of Biochemistry and Molecular Biology, Inc.
Rosset, Peter Michael; Sosa, Braulio Machín; Jaime, Adilén María Roque; Lozano, Dana Rocío Ávila
2011-01-01
Agroecology has played a key role in helping Cuba survive the crisis caused by the collapse of the socialist bloc in Europe and the tightening of the US trade embargo. Cuban peasants have been able to boost food production without scarce and expensive imported agricultural chemicals by first substituting more ecological inputs for the no longer available imports, and then by making a transition to more agroecologically integrated and diverse farming systems. This was possible not so much because appropriate alternatives were made available, but rather because of the Campesino-a-Campesino (CAC) social process methodology that the National Association of Small Farmers (ANAP) used to build a grassroots agroecology movement. This paper was produced in a 'self-study' process spearheaded by ANAP and La Via Campesina, the international agrarian movement of which ANAP is a member. In it we document and analyze the history of the Campesino-to-Campesino Agroecology Movement (MACAC), and the significantly increased contribution of peasants to national food production in Cuba that was brought about, at least in part, due to this movement. Our key findings are (i) the spread of agroecology was rapid and successful largely due to the social process methodology and social movement dynamics, (ii) farming practices evolved over time and contributed to significantly increased relative and absolute production by the peasant sector, and (iii) those practices resulted in additional benefits including resilience to climate change.
NASA Astrophysics Data System (ADS)
Swartz, Charles S.
2003-05-01
The process of distributing and exhibiting a motion picture has changed little since the Lumière brothers presented the first motion picture to an audience in 1895. While this analog photochemical process is capable of producing screen images of great beauty and expressive power, more often the consumer experience is diminished by third generation prints and by the wear and tear of the mechanical process. Furthermore, the film industry globally spends approximately $1B annually manufacturing and shipping prints. Alternatively, distributing digital files would theoretically yield great benefits in terms of image clarity and quality, lower cost, greater security, and more flexibility in the cinema (e.g., multiple language versions). In order to understand the components of the digital cinema chain and evaluate the proposed technical solutions, the Entertainment Technology Center at USC in 2000 established the Digital Cinema Laboratory as a critical viewing environment, with the highest quality film and digital projection equipment. The presentation describes the infrastructure of the Lab, test materials, and testing methodologies developed for compression evaluation, and lessons learned up to the present. In addition to compression, the Digital Cinema Laboratory plans to evaluate other components of the digital cinema process as well.
Ialongo, Cristiano; Bernardini, Sergio
2018-06-18
There is a compelling need for quality tools that enable effective control of the extra-analytical phase. In this regard, Six Sigma seems to offer a valid methodological and conceptual opportunity, and in recent times, the International Federation of Clinical Chemistry and Laboratory Medicine has adopted it for indicating the performance requirements for non-analytical laboratory processes. However, the Six Sigma implies a distinction between short-term and long-term quality that is based on the dynamics of the processes. These concepts are still not widespread and applied in the field of laboratory medicine although they are of fundamental importance to exploit the full potential of this methodology. This paper reviews the Six Sigma quality concepts and shows how they originated from Shewhart's control charts, in respect of which they are not an alternative but a completion. It also discusses the dynamic nature of process and how it arises, concerning particularly the long-term dynamic mean variation, and explains why this leads to the fundamental distinction of quality we previously mentioned.
Carbon Capture and Utilization in the Industrial Sector.
Psarras, Peter C; Comello, Stephen; Bains, Praveen; Charoensawadpong, Panunya; Reichelstein, Stefan; Wilcox, Jennifer
2017-10-03
The fabrication and manufacturing processes of industrial commodities such as iron, glass, and cement are carbon-intensive, accounting for 23% of global CO 2 emissions. As a climate mitigation strategy, CO 2 capture from flue gases of industrial processes-much like that of the power sector-has not experienced wide adoption given its high associated costs. However, some industrial processes with relatively high CO 2 flue concentration may be viable candidates to cost-competitively supply CO 2 for utilization purposes (e.g., polymer manufacturing, etc.). This work develops a methodology that determines the levelized cost ($/tCO 2 ) of separating, compressing, and transporting carbon dioxide. A top-down model determines the cost of separating and compressing CO 2 across 18 industrial processes. Further, the study calculates the cost of transporting CO 2 via pipeline and tanker truck to appropriately paired sinks using a bottom-up cost model and geo-referencing approach. The results show that truck transportation is generally the low-cost alternative given the relatively small volumes (ca. 100 kt CO 2 /a). We apply our methodology to a regional case study in Pennsylvania, which shows steel and cement manufacturing paired to suitable sinks as having the lowest levelized cost of capture, compression, and transportation.
Using Alternate Reality Games to Support First Year Induction with ELGG
ERIC Educational Resources Information Center
Piatt, Katie
2009-01-01
Purpose: This paper aims to describe a pilot project investigating the use of alternate reality game/treasure-hunt formats to provide an alternative to existing mechanisms for introducing new students to university information and services. Design/methodology/approach: An alternate reality game was designed to be played online and offline, which…
The Development of a Methodology for Estimating the Cost of Air Force On-the-Job Training.
ERIC Educational Resources Information Center
Samers, Bernard N.; And Others
The Air Force uses a standardized costing methodology for resident technical training schools (TTS); no comparable methodology exists for computing the cost of on-the-job training (OJT). This study evaluates three alternative survey methodologies and a number of cost models for estimating the cost of OJT for airmen training in the Administrative…
NASA Astrophysics Data System (ADS)
Subagadis, Yohannes Hagos; Schütze, Niels; Grundmann, Jens
2014-05-01
An amplified interconnectedness between a hydro-environmental and socio-economic system brings about profound challenges of water management decision making. In this contribution, we present a fuzzy stochastic approach to solve a set of decision making problems, which involve hydrologically, environmentally, and socio-economically motivated criteria subjected to uncertainty and ambiguity. The proposed methodological framework combines objective and subjective criteria in a decision making procedure for obtaining an acceptable ranking in water resources management alternatives under different type of uncertainty (subjective/objective) and heterogeneous information (quantitative/qualitative) simultaneously. The first step of the proposed approach involves evaluating the performance of alternatives with respect to different types of criteria. The ratings of alternatives with respect to objective and subjective criteria are evaluated by simulation-based optimization and fuzzy linguistic quantifiers, respectively. Subjective and objective uncertainties related to the input information are handled through linking fuzziness and randomness together. Fuzzy decision making helps entail the linguistic uncertainty and a Monte Carlo simulation process is used to map stochastic uncertainty. With this framework, the overall performance of each alternative is calculated using an Order Weighted Averaging (OWA) aggregation operator accounting for decision makers' experience and opinions. Finally, ranking is achieved by conducting pair-wise comparison of management alternatives. This has been done on the basis of the risk defined by the probability of obtaining an acceptable ranking and mean difference in total performance for the pair of management alternatives. The proposed methodology is tested in a real-world hydrosystem, to find effective and robust intervention strategies for the management of a coastal aquifer system affected by saltwater intrusion due to excessive groundwater extraction for irrigated agriculture and municipal use. The results show that the approach gives useful support for robust decision-making and is sensitive to the decision makers' degree of optimism.
The Future Impact of Wind on BPA Power System Ancillary Services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makarov, Yuri V.; Lu, Shuai; McManus, Bart
Wind power is growing in a very fast pace as an alternative generating resource. As the ratio of wind power over total system capacity increases, the impact of wind on various system aspects becomes significant. This paper presents a methodology to study the future impact of wind on BPA power system ancillary services including load following and regulation. Existing approaches for similar analysis include dispatch model simulation and standard deviation evaluation. The methodology proposed in this paper uses historical data and stochastic processes to simulate the load balancing processes in BPA power system. Then capacity, ramp rate and ramp durationmore » characteristics are extracted from the simulation results, and load following and regulation requirements are calculated accordingly. It mimics the actual power system operations therefore the results can be more realistic yet the approach is convenient to perform. Further, the ramp rate and ramp duration data obtained from the analysis can be used to evaluate generator response or maneuverability and energy requirement, respectively, additional to the capacity requirement.« less
Wang, Ya-Qi; Wu, Zhen-Feng; Ke, Gang; Yang, Ming
2014-12-31
An effective vacuum assisted extraction (VAE) technique was proposed for the first time and applied to extract bioactive components from Andrographis paniculata. The process was carefully optimized by response surface methodology (RSM). Under the optimized experimental conditions, the best results were obtained using a boiling temperature of 65 °C, 50% ethanol concentration, 16 min of extraction time, one extraction cycles and a 12:1 liquid-solid ratio. Compared with conventional ultrasonic assisted extraction and heat reflux extraction, the VAE technique gave shorter extraction times and remarkable higher extraction efficiency, which indicated that a certain degree of vacuum gave the solvent a better penetration of the solvent into the pores and between the matrix particles, and enhanced the process of mass transfer. The present results demonstrated that VAE is an efficient, simple and fast method for extracting bioactive components from A. paniculata, which shows great potential for becoming an alternative technique for industrial scale-up applications.
González-Sáiz, J M; Esteban-Díez, I; Rodríguez-Tecedor, S; Pérez-Del-Notario, N; Arenzana-Rámila, I; Pizarro, C
2014-12-15
The aim of the present work was to evaluate the effect of the main factors conditioning accelerated ageing processes (oxygen dose, chip dose, wood origin, toasting degree and maceration time) on the phenolic and chromatic profiles of red wines by using a multivariate strategy based on experimental design methodology. The results obtained revealed that the concentrations of monomeric anthocyanins and flavan-3-ols could be modified through the application of particular experimental conditions. This fact was particularly remarkable since changes in phenolic profile were closely linked to changes observed in chromatic parameters. The main strength of this study lies in the possibility of using its conclusions as a basis to make wines with specific colour properties based on quality criteria. To our knowledge, the influence of such a large number of alternative ageing parameters on wine phenolic composition and chromatic attributes has not been studied previously using a comprehensive experimental design methodology. Copyright © 2014 Elsevier Ltd. All rights reserved.
Advanced Machine Learning Emulators of Radiative Transfer Models
NASA Astrophysics Data System (ADS)
Camps-Valls, G.; Verrelst, J.; Martino, L.; Vicent, J.
2017-12-01
Physically-based model inversion methodologies are based on physical laws and established cause-effect relationships. A plethora of remote sensing applications rely on the physical inversion of a Radiative Transfer Model (RTM), which lead to physically meaningful bio-geo-physical parameter estimates. The process is however computationally expensive, needs expert knowledge for both the selection of the RTM, its parametrization and the the look-up table generation, as well as its inversion. Mimicking complex codes with statistical nonlinear machine learning algorithms has become the natural alternative very recently. Emulators are statistical constructs able to approximate the RTM, although at a fraction of the computational cost, providing an estimation of uncertainty, and estimations of the gradient or finite integral forms. We review the field and recent advances of emulation of RTMs with machine learning models. We posit Gaussian processes (GPs) as the proper framework to tackle the problem. Furthermore, we introduce an automatic methodology to construct emulators for costly RTMs. The Automatic Gaussian Process Emulator (AGAPE) methodology combines the interpolation capabilities of GPs with the accurate design of an acquisition function that favours sampling in low density regions and flatness of the interpolation function. We illustrate the good capabilities of our emulators in toy examples, leaf and canopy levels PROSPECT and PROSAIL RTMs, and for the construction of an optimal look-up-table for atmospheric correction based on MODTRAN5.
Development of a methodology for classifying software errors
NASA Technical Reports Server (NTRS)
Gerhart, S. L.
1976-01-01
A mathematical formalization of the intuition behind classification of software errors is devised and then extended to a classification discipline: Every classification scheme should have an easily discernible mathematical structure and certain properties of the scheme should be decidable (although whether or not these properties hold is relative to the intended use of the scheme). Classification of errors then becomes an iterative process of generalization from actual errors to terms defining the errors together with adjustment of definitions according to the classification discipline. Alternatively, whenever possible, small scale models may be built to give more substance to the definitions. The classification discipline and the difficulties of definition are illustrated by examples of classification schemes from the literature and a new study of observed errors in published papers of programming methodologies.
Robust detection-isolation-accommodation for sensor failures
NASA Technical Reports Server (NTRS)
Weiss, J. L.; Pattipati, K. R.; Willsky, A. S.; Eterno, J. S.; Crawford, J. T.
1985-01-01
The results of a one year study to: (1) develop a theory for Robust Failure Detection and Identification (FDI) in the presence of model uncertainty, (2) develop a design methodology which utilizes the robust FDI ththeory, (3) apply the methodology to a sensor FDI problem for the F-100 jet engine, and (4) demonstrate the application of the theory to the evaluation of alternative FDI schemes are presented. Theoretical results in statistical discrimination are used to evaluate the robustness of residual signals (or parity relations) in terms of their usefulness for FDI. Furthermore, optimally robust parity relations are derived through the optimization of robustness metrics. The result is viewed as decentralization of the FDI process. A general structure for decentralized FDI is proposed and robustness metrics are used for determining various parameters of the algorithm.
Certify for success: A methodology for human-centered certification of advanced aviation systems
NASA Technical Reports Server (NTRS)
Small, Ronald L.; Rouse, William B.
1994-01-01
This position paper uses the methodology in Design for Success as a basis for a human factors certification program. The Design for Success (DFS) methodology espouses a multi-step process to designing and developing systems in a human-centered fashion. These steps are as follows: (1) naturalizing - understand stakeholders and their concerns; (2) marketing - understand market-oriented alternatives to meeting stakeholder concerns; (3) engineering - detailed design and development of the system considering tradeoffs between technology, cost, schedule, certification requirements, etc.; (4) system evaluation - determining if the system meets its goal(s); and (5) sales and service - delivering and maintaining the system. Because the main topic of this paper is certification, we will focus our attention on step 4, System Evaluation, since it is the natural precursor to certification. Evaluation involves testing the system and its parts for their correct behaviors. Certification focuses not only on ensuring that the system exhibits the correct behaviors, but ONLY the correct behaviors.
Introduction to SIMRAND: Simulation of research and development project
NASA Technical Reports Server (NTRS)
Miles, R. F., Jr.
1982-01-01
SIMRAND: SIMulation of Research ANd Development Projects is a methodology developed to aid the engineering and management decision process in the selection of the optimal set of systems or tasks to be funded on a research and development project. A project may have a set of systems or tasks under consideration for which the total cost exceeds the allocated budget. Other factors such as personnel and facilities may also enter as constraints. Thus the project's management must select, from among the complete set of systems or tasks under consideration, a partial set that satisfies all project constraints. The SIMRAND methodology uses analytical techniques and probability theory, decision analysis of management science, and computer simulation, in the selection of this optimal partial set. The SIMRAND methodology is truly a management tool. It initially specifies the information that must be generated by the engineers, thus providing information for the management direction of the engineers, and it ranks the alternatives according to the preferences of the decision makers.
NASA Astrophysics Data System (ADS)
Konovodov, V. V.; Valentov, A. V.; Kukhar, I. S.; Retyunskiy, O. Yu; Baraksanov, A. S.
2016-08-01
The work proposes the algorithm to calculate strength under alternating stresses using the developed methodology of building the diagram of limiting stresses. The overall safety factor is defined by the suggested formula. Strength calculations of components working under alternating stresses in the great majority of cases are conducted as the checking ones. It is primarily explained by the fact that the overall fatigue strength reduction factor (Kσg or Kτg) can only be chosen approximately during the component design as the engineer at this stage of work has just the approximate idea on the component size and shape.
Cottrell, Erika K; Hall, Jennifer D; Kautz, Glenn; Angier, Heather; Likumahuwa-Ackman, Sonja; Sisulak, Laura; Keller, Sara; Cameron, David C; DeVoe, Jennifer E; Cohen, Deborah J
Alternative payment models have been proposed as a way to facilitate patient-centered medical home model implementation, yet little is known about how payment reform translates into changes in care delivery. We conducted site visits, observed operations, and conducted interviews within 3 Federally Qualified Health Center organizations that were part of Oregon's Alternative Payment Methodology demonstration project. Data were analyzed using an immersion-crystallization approach. We identified several care delivery changes during the early stages of implementation, as well as challenges associated with this new model of payment. Future research is needed to further understand the implications of these changes.
Hermans, C.; Erickson, J.; Noordewier, T.; Sheldon, A.; Kline, M.
2007-01-01
Multicriteria decision analysis (MCDA) provides a well-established family of decision tools to aid stakeholder groups in arriving at collective decisions. MCDA can also function as a framework for the social learning process, serving as an educational aid in decision problems characterized by a high level of public participation. In this paper, the framework and results of a structured decision process using the outranking MCDA methodology preference ranking organization method of enrichment evaluation (PROMETHEE) are presented. PROMETHEE is used to frame multi-stakeholder discussions of river management alternatives for the Upper White River of Central Vermont, in the northeastern United States. Stakeholders met over 10 months to create a shared vision of an ideal river and its services to communities, develop a list of criteria by which to evaluate river management alternatives, and elicit preferences to rank and compare individual and group preferences. The MCDA procedure helped to frame a group process that made stakeholder preferences explicit and substantive discussions about long-term river management possible. ?? 2006 Elsevier Ltd. All rights reserved.
Hermans, Caroline; Erickson, Jon; Noordewier, Tom; Sheldon, Amy; Kline, Mike
2007-09-01
Multicriteria decision analysis (MCDA) provides a well-established family of decision tools to aid stakeholder groups in arriving at collective decisions. MCDA can also function as a framework for the social learning process, serving as an educational aid in decision problems characterized by a high level of public participation. In this paper, the framework and results of a structured decision process using the outranking MCDA methodology preference ranking organization method of enrichment evaluation (PROMETHEE) are presented. PROMETHEE is used to frame multi-stakeholder discussions of river management alternatives for the Upper White River of Central Vermont, in the northeastern United States. Stakeholders met over 10 months to create a shared vision of an ideal river and its services to communities, develop a list of criteria by which to evaluate river management alternatives, and elicit preferences to rank and compare individual and group preferences. The MCDA procedure helped to frame a group process that made stakeholder preferences explicit and substantive discussions about long-term river management possible.
Alternative Schools. Research Report 1976-3.
ERIC Educational Resources Information Center
National School Boards Association, Washington, DC.
During the 1970's, school districts throughout the country have created alternative educational programs allowing students and parents to choose the appropriate educational structure and methodology that suit their individual needs. School boards who establish alternative schools must assess, plan, design, effect, and evaluate educational programs…
Strategic decision making under climate change: a case study on Lake Maggiore water system
NASA Astrophysics Data System (ADS)
Micotti, M.; Soncini Sessa, R.; Weber, E.
2014-09-01
Water resources planning processes involve different kinds of decisions that are generally evaluated under a stationary climate scenario assumption. In general, the possible combinations of interventions are mutually compared as single alternatives. However, the ongoing climate change requires us to reconsider this approach. Indeed, what have to be compared are not individual alternatives, but families of alternatives, characterized by the same structural decisions, i.e. by actions that have long-term effects and entail irrevocable changes in the system. The rationale is that the structural actions, once they have been implemented, cannot be easily modified, while the management decisions can be adapted to the evolving conditions. This paper considers this methodological problem in a real case study, in which a strategic decision has to be taken: a new barrage was proposed to regulate Lake Maggiore outflow, but, alternatively, either the present barrage can be maintained with its present regulation norms or with a new one. The problem was dealt with by multi-criteria decision analysis involving many stakeholders and two decision-makers. An exhaustive set of indicators was defined in the participatory process, conducted under the integrated water resource management paradigm, and many efficient (in Pareto sense) regulation policies were identified. The paper explores different formulations of a global index to evaluate and compare the effectiveness of the classes of alternatives under both stationary and changing hydrological scenarios in order to assess their adaptability to the ongoing climate change.
Rocha, Joana; Coelho, Francisco J R C; Peixe, Luísa; Gomes, Newton C M; Calado, Ricardo
2014-11-11
For several years, knowledge on the microbiome associated with marine invertebrates was impaired by the challenges associated with the characterization of bacterial communities. With the advent of culture independent molecular tools it is possible to gain new insights on the diversity and richness of microorganisms associated with marine invertebrates. In the present study, we evaluated if different preservation and processing methodologies (prior to DNA extraction) can affect the bacterial diversity retrieved from snakelocks anemone Anemonia viridis. Denaturing gradient gel electrophoresis (DGGE) community fingerprints were used as proxy to determine the bacterial diversity retrieved (H'). Statistical analyses indicated that preservation significantly affects H'. The best approach to preserve and process A. viridis biomass for bacterial community fingerprint analysis was flash freezing in liquid nitrogen (preservation) followed by the use of a mechanical homogenizer (process), as it consistently yielded higher H'. Alternatively, biomass samples can be processed fresh followed by cell lyses using a mechanical homogenizer or mortar &pestle. The suitability of employing these two alternative procedures was further reinforced by the quantification of the 16S rRNA gene; no significant differences were recorded when comparing these two approaches and the use of liquid nitrogen followed by processing with a mechanical homogenizer.
Rocha, Joana; Coelho, Francisco J. R. C.; Peixe, Luísa; Gomes, Newton C. M.; Calado, Ricardo
2014-01-01
For several years, knowledge on the microbiome associated with marine invertebrates was impaired by the challenges associated with the characterization of bacterial communities. With the advent of culture independent molecular tools it is possible to gain new insights on the diversity and richness of microorganisms associated with marine invertebrates. In the present study, we evaluated if different preservation and processing methodologies (prior to DNA extraction) can affect the bacterial diversity retrieved from snakelocks anemone Anemonia viridis. Denaturing gradient gel electrophoresis (DGGE) community fingerprints were used as proxy to determine the bacterial diversity retrieved (H′). Statistical analyses indicated that preservation significantly affects H′. The best approach to preserve and process A. viridis biomass for bacterial community fingerprint analysis was flash freezing in liquid nitrogen (preservation) followed by the use of a mechanical homogenizer (process), as it consistently yielded higher H′. Alternatively, biomass samples can be processed fresh followed by cell lyses using a mechanical homogenizer or mortar & pestle. The suitability of employing these two alternative procedures was further reinforced by the quantification of the 16S rRNA gene; no significant differences were recorded when comparing these two approaches and the use of liquid nitrogen followed by processing with a mechanical homogenizer. PMID:25384534
Zischg, Jonatan; Goncalves, Mariana L R; Bacchin, Taneha Kuzniecow; Leonhardt, Günther; Viklander, Maria; van Timmeren, Arjan; Rauch, Wolfgang; Sitzenfrei, Robert
2017-09-01
In the urban water cycle, there are different ways of handling stormwater runoff. Traditional systems mainly rely on underground piped, sometimes named 'gray' infrastructure. New and so-called 'green/blue' ambitions aim for treating and conveying the runoff at the surface. Such concepts are mainly based on ground infiltration and temporal storage. In this work a methodology to create and compare different planning alternatives for stormwater handling on their pathways to a desired system state is presented. Investigations are made to assess the system performance and robustness when facing the deeply uncertain spatial and temporal developments in the future urban fabric, including impacts caused by climate change, urbanization and other disruptive events, like shifts in the network layout and interactions of 'gray' and 'green/blue' structures. With the Info-Gap robustness pathway method, three planning alternatives are evaluated to identify critical performance levels at different stages over time. This novel methodology is applied to a real case study problem where a city relocation process takes place during the upcoming decades. In this case study it is shown that hybrid systems including green infrastructures are more robust with respect to future uncertainties, compared to traditional network design.
Liquid by-products from fish canning industry as sustainable sources of ω3 lipids.
Monteiro, Ana; Paquincha, Diogo; Martins, Florinda; Queirós, Rui P; Saraiva, Jorge A; Švarc-Gajić, Jaroslava; Nastić, Nataša; Delerue-Matos, Cristina; Carvalho, Ana P
2018-08-01
Fish canning industry generates large amounts of liquid wastes, which are discarded, after proper treatment to remove the organic load. However, alternative treatment processes may also be designed in order to target the recovery of valuable compounds; with this procedure, these wastewaters are converted into liquid by-products, becoming an additional source of revenue for the company. This study evaluated green and economically sustainable methodologies for the extraction of ω3 lipids from fish canning liquid by-products. Lipids were extracted by processes combining physical and chemical parameters (conventional and pressurized extraction processes), as well as chemical and biological parameters. Furthermore, LCA was applied to evaluate the environmental performance and costs indicators for each process. Results indicated that extraction with high hydrostatic pressure provides the highest amounts of ω3 polyunsaturated fatty acids (3331,5 mg L -1 effluent), apart from presenting the lowest environmental impact and costs. The studied procedures allow to obtain alternative, sustainable and traceable sources of ω3 lipids for further applications in food, pharmaceutical and cosmetic industries. Additionally, such approach contributes towards the organic depuration of canning liquid effluents, therefore reducing the overall waste treatment costs. Copyright © 2018 Elsevier Ltd. All rights reserved.
Cunha, Edite; Pinto, Paula C A G; Saraiva, M Lúcia M F S
2015-08-15
An automated methodology is proposed for the evaluation of a set of ionic liquids (ILs) as alternative reaction media for aldolase based synthetic processes. For that, the effect of traditionally used organic solvents and ILs on the activity of aldolase was studied by means of a novel automated methodology. The implemented methodology is based on the concept of sequential injection analysis (SIA) and relies on the aldolase based cleavage of d-fructose-1,6 diphosphate (DFDP), to produce dihydroxyacetone phosphate (DHAP) and d-glyceraldehyde-3-phosphate (G3P). In the presence of FeCl3, 3-methyl-2-benzothiazoline hydrazine (MBTH) forms a blue cation that can be measured at 670nm, by combination with G3P. The influence of several parameters such as substrate and enzyme concentration, temperature, delay time and MBTH and FeCl3 concentration were studied and the optimum reaction conditions were subsequently selected. The developed methodology showed good precision and a relative standard deviation (rsd) that does not exceed 7% also leading to low reagents consumption as well as effluent production. Resorting to this strategy, the activity of the enzyme was studied in strictly aqueous media and in the presence of dimethylformamide, methanol, bmpyr [Cl], hmim [Cl], bmim [BF4], emim [BF4], emim [Ac], bmim [Cl], emim [TfMs], emim [Ms] and Chol [Ac] up to 50%. The results show that the utilization of ILs as reaction media for aldolase based organic synthesis might present potential advantages over the tested conventional organic solvents. The least toxic IL found in this study was cho [Ac] that causes a reduction of enzyme activity of only 2.7% when used in a concentration of 50%. Generally, it can be concluded that ILs based on choline or short alkyl imidazolium moieties associated with biocompatible anions are the most promising ILs regarding the future inclusion of these solvents in synthetic protocols catalyzed by aldolase. Copyright © 2015 Elsevier B.V. All rights reserved.
Baty, Florent; Klingbiel, Dirk; Zappa, Francesco; Brutsche, Martin
2015-12-01
Alternative splicing is an important component of tumorigenesis. Recent advent of exon array technology enables the detection of alternative splicing at a genome-wide scale. The analysis of high-throughput alternative splicing is not yet standard and methodological developments are still needed. We propose a novel statistical approach-Dually Constrained Correspondence Analysis-for the detection of splicing changes in exon array data. Using this methodology, we investigated the genome-wide alteration of alternative splicing in patients with non-small cell lung cancer treated by bevacizumab/erlotinib. Splicing candidates reveal a series of genes related to carcinogenesis (SFTPB), cell adhesion (STAB2, PCDH15, HABP2), tumor aggressiveness (ARNTL2), apoptosis, proliferation and differentiation (PDE4D, FLT3, IL1R2), cell invasion (ETV1), as well as tumor growth (OLFM4, FGF14), tumor necrosis (AFF3) or tumor suppression (TUSC3, CSMD1, RHOBTB2, SERPINB5), with indication of known alternative splicing in a majority of genes. DCCA facilitates the identification of putative biologically relevant alternative splicing events in high-throughput exon array data. Copyright © 2015 Elsevier Inc. All rights reserved.
Hayes, J E; McGreevy, P D; Forbes, S L; Laing, G; Stuetz, R M
2018-08-01
Detection dogs serve a plethora of roles within modern society, and are relied upon to identify threats such as explosives and narcotics. Despite their importance, research and training regarding detection dogs has involved ambiguity. This is partially due to the fact that the assessment of effectiveness regarding detection dogs continues to be entrenched within a traditional, non-scientific understanding. Furthermore, the capabilities of detection dogs are also based on their olfactory physiology and training methodologies, both of which are hampered by knowledge gaps. Additionally, the future of detection dogs is strongly influenced by welfare and social implications. Most importantly however, is the emergence of progressively inexpensive and efficacious analytical methodologies including gas chromatography related techniques, "e-noses", and capillary electrophoresis. These analytical methodologies provide both an alternative and assistor for the detection dog industry, however the interrelationship between these two detection paradigms requires clarification. These factors, when considering their relative contributions, illustrate a need to address research gaps, formalise the detection dog industry and research process, as well as take into consideration analytical methodologies and their influence on the future status of detection dogs. This review offers an integrated assessment of the factors involved in order to determine the current and future status of detection dogs. Copyright © 2018 Elsevier B.V. All rights reserved.
FCA Group LLC request to the EPA regarding greenhouse gas, off-cycle CO2 credits for High Efficiency Alternators used on 2009 and subsequent model year vehicles and off-cycle fuel consumption credits for 2017 and subsequent model year vehicles.
Analytical group decision making in natural resources: Methodology and application
Schmoldt, D.L.; Peterson, D.L.
2000-01-01
Group decision making is becoming increasingly important in natural resource management and associated scientific applications, because multiple values are treated coincidentally in time and space, multiple resource specialists are needed, and multiple stakeholders must be included in the decision process. Decades of social science research on decision making in groups have provided insights into the impediments to effective group processes and on techniques that can be applied in a group context. Nevertheless, little integration and few applications of these results have occurred in resource management decision processes, where formal groups are integral, either directly or indirectly. A group decision-making methodology is introduced as an effective approach for temporary, formal groups (e.g., workshops). It combines the following three components: (1) brainstorming to generate ideas; (2) the analytic hierarchy process to produce judgments, manage conflict, enable consensus, and plan for implementation; and (3) a discussion template (straw document). Resulting numerical assessments of alternative decision priorities can be analyzed statistically to indicate where group member agreement occurs and where priority values are significantly different. An application of this group process to fire research program development in a workshop setting indicates that the process helps focus group deliberations; mitigates groupthink, nondecision, and social loafing pitfalls; encourages individual interaction; identifies irrational judgments; and provides a large amount of useful quantitative information about group preferences. This approach can help facilitate scientific assessments and other decision-making processes in resource management.
Postoptimality Analysis in the Selection of Technology Portfolios
NASA Technical Reports Server (NTRS)
Adumitroaie, Virgil; Shelton, Kacie; Elfes, Alberto; Weisbin, Charles R.
2006-01-01
This slide presentation reviews a process of postoptimally analysing the selection of technology portfolios. The rationale for the analysis stems from the need for consistent, transparent and auditable decision making processes and tools. The methodology is used to assure that project investments are selected through an optimization of net mission value. The main intent of the analysis is to gauge the degree of confidence in the optimal solution and to provide the decision maker with an array of viable selection alternatives which take into account input uncertainties and possibly satisfy non-technical constraints. A few examples of the analysis are reviewed. The goal of the postoptimality study is to enhance and improve the decision-making process by providing additional qualifications and substitutes to the optimal solution.
Exploring Alternative Approaches to Methodology in Educational Research
ERIC Educational Resources Information Center
Niaz, Mansoor
2004-01-01
The objective of this study is to provide in-service teachers an opportunity to become familiar with the controversial nature of progress in science (growth of knowledge) and its implications for research methodology in education. The study is based on 41 participants who had registered for a nine-week course on Methodology of Investigation in…
Fuzzy multicriteria disposal method and site selection for municipal solid waste.
Ekmekçioğlu, Mehmet; Kaya, Tolga; Kahraman, Cengiz
2010-01-01
The use of fuzzy multiple criteria analysis (MCA) in solid waste management has the advantage of rendering subjective and implicit decision making more objective and analytical, with its ability to accommodate both quantitative and qualitative data. In this paper a modified fuzzy TOPSIS methodology is proposed for the selection of appropriate disposal method and site for municipal solid waste (MSW). Our method is superior to existing methods since it has capability of representing vague qualitative data and presenting all possible results with different degrees of membership. In the first stage of the proposed methodology, a set of criteria of cost, reliability, feasibility, pollution and emission levels, waste and energy recovery is optimized to determine the best MSW disposal method. Landfilling, composting, conventional incineration, and refuse-derived fuel (RDF) combustion are the alternatives considered. The weights of the selection criteria are determined by fuzzy pairwise comparison matrices of Analytic Hierarchy Process (AHP). It is found that RDF combustion is the best disposal method alternative for Istanbul. In the second stage, the same methodology is used to determine the optimum RDF combustion plant location using adjacent land use, climate, road access and cost as the criteria. The results of this study illustrate the importance of the weights on the various factors in deciding the optimized location, with the best site located in Catalca. A sensitivity analysis is also conducted to monitor how sensitive our model is to changes in the various criteria weights. 2010 Elsevier Ltd. All rights reserved.
Alyaseri, Isam; Zhou, Jianpeng
2017-03-01
The aim of this study is to use the life cycle assessment method to measure the environmental performance of the sludge incineration process in a wastewater treatment plant and to propose an alternative that can reduce the environmental impact. To show the damages caused by the treatment processes, the study aimed to use an endpoint approach in evaluating impacts on human health, ecosystem quality, and resources due to the processes. A case study was taken at Bissell Point Wastewater Treatment Plant in Saint Louis, Missouri, U.S. The plant-specific data along with literature data from technical publications were used to build an inventory, and then analyzed the environmental burdens from sludge handling unit in the year 2011. The impact assessment method chosen was ReCipe 2008. The existing scenario (dewatering-multiple hearth incineration-ash to landfill) was evaluated and three alternative scenarios (fluid bed incineration and anaerobic digestion with and without land application) with energy recovery from heat or biogas were proposed and analyzed to find the one with the least environmental impact. The existing scenario shows that the most significant impacts are related to depletion in resources and damage to human health. These impacts mainly came from the operation phase (electricity and fuel consumption and emissions related to combustion). Alternatives showed better performance than the existing scenario. Using ReCipe endpoint methodology, and among the three alternatives tested, the anaerobic digestion had the best overall environmental performance. It is recommended to convert to fluid bed incineration if the concerns were more about human health or to anaerobic digestion if the concerns were more about depletion in resources. The endpoint approach may simplify the outcomes of this study as follows: if the plant is converted to fluid bed incineration, it could prevent an average of 43.2 DALYs in human life, save 0.059 species in the area from extinction, and make a 62% reduction in the plant's current expenses needed by future generations to extract resources per year. At the same time it may prevent 36.1 DALYs in humans, save 0.157 species, and make a 101% reduction in current expenses on resources per year, if converting to anaerobic digestion.
Sela, Itamar; Ashkenazy, Haim; Katoh, Kazutaka; Pupko, Tal
2015-07-01
Inference of multiple sequence alignments (MSAs) is a critical part of phylogenetic and comparative genomics studies. However, from the same set of sequences different MSAs are often inferred, depending on the methodologies used and the assumed parameters. Much effort has recently been devoted to improving the ability to identify unreliable alignment regions. Detecting such unreliable regions was previously shown to be important for downstream analyses relying on MSAs, such as the detection of positive selection. Here we developed GUIDANCE2, a new integrative methodology that accounts for: (i) uncertainty in the process of indel formation, (ii) uncertainty in the assumed guide tree and (iii) co-optimal solutions in the pairwise alignments, used as building blocks in progressive alignment algorithms. We compared GUIDANCE2 with seven methodologies to detect unreliable MSA regions using extensive simulations and empirical benchmarks. We show that GUIDANCE2 outperforms all previously developed methodologies. Furthermore, GUIDANCE2 also provides a set of alternative MSAs which can be useful for downstream analyses. The novel algorithm is implemented as a web-server, available at: http://guidance.tau.ac.il. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Sailaukhanuly, Yerbolat; Zhakupbekova, Arai; Amutova, Farida; Carlsen, Lars
2013-01-01
Knowledge of the environmental behavior of chemicals is a fundamental part of the risk assessment process. The present paper discusses various methods of ranking of a series of persistent organic pollutants (POPs) according to the persistence, bioaccumulation and toxicity (PBT) characteristics. Traditionally ranking has been done as an absolute (total) ranking applying various multicriteria data analysis methods like simple additive ranking (SAR) or various utility functions (UFs) based rankings. An attractive alternative to these ranking methodologies appears to be partial order ranking (POR). The present paper compares different ranking methods like SAR, UF and POR. Significant discrepancies between the rankings are noted and it is concluded that partial order ranking, as a method without any pre-assumptions concerning possible relation between the single parameters, appears as the most attractive ranking methodology. In addition to the initial ranking partial order methodology offers a wide variety of analytical tools to elucidate the interplay between the objects to be ranked and the ranking parameters. In the present study is included an analysis of the relative importance of the single P, B and T parameters. Copyright © 2012 Elsevier Ltd. All rights reserved.
Investigating patients' experiences: methodological usefulness of interpretive interactionism.
Tower, Marion; Rowe, Jennifer; Wallis, Marianne
2012-01-01
To demonstrate the methodological usefulness of interpretive interactionism by applying it to the example of a study investigating healthcare experiences of women affected by domestic violence. Understanding patients' experiences of health, illness and health care is important to nurses. For many years, biomedical discourse has prevailed in healthcare language and research, and has influenced healthcare responses. Contemporary nursing scholarship can be developed by engaging with new ways of understanding therapeutic interactions with patients. Research that uses qualitative methods of inquiry is an important paradigm for nurses who seek to explain and understand or describe experiences rather than predict outcomes. Interpretive interactionism is an interpretive form of inquiry for conducting studies of social or personal problems that have healthcare policy implications. It puts the patient at the centre of the research process and makes visible the experiences of patients as they interact with the healthcare and social systems that surround them. Interpretive interactionism draws on concepts of symbolic interactionism, phenomenology and hermeneutics. Interpretive interactionism is a patient-centred methodology that provides an alternative way of understanding patients' experiences. It can contribute to policy and practice development by drawing on the perspectives and experiences of patients, who are central to the research process. It also allows research findings to be situated in and linked to healthcare policy, professional ethics and organisational approaches to care. Interpretive interactionism has methodological utility because it can contribute to policy and practice development by drawing on the perspectives and experiences of patients who are central to the research process. Interpretive interactionism allows research findings to be situated in and linked to health policy, professional ethics and organisational approaches to caring.
Popova, Daria; Stonier, Adam; Pain, David; Titchener‐Hooker, Nigel J.
2016-01-01
Abstract Increases in mammalian cell culture titres and densities have placed significant demands on primary recovery operation performance. This article presents a methodology which aims to screen rapidly and evaluate primary recovery technologies for their scope for technically feasible and cost‐effective operation in the context of high cell density mammalian cell cultures. It was applied to assess the performance of current (centrifugation and depth filtration options) and alternative (tangential flow filtration (TFF)) primary recovery strategies. Cell culture test materials (CCTM) were generated to simulate the most demanding cell culture conditions selected as a screening challenge for the technologies. The performance of these technology options was assessed using lab scale and ultra scale‐down (USD) mimics requiring 25–110mL volumes for centrifugation and depth filtration and TFF screening experiments respectively. A centrifugation and depth filtration combination as well as both of the alternative technologies met the performance selection criteria. A detailed process economics evaluation was carried out at three scales of manufacturing (2,000L, 10,000L, 20,000L), where alternative primary recovery options were shown to potentially provide a more cost‐effective primary recovery process in the future. This assessment process and the study results can aid technology selection to identify the most effective option for a specific scenario. PMID:27067803
NASA Technical Reports Server (NTRS)
Miles, R. F., Jr.
1986-01-01
A research and development (R&D) project often involves a number of decisions that must be made concerning which subset of systems or tasks are to be undertaken to achieve the goal of the R&D project. To help in this decision making, SIMRAND (SIMulation of Research ANd Development Projects) is a methodology for the selection of the optimal subset of systems or tasks to be undertaken on an R&D project. Using alternative networks, the SIMRAND methodology models the alternative subsets of systems or tasks under consideration. Each path through an alternative network represents one way of satisfying the project goals. Equations are developed that relate the system or task variables to the measure of reference. Uncertainty is incorporated by treating the variables of the equations probabilistically as random variables, with cumulative distribution functions assessed by technical experts. Analytical techniques of probability theory are used to reduce the complexity of the alternative networks. Cardinal utility functions over the measure of preference are assessed for the decision makers. A run of the SIMRAND Computer I Program combines, in a Monte Carlo simulation model, the network structure, the equations, the cumulative distribution functions, and the utility functions.
A Science and Risk-Based Pragmatic Methodology for Blend and Content Uniformity Assessment.
Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Doshi, Chetan
2018-04-01
This paper describes a pragmatic approach that can be applied in assessing powder blend and unit dosage uniformity of solid dose products at Process Design, Process Performance Qualification, and Continued/Ongoing Process Verification stages of the Process Validation lifecycle. The statistically based sampling, testing, and assessment plan was developed due to the withdrawal of the FDA draft guidance for industry "Powder Blends and Finished Dosage Units-Stratified In-Process Dosage Unit Sampling and Assessment." This paper compares the proposed Grouped Area Variance Estimate (GAVE) method with an alternate approach outlining the practicality and statistical rationalization using traditional sampling and analytical methods. The approach is designed to fit solid dose processes assuring high statistical confidence in both powder blend uniformity and dosage unit uniformity during all three stages of the lifecycle complying with ASTM standards as recommended by the US FDA.
NASA Astrophysics Data System (ADS)
Sanhouse-García, Antonio J.; Rangel-Peraza, Jesús Gabriel; Bustos-Terrones, Yaneth; García-Ferrer, Alfonso; Mesas-Carrascosa, Francisco J.
2016-02-01
Land cover classification is often based on different characteristics between their classes, but with great homogeneity within each one of them. This cover is obtained through field work or by mean of processing satellite images. Field work involves high costs; therefore, digital image processing techniques have become an important alternative to perform this task. However, in some developing countries and particularly in Casacoima municipality in Venezuela, there is a lack of geographic information systems due to the lack of updated information and high costs in software license acquisition. This research proposes a low cost methodology to develop thematic mapping of local land use and types of coverage in areas with scarce resources. Thematic mapping was developed from CBERS-2 images and spatial information available on the network using open source tools. The supervised classification method per pixel and per region was applied using different classification algorithms and comparing them among themselves. Classification method per pixel was based on Maxver algorithms (maximum likelihood) and Euclidean distance (minimum distance), while per region classification was based on the Bhattacharya algorithm. Satisfactory results were obtained from per region classification, where overall reliability of 83.93% and kappa index of 0.81% were observed. Maxver algorithm showed a reliability value of 73.36% and kappa index 0.69%, while Euclidean distance obtained values of 67.17% and 0.61% for reliability and kappa index, respectively. It was demonstrated that the proposed methodology was very useful in cartographic processing and updating, which in turn serve as a support to develop management plans and land management. Hence, open source tools showed to be an economically viable alternative not only for forestry organizations, but for the general public, allowing them to develop projects in economically depressed and/or environmentally threatened areas.
Scrum Methodology in Higher Education: Innovation in Teaching, Learning and Assessment
ERIC Educational Resources Information Center
Jurado-Navas, Antonio; Munoz-Luna, Rosa
2017-01-01
The present paper aims to detail the experience developed in a classroom of English Studies from the Spanish University of Málaga, where an alternative project-based learning methodology has been implemented. Such methodology is inspired by scrum sessions widely extended in technological companies where staff members work in teams and are assigned…
We tested two methods for dataset generation and model construction, and three tree-classifier variants to identify the most parsimonious and thematically accurate mapping methodology for the SW ReGAP project. Competing methodologies were tested in the East Great Basin mapping un...
The ALMA CONOPS project: the impact of funding decisions on observatory performance
NASA Astrophysics Data System (ADS)
Ibsen, Jorge; Hibbard, John; Filippi, Giorgio
2014-08-01
In time when every penny counts, many organizations are facing the question of how much scientific impact a budget cut can have or, putting it in more general terms, which is the science impact of alternative (less costly) operational modes. In reply to such question posted by the governing bodies, the ALMA project had to develop a methodology (ALMA Concepts for Operations, CONOPS) that attempts to measure the impact that alternative operational scenarios may have on the overall scientific production of the Observatory. Although the analysis and the results are ALMA specific, the developed approach is rather general and provides a methodology for a cost-performance analysis of alternatives before any radical alterations to the operations model are adopted. This paper describes the key aspects of the methodology: a) the definition of the Figures of Merit (FoMs) for the assessment of quantitative science performance impacts as well as qualitative impacts, and presents a methodology using these FoMs to evaluate the cost and impact of the different operational scenarios; b) the definition of a REFERENCE operational baseline; c) the identification of Alternative Scenarios each replacing one or more concepts in the REFERENCE by a different concept that has a lower cost and some level of scientific and/or operational impact; d) the use of a Cost-Performance plane to graphically combine the effects that the alternative scenarios can have in terms of cost reduction and affected performance. Although is a firstorder assessment, we believe this approach is useful for comparing different operational models and to understand the cost performance impact of these choices. This can be used to take decision to meet budget cuts as well as in evaluating possible new emergent opportunities.
Sonic Boom Mitigation Through Aircraft Design and Adjoint Methodology
NASA Technical Reports Server (NTRS)
Rallabhandi, Siriam K.; Diskin, Boris; Nielsen, Eric J.
2012-01-01
This paper presents a novel approach to design of the supersonic aircraft outer mold line (OML) by optimizing the A-weighted loudness of sonic boom signature predicted on the ground. The optimization process uses the sensitivity information obtained by coupling the discrete adjoint formulations for the augmented Burgers Equation and Computational Fluid Dynamics (CFD) equations. This coupled formulation links the loudness of the ground boom signature to the aircraft geometry thus allowing efficient shape optimization for the purpose of minimizing the impact of loudness. The accuracy of the adjoint-based sensitivities is verified against sensitivities obtained using an independent complex-variable approach. The adjoint based optimization methodology is applied to a configuration previously optimized using alternative state of the art optimization methods and produces additional loudness reduction. The results of the optimizations are reported and discussed.
NASA Technical Reports Server (NTRS)
Kizhner, Semion; Miko, Joseph; Bradley, Damon; Heinzen, Katherine
2008-01-01
NASA Hubble Space Telescope (HST) and upcoming cosmology science missions carry instruments with multiple focal planes populated with many large sensor detector arrays. These sensors are passively cooled to low temperatures for low-level light (L3) and near-infrared (NIR) signal detection, and the sensor readout electronics circuitry must perform at extremely low noise levels to enable new required science measurements. Because we are at the technological edge of enhanced performance for sensors and readout electronics circuitry, as determined by thermal noise level at given temperature in analog domain, we must find new ways of further compensating for the noise in the signal digital domain. To facilitate this new approach, state-of-the-art sensors are augmented at their array hardware boundaries by non-illuminated reference pixels, which can be used to reduce noise attributed to sensors. There are a few proposed methodologies of processing in the digital domain the information carried by reference pixels, as employed by the Hubble Space Telescope and the James Webb Space Telescope Projects. These methods involve using spatial and temporal statistical parameters derived from boundary reference pixel information to enhance the active (non-reference) pixel signals. To make a step beyond this heritage methodology, we apply the NASA-developed technology known as the Hilbert- Huang Transform Data Processing System (HHT-DPS) for reference pixel information processing and its utilization in reconfigurable hardware on-board a spaceflight instrument or post-processing on the ground. The methodology examines signal processing for a 2-D domain, in which high-variance components of the thermal noise are carried by both active and reference pixels, similar to that in processing of low-voltage differential signals and subtraction of a single analog reference pixel from all active pixels on the sensor. Heritage methods using the aforementioned statistical parameters in the digital domain (such as statistical averaging of the reference pixels themselves) zeroes out the high-variance components, and the counterpart components in the active pixels remain uncorrected. This paper describes how the new methodology was demonstrated through analysis of fast-varying noise components using the Hilbert-Huang Transform Data Processing System tool (HHT-DPS) developed at NASA and the high-level programming language MATLAB (Trademark of MathWorks Inc.), as well as alternative methods for correcting for the high-variance noise component, using an HgCdTe sensor data. The NASA Hubble Space Telescope data post-processing, as well as future deep-space cosmology projects on-board instrument data processing from all the sensor channels, would benefit from this effort.
Design methodology for integrated downstream separation systems in an ethanol biorefinery
NASA Astrophysics Data System (ADS)
Mohammadzadeh Rohani, Navid
Energy security and environmental concerns have been the main drivers for a historic shift to biofuel production in transportation fuel industry. Biofuels should not only offer environmental advantages over the petroleum fuels they replace but also should be economically sustainable and viable. The so-called second generation biofuels such as ethanol which is the most produced biofuel are mostly derived from lignocellulosic biomasses. These biofuels are more difficult to produce than the first generation ones mainly due to recalcitrance of the feedstocks in extracting their sugar contents. Costly pre-treatment and fractionation stages are required to break down lignocellulosic feedstocks into their constituent elements. On the other hand the mixture produced in fermentation step in a biorefinery contains very low amount of product which makes the subsequent separation step more difficult and more energy consuming. In an ethanol biorefinery, the dilute fermentation broth requires huge operating cost in downstream separation for recovery of the product in a conventional distillation technique. Moreover, the non-ideal nature of ethanol-water mixture which forms an iseotrope at almost 95 wt%, hinders the attainment of the fuel grade ethanol (99.5 wt%). Therefore, an additional dehydration stage is necessary to purify the ethanol from its azeotropic composition to fuel-grade purity. In order to overcome the constraint pertaining to vapor-liquid equilibrium of ethanol-water separation, several techniques have been investigated and proposed in the industry. These techniques such as membrane-based technologies, extraction and etc. have not only sought to produce a pure fuel-grade ethanol but have also aimed at decreasing the energy consumption of this energy-intensive separation. Decreasing the energy consumption of an ethanol biorefinery is of paramount importance in improving its overall economics and in facilitating the way to displacing petroleum transportation fuel and obtaining energy security. On the other hand, Process Integration (PI) as defined by Natural Resource Canada as the combination of activities which aim at improving process systems, their unit operations and their interactions in order to maximize the efficiency of using water, energy and raw materials can also help biorefineries lower their energy consumptions and improve their economics. Energy integration techniques such as pinch analysis adopted by different industries over the years have ensured using heat sources within a plant to supply the demand internally and decrease the external utility consumption. Therefore, adopting energy integration can be one of the ways biorefinery technology owners can consider in their process development as well as their business model in order to improve their overall economics. The objective of this thesis is to propose a methodology for designing integrated downstream separation in a biorefinery. This methodology is tested in an ethanol biorefinery case study. Several alternative separation techniques are evaluated in their energy consumption and economics in three different scenarios; stand-alone without energy integration, stand-alone with internal energy integration and integrated-with Kraft. The energy consumptions and capital costs of separation techniques are assessed in each scenario and the cost and benefit of integration are determined and finally the best alternative is found through techno-economic metrics. Another advantage of this methodology is the use of a graphical tool which provides insights on decreasing energy consumption by modifying the process condition. The pivot point of this work is the use of a novel energy integration method called Bridge analysis. This systematic method which originally is intended for retrofit situation is used here for integration with Kraft process. Integration potentials are identified through this method and savings are presented for each design. In stand-alone with internal integration scenario, the conventional pinch method is used for energy analysis. The results reveal the importance of energy integration in reducing energy consumption. They also show that in an ethanol biorefinery, by adopting energy integration in the conventional distillation separation, we can achieve greater energy saving compared to other alternative techniques. This in turn suggests that new alternative technologies which imply big risks for the company might not be an option for reducing the energy consumption as long as an internal and external integration is incorporated in the business model of an ethanol biorefinery. It is also noteworthy that the methodology developed in this work can be extended as a future work to include a whole biorefinery system. (Abstract shortened by UMI.).
Jacobo-Velázquez, D A; Ramos-Parra, P A; Hernández-Brenes, C
2010-08-01
High hydrostatic pressure (HHP) pasteurized and refrigerated avocado and mango pulps contain lower microbial counts and thus are safer and acceptable for human consumption for a longer period of time, when compared to fresh unprocessed pulps. However, during their commercial shelf life, changes in their sensory characteristics take place and eventually produce the rejection of these products by consumers. Therefore, in the present study, the use of sensory evaluation was proposed for the shelf-life determinations of HHP-processed avocado and mango pulps. The study focused on evaluating the feasibility of applying survival analysis methodology to the data generated by consumers in order to determine the sensory shelf lives of both HHP-treated pulps of avocado and mango. Survival analysis proved to be an effective methodology for the estimation of the sensory shelf life of avocado and mango pulps processed with HHP, with potential application for other pressurized products. Practical Application: At present, HHP processing is one of the most effective alternatives for the commercial nonthermal pasteurization of fresh tropical fruits. HHP processing improves the microbial stability of the fruit pulps significantly; however, the products continue to deteriorate during their refrigerated storage mainly due to the action of residual detrimental enzymes. This article proposes the application of survival analysis methodology for the determination of the sensory shelf life of HHP-treated avocado and mango pulps. Results demonstrated that the procedure appears to be simple and practical for the sensory shelf-life determination of HHP-treated foods when their main mode of failure is not caused by increases in microbiological counts that can affect human health.
DOT National Transportation Integrated Search
2015-09-01
This report describes an Alternative Fuel Transportation Optimization Tool (AFTOT), developed by the U.S. Department of Transportation (DOT) Volpe National Transportation Systems Center (Volpe) in support of the Federal Aviation Administration (FAA)....
Ng, Stella L
2013-05-01
The discipline of audiology has the opportunity to embark on research in education from an informed perspective, learning from professions that began this journey decades ago. The goal of this article is to position our discipline as a new member in the academic field of health professional education (HPE), with much to learn and contribute. In this article, I discuss the need for theory in informing HPE research. I also stress the importance of balancing our research goals by selecting appropriate methodologies for relevant research questions, to ensure that we respect the complexity of social processes inherent in HPE. Examples of relevant research questions are used to illustrate the need to consider alternative methodologies and to rethink the traditional hierarchy of evidence. I also provide an example of the thought processes and decisions that informed the design of an educational research study using a constructivist grounded theory methodology. As audiology enters the scholarly field of HPE, we need to arm ourselves with some of the knowledge and perspective that informs the field. Thus, we need to broaden our conceptions of what we consider to be appropriate styles of academic writing, relevant research questions, and valid evidence. Also, if we are to embark on qualitative inquiry into audiology education (or other audiology topics), we need to ensure that we conduct this research with an adequate understanding of the theories and methodologies informing such approaches. We must strive to conduct high quality, rigorous qualitative research more often than uninformed, generic qualitative research. These goals are imperative to the advancement of the theoretical landscape of audiology education and evolving the place of audiology in the field of HPE. American Academy of Audiology.
ERIC Educational Resources Information Center
Miller, Elizabeth R.
2013-01-01
Alternative schools educate students who have previously been unsuccessful in the traditional school setting. Many alternative school students are behind on high school credits, and the schools provide options for credit recovery. Computer-assisted instruction is often used for this purpose. Using case study methodology and a critical theoretical…
Study of jojoba oil aging by FTIR.
Le Dréau, Y; Dupuy, N; Gaydou, V; Joachim, J; Kister, J
2009-05-29
As the jojoba oil was used in cosmetic, pharmaceutical, dietetic food, animal feeding, lubrication, polishing and bio-diesel fields, it was important to study its aging at high temperature by oxidative process. In this work a FT-MIR methodology was developed for monitoring accelerate oxidative degradation of jojoba oils. Principal component analysis (PCA) was used to differentiate various samples according to their origin and obtaining process, and to differentiate oxidative conditions applied on oils. Two spectroscopic indices were calculated to report simply the oxidation phenomenon. Results were confirmed and deepened by multivariate curve resolution-alternative least square method (MCR-ALS). It allowed identifying chemical species produced or degraded during the thermal treatment according to a SIMPLISMA pretreatment.
Fuzzy approaches to supplier selection problem
NASA Astrophysics Data System (ADS)
Ozkok, Beyza Ahlatcioglu; Kocken, Hale Gonce
2013-09-01
Supplier selection problem is a multi-criteria decision making problem which includes both qualitative and quantitative factors. In the selection process many criteria may conflict with each other, therefore decision-making process becomes complicated. In this study, we handled the supplier selection problem under uncertainty. In this context; we used minimum criterion, arithmetic mean criterion, regret criterion, optimistic criterion, geometric mean and harmonic mean. The membership functions created with the help of the characteristics of used criteria, and we tried to provide consistent supplier selection decisions by using these memberships for evaluating alternative suppliers. During the analysis, no need to use expert opinion is a strong aspect of the methodology used in the decision-making.
Leung, Janet T Y; Shek, Daniel T L
2011-01-01
This paper examines the use of quantitative and qualitative approaches to study the impact of economic disadvantage on family processes and adolescent development. Quantitative research has the merits of objectivity, good predictive and explanatory power, parsimony, precision and sophistication of analysis. Qualitative research, in contrast, provides a detailed, holistic, in-depth understanding of social reality and allows illumination of new insights. With the pragmatic considerations of methodological appropriateness, design flexibility, and situational responsiveness in responding to the research inquiry, a mixed methods approach could be a possibility of integrating quantitative and qualitative approaches and offers an alternative strategy to study the impact of economic disadvantage on family processes and adolescent development.
Highway User Benefit Analysis System Research Project #128
DOT National Transportation Integrated Search
2000-10-01
In this research, a methodology for estimating road user costs of various competing alternatives was developed. Also, software was developed to calculate the road user cost, perform economic analysis and update cost tables. The methodology is based o...
Researcher / Researched: Repositioning Research Paradigms
ERIC Educational Resources Information Center
Meerwald, Agnes May Lin
2013-01-01
"Researcher / Researched" calls for a complementary research methodology by proposing autoethnography as both a method and text that crosses the boundaries of conventional and alternative methodologies in higher education. Autoethnography rearticulates the researcher / researched positions by blurring the boundary between them. This…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fraile-Garcia, Esteban, E-mail: esteban.fraile@unirioja.es; Ferreiro-Cabello, Javier, E-mail: javier.ferreiro@unirioja.es; Qualiberica S.L.
The European Committee for Standardization (CEN) through its Technical Committee CEN/TC-350 is developing a series of standards for assessing the building sustainability, at both product and building levels. The practical application of the selection (decision making) of structural alternatives made by one-way slabs leads to an intermediate level between the product and the building. Thus the present study addresses this problem of decision making, following the CEN guidelines and incorporating relevant aspects of architectural design into residential construction. A life cycle assessment (LCA) is developed in order to obtain valid information for the decision making process (the LCA was developedmore » applying CML methodology although Ecoindicator99 was used in order to facilitate the comparison of the values); this information (the carbon footprint values) is contrasted with other databases and with the information from the Environmental Product Declaration (EPD) of one of the lightening materials (expanded polystyrene), in order to validate the results. Solutions of different column disposition and geometries are evaluated in the three pillars of sustainable construction on residential construction: social, economic and environmental. The quantitative analysis of the variables used in this study enables and facilitates an objective comparison in the design stage by a responsible technician; the application of the proposed methodology reduces the possible solutions to be evaluated by the expert to 12.22% of the options in the case of low values of the column index and to 26.67% for the highest values. - Highlights: • Methodology for selection of structural alternatives in buildings with one-way slabs • Adapted to CEN guidelines (CEN/TC-350) for assessing the building sustainability • LCA is developed in order to obtain valid information for the decision making process. • Results validated comparing carbon footprint, databases and Env. Product Declarations • The proposal reduces the solutions to be evaluated to between 12.22 and 26.67%.« less
NASA Technical Reports Server (NTRS)
Selvaduray, Guna; Lomax, Curtis
1991-01-01
Fusible heat sinks are a possible source for thermal regulation of space suited astronauts. An extensive database search was undertaken to identify candidate materials with liquid solid transformations over the temperature range of -18 C to 5 C; and 1215 candidates were identified. Based on available data, 59 candidate materials with thermal storage capability, DeltaH values higher than that of water were identified. This paper presents the methodology utilized in the study, including the decision process used for materials selection.
A comprehensive safety assessment methodology for innovative geometric designs.
DOT National Transportation Integrated Search
2017-05-01
As the population grows and travel demands increase, alternative interchange designs have become increasingly popular. The diverging diamond interchange is an alternative design that has been implemented in the United States. This design can accommod...
Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S
2015-03-02
A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor followed by measurement of the luminescence of bacteria. The assays were conducted for contact times of 5, 15, and 30 min, by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided a precise control of the reaction conditions which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri and the results were compared with those provided by a conventional assay kit (Biotox(®)). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions displayed higher impact on V. fischeri, evidenced by lower EC50. The proposed methodology was validated through statistical analysis which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested for more classes of compounds and used as alternative to microplate based V. fischeri assay kits. Copyright © 2014 Elsevier B.V. All rights reserved.
Alternative Methods of Base Level Demand Forecasting for Economic Order Quantity Items,
1975-12-01
Note .. . . . . . . . . . . . . . . . . . . . . . . . 21 AdaptivC Single Exponential Smooti-ing ........ 21 Choosing the Smoothiing Constant... methodology used in the study, an analysis of results, .And a detailed summary. Chapter I. Methodology , contains a description o the data, a...Chapter IV. Detailed Summary, presents a detailed summary of the findings, lists the limitations inherent in the 7’" research methodology , and
A Methodological Conundrum: Comparing Schools in Scotland and England
ERIC Educational Resources Information Center
Marshall, Bethan; Gibbons, Simon
2015-01-01
This article considers a conundrum in research methodology; the fact that, in the main, you have to use a social science-based research methodology if you want to look at what goes on in a classroom. This article proposes an alternative arts-based research method instead based on the work of Eisner, and before him Dewey, where one can use the more…
Parametric evaluation of the cost effectiveness of Shuttle payload vibroacoustic test plans
NASA Technical Reports Server (NTRS)
Stahle, C. V.; Gongloff, H. R.; Keegan, W. B.; Young, J. P.
1978-01-01
Consideration is given to alternate vibroacoustic test plans for sortie and free flyer Shuttle payloads. Statistical decision models for nine test plans provide a viable method of evaluating the cost effectiveness of alternate vibroacoustic test plans and the associated test levels. The methodology is a major step toward the development of a useful tool for the quantitative tailoring of vibroacoustic test programs to sortie and free flyer payloads. A broader application of the methodology is now possible by the use of the OCTAVE computer code.
Prioritization Methodology for Chemical Replacement
NASA Technical Reports Server (NTRS)
Cruit, W.; Schutzenhofer, S.; Goldberg, B.; Everhart, K.
1993-01-01
This project serves to define an appropriate methodology for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weigh the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results are being implemented as a guideline for consideration for current NASA propulsion systems.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-30
... Trustees) establish a panel of technical experts to review the methods used in the HI and SMI annual... care, and alternate projection methodologies. The panel may also examine other methodological issues...
ALTERNATIVES TO DUPLICATE DIET METHODOLOGY
Duplicate Diet (DD) methodology has been used to collect information about the dietary exposure component in the context of total exposure studies. DD methods have been used to characterize the dietary exposure component in the NHEXAS pilot studies. NERL desired to evaluate it...
Shapira, Aviad; Shoshany, Maxim; Nir-Goldenberg, Sigal
2013-07-01
Environmental management and planning are instrumental in resolving conflicts arising between societal needs for economic development on the one hand and for open green landscapes on the other hand. Allocating green corridors between fragmented core green areas may provide a partial solution to these conflicts. Decisions regarding green corridor development require the assessment of alternative allocations based on multiple criteria evaluations. Analytical Hierarchy Process provides a methodology for both a structured and consistent extraction of such evaluations and for the search for consensus among experts regarding weights assigned to the different criteria. Implementing this methodology using 15 Israeli experts-landscape architects, regional planners, and geographers-revealed inherent differences in expert opinions in this field beyond professional divisions. The use of Agglomerative Hierarchical Clustering allowed to identify clusters representing common decisions regarding criterion weights. Aggregating the evaluations of these clusters revealed an important dichotomy between a pragmatist approach that emphasizes the weight of statutory criteria and an ecological approach that emphasizes the role of the natural conditions in allocating green landscape corridors.
Peptide biomarkers as a way to determine meat authenticity.
Sentandreu, Miguel Angel; Sentandreu, Enrique
2011-11-01
Meat fraud implies many illegal procedures affecting the composition of meat and meat products, something that is commonly done with the aim to increase profit. These practices need to be controlled by legal authorities by means of robust, accurate and sensitive methodologies capable to assure that fraudulent or accidental mislabelling does not arise. Common strategies traditionally used to assess meat authenticity have been based on methods such as chemometric analysis of a large set of data analysis, immunoassays or DNA analysis. The identification of peptide biomarkers specific of a particular meat species, tissue or ingredient by proteomic technologies constitutes an interesting and promising alternative to existing methodologies due to its high discriminating power, robustness and sensitivity. The possibility to develop standardized protein extraction protocols, together with the considerably higher resistance of peptide sequences to food processing as compared to DNA sequences, would overcome some of the limitations currently existing for quantitative determinations of highly processed food samples. The use of routine mass spectrometry equipment would make the technology suitable for control laboratories. Copyright © 2011 Elsevier Ltd. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia-Montero, Luis G., E-mail: luisgonzaga.garcia@upm.e; Lopez, Elena, E-mail: elopez@caminos.upm.e; Monzon, Andres, E-mail: amonzon@caminos.upm.e
Most Strategic Environmental Assessment (SEA) research has been concerned with SEA as a procedure, and there have been relatively few developments and tests of analytical methodologies. The first stage of the SEA is the 'screening', which is the process whereby a decision is taken on whether or not SEA is required for a particular programme or plan. The effectiveness of screening and SEA procedures will depend on how well the assessment fits into the planning from the early stages of the decision-making process. However, it is difficult to prepare the environmental screening for an infrastructure plan involving a whole country.more » To be useful, such methodologies must be fast and simple. We have developed two screening tools which would make it possible to estimate promptly the overall impact an infrastructure plan might have on biodiversity and global warming for a whole country, in order to generate planning alternatives, and to determine whether or not SEA is required for a particular infrastructure plan.« less
NASA Astrophysics Data System (ADS)
Shapira, Aviad; Shoshany, Maxim; Nir-Goldenberg, Sigal
2013-07-01
Environmental management and planning are instrumental in resolving conflicts arising between societal needs for economic development on the one hand and for open green landscapes on the other hand. Allocating green corridors between fragmented core green areas may provide a partial solution to these conflicts. Decisions regarding green corridor development require the assessment of alternative allocations based on multiple criteria evaluations. Analytical Hierarchy Process provides a methodology for both a structured and consistent extraction of such evaluations and for the search for consensus among experts regarding weights assigned to the different criteria. Implementing this methodology using 15 Israeli experts—landscape architects, regional planners, and geographers—revealed inherent differences in expert opinions in this field beyond professional divisions. The use of Agglomerative Hierarchical Clustering allowed to identify clusters representing common decisions regarding criterion weights. Aggregating the evaluations of these clusters revealed an important dichotomy between a pragmatist approach that emphasizes the weight of statutory criteria and an ecological approach that emphasizes the role of the natural conditions in allocating green landscape corridors.
Coz, Alberto; Llano, Tamara; Cifrián, Eva; Viguri, Javier; Maican, Edmond; Sixta, Herbert
2016-01-01
The complete bioconversion of the carbohydrate fraction is of great importance for a lignocellulosic-based biorefinery. However, due to the structure of the lignocellulosic materials, and depending basically on the main parameters within the pretreatment steps, numerous byproducts are generated and they act as inhibitors in the fermentation operations. In this sense, the impact of inhibitory compounds derived from lignocellulosic materials is one of the major challenges for a sustainable biomass-to-biofuel and -bioproduct industry. In order to minimise the negative effects of these compounds, numerous methodologies have been tested including physical, chemical, and biological processes. The main physical and chemical treatments have been studied in this work in relation to the lignocellulosic material and the inhibitor in order to point out the best mechanisms for fermenting purposes. In addition, special attention has been made in the case of lignocellulosic hydrolysates obtained by chemical processes with SO2, due to the complex matrix of these materials and the increase in these methodologies in future biorefinery markets. Recommendations of different detoxification methods have been given. PMID:28773700
Samantra, Chitrasen; Datta, Saurav; Mahapatra, Siba Sankar
2017-03-01
In the context of underground coal mining industry, the increased economic issues regarding implementation of additional safety measure systems, along with growing public awareness to ensure high level of workers safety, have put great pressure on the managers towards finding the best solution to ensure safe as well as economically viable alternative selection. Risk-based decision support system plays an important role in finding such solutions amongst candidate alternatives with respect to multiple decision criteria. Therefore, in this paper, a unified risk-based decision-making methodology has been proposed for selecting an appropriate safety measure system in relation to an underground coal mining industry with respect to multiple risk criteria such as financial risk, operating risk, and maintenance risk. The proposed methodology uses interval-valued fuzzy set theory for modelling vagueness and subjectivity in the estimates of fuzzy risk ratings for making appropriate decision. The methodology is based on the aggregative fuzzy risk analysis and multi-criteria decision making. The selection decisions are made within the context of understanding the total integrated risk that is likely to incur while adapting the particular safety system alternative. Effectiveness of the proposed methodology has been validated through a real-time case study. The result in the context of final priority ranking is seemed fairly consistent.
Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B
2018-05-30
Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
NASA Astrophysics Data System (ADS)
Torregrosa, A. J.; Broatch, A.; Margot, X.; García-Tíscar, J.
2016-08-01
An experimental methodology is proposed to assess the noise emission of centrifugal turbocompressors like those of automotive turbochargers. A step-by-step procedure is detailed, starting from the theoretical considerations of sound measurement in flow ducts and examining specific experimental setup guidelines and signal processing routines. Special care is taken regarding some limiting factors that adversely affect the measuring of sound intensity in ducts, namely calibration, sensor placement and frequency ranges and restrictions. In order to provide illustrative examples of the proposed techniques and results, the methodology has been applied to the acoustic evaluation of a small automotive turbocharger in a flow bench. Samples of raw pressure spectra, decomposed pressure waves, calibration results, accurate surge characterization and final compressor noise maps and estimated spectrograms are provided. The analysis of selected frequency bands successfully shows how different, known noise phenomena of particular interest such as mid-frequency "whoosh noise" and low-frequency surge onset are correlated with operating conditions of the turbocharger. Comparison against external inlet orifice intensity measurements shows good correlation and improvement with respect to alternative wave decomposition techniques.
Trujillano, Javier; March, Jaume; Sorribas, Albert
2004-01-01
In clinical practice, there is an increasing interest in obtaining adequate models of prediction. Within the possible available alternatives, the artificial neural networks (ANN) are progressively more used. In this review we first introduce the ANN methodology, describing the most common type of ANN, the Multilayer Perceptron trained with backpropagation algorithm (MLP). Then we compare the MLP with the Logistic Regression (LR). Finally, we show a practical scheme to make an application based on ANN by means of an example with actual data. The main advantage of the RN is its capacity to incorporate nonlinear effects and interactions between the variables of the model without need to include them a priori. As greater disadvantages, they show a difficult interpretation of their parameters and large empiricism in their process of construction and training. ANN are useful for the computation of probabilities of a given outcome based on a set of predicting variables. Furthermore, in some cases, they obtain better results than LR. Both methodologies, ANN and LR, are complementary and they help us to obtain more valid models.
Tayabas, Luz María Tejada; León, Teresita Castillo; ESPINO, JOEL MONARREZ
2014-01-01
This short essay aims at commenting on the origin, development, rationale, and main characteristics of qualitative evaluation (QE), emphasizing the value of this methodological tool to evaluate health programs and services. During the past decades, different approaches have come to light proposing complementary alternatives to appraise the performance of public health programs, mainly focusing on the implementation process involved rather than on measuring the impact of such actions. QE is an alternative tool that can be used to illustrate and understand the process faced when executing health programs. It can also lead to useful suggestions to modify its implementation from the stakeholders’ perspectives, as it uses a qualitative approach that considers participants as reflective subjects, generators of meanings. This implies that beneficiaries become involved in an active manner in the evaluated phenomena with the aim of improving the health programs or services that they receive. With this work we want to encourage evaluators in the field of public health to consider the use of QE as a complementary tool for program evaluation to be able to identify areas of opportunity to improve programs’ implementation processes from the perspective of intended beneficiaries. PMID:25152220
GREEN CHEMICAL SYNTHESIS THROUGH CATALYSIS AND ALTERNATE REACTION CONDITIONS
Green chemical synthesis through catalysis and alternate reaction conditions
Encompassing green chemistry techniques and methodologies, we have initiated several projects at the National Risk Management Research laboratory that focus on the design and development of chemic...
Campos, Maria Doroteia; Valadas, Vera; Campos, Catarina; Morello, Laura; Braglia, Luca; Breviario, Diego; Cardoso, Hélia G
2018-01-01
Traceability of processed food and feed products has been gaining importance due to the impact that those products can have on human/animal health and to the associated economic and legal concerns, often related to adulterations and frauds as it can be the case for meat and milk. Despite mandatory traceability requirements for the analysis of feed composition, few reliable and accurate methods are presently available to enforce the legislative frame and allow the authentication of animal feeds. In this study, nine sensitive and species-specific real-time PCR TaqMan MGB assays are described for plant species detection in animal feed samples. The method is based on selective real-time qPCR (RT-qPCR) amplification of target genes belonging to the alternative oxidase (AOX) gene family. The plant species selected for detection in feed samples were wheat, maize, barley, soybean, rice and sunflower as common components of feeds, and cotton, flax and peanut as possible undesirable contaminants. The obtained results were compared with end-point PCR methodology. The applicability of the AOX TaqMan assays was evaluated through the screening of commercial feed samples, and by the analysis of plant mixtures with known composition. The RT-qPCR methodology allowed the detection of the most abundant species in feeds but also the identification of contaminant species present in lower amounts, down to 1% w/w. AOX-based methodology provides a suitable molecular marker approach to ascertain plant species composition of animal feed samples, thus supporting feed control and enforcement of the feed sector and animal production.
A decision tool for selecting trench cap designs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paige, G.B.; Stone, J.J.; Lane, L.J.
1995-12-31
A computer based prototype decision support system (PDSS) is being developed to assist the risk manager in selecting an appropriate trench cap design for waste disposal sites. The selection of the {open_quote}best{close_quote} design among feasible alternatives requires consideration of multiple and often conflicting objectives. The methodology used in the selection process consists of: selecting and parameterizing decision variables using data, simulation models, or expert opinion; selecting feasible trench cap design alternatives; ordering the decision variables and ranking the design alternatives. The decision model is based on multi-objective decision theory and uses a unique approach to order the decision variables andmore » rank the design alternatives. Trench cap designs are evaluated based on federal regulations, hydrologic performance, cover stability and cost. Four trench cap designs, which were monitored for a four year period at Hill Air Force Base in Utah, are used to demonstrate the application of the PDSS and evaluate the results of the decision model. The results of the PDSS, using both data and simulations, illustrate the relative advantages of each of the cap designs and which cap is the {open_quotes}best{close_quotes} alternative for a given set of criteria and a particular importance order of those decision criteria.« less
How scientific experiments are designed: Problem solving in a knowledge-rich, error-rich environment
NASA Astrophysics Data System (ADS)
Baker, Lisa M.
While theory formation and the relation between theory and data has been investigated in many studies of scientific reasoning, researchers have focused less attention on reasoning about experimental design, even though the experimental design process makes up a large part of real-world scientists' reasoning. The goal of this thesis was to provide a cognitive account of the scientific experimental design process by analyzing experimental design as problem-solving behavior (Newell & Simon, 1972). Three specific issues were addressed: the effect of potential error on experimental design strategies, the role of prior knowledge in experimental design, and the effect of characteristics of the space of alternate hypotheses on alternate hypothesis testing. A two-pronged in vivo/in vitro research methodology was employed, in which transcripts of real-world scientific laboratory meetings were analyzed as well as undergraduate science and non-science majors' design of biology experiments in the psychology laboratory. It was found that scientists use a specific strategy to deal with the possibility of error in experimental findings: they include "known" control conditions in their experimental designs both to determine whether error is occurring and to identify sources of error. The known controls strategy had not been reported in earlier studies with science-like tasks, in which participants' responses to error had consisted of replicating experiments and discounting results. With respect to prior knowledge: scientists and undergraduate students drew on several types of knowledge when designing experiments, including theoretical knowledge, domain-specific knowledge of experimental techniques, and domain-general knowledge of experimental design strategies. Finally, undergraduate science students generated and tested alternates to their favored hypotheses when the space of alternate hypotheses was constrained and searchable. This result may help explain findings of confirmation bias in earlier studies using science-like tasks, in which characteristics of the alternate hypothesis space may have made it unfeasible for participants to generate and test alternate hypotheses. In general, scientists and science undergraduates were found to engage in a systematic experimental design process that responded to salient features of the problem environment, including the constant potential for experimental error, availability of alternate hypotheses, and access to both theoretical knowledge and knowledge of experimental techniques.
Ritrovato, Matteo; Faggiano, Francesco C; Tedesco, Giorgia; Derrico, Pietro
2015-06-01
This article outlines the Decision-Oriented Health Technology Assessment: a new implementation of the European network for Health Technology Assessment Core Model, integrating the multicriteria decision-making analysis by using the analytic hierarchy process to introduce a standardized methodological approach as a valued and shared tool to support health care decision making within a hospital. Following the Core Model as guidance (European network for Health Technology Assessment. HTA core model for medical and surgical interventions. Available from: http://www.eunethta.eu/outputs/hta-core-model-medical-and-surgical-interventions-10r. [Accessed May 27, 2014]), it is possible to apply the analytic hierarchy process to break down a problem into its constituent parts and identify priorities (i.e., assigning a weight to each part) in a hierarchical structure. Thus, it quantitatively compares the importance of multiple criteria in assessing health technologies and how the alternative technologies perform in satisfying these criteria. The verbal ratings are translated into a quantitative form by using the Saaty scale (Saaty TL. Decision making with the analytic hierarchy process. Int J Serv Sci 2008;1:83-98). An eigenvectors analysis is used for deriving the weights' systems (i.e., local and global weights' system) that reflect the importance assigned to the criteria and the priorities related to the performance of the alternative technologies. Compared with the Core Model, this methodological approach supplies a more timely as well as contextualized evidence for a specific technology, making it possible to obtain data that are more relevant and easier to interpret, and therefore more useful for decision makers to make investment choices with greater awareness. We reached the conclusion that although there may be scope for improvement, this implementation is a step forward toward the goal of building a "solid bridge" between the scientific evidence and the final decision maker's choice. Copyright © 2015 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Guest, James; Harrop, James S; Aarabi, Bizhan; Grossman, Robert G; Fawcett, James W; Fehlings, Michael G; Tator, Charles H
2012-09-01
The North American Clinical Trials Network (NACTN) includes 9 clinical centers funded by the US Department of Defense and the Christopher Reeve Paralysis Foundation. Its purpose is to accelerate clinical testing of promising therapeutics in spinal cord injury (SCI) through the development of a robust interactive infrastructure. This structure includes key committees that serve to provide longitudinal guidance to the Network. These committees include the Executive, Data Management, and Neurological Outcome Assessments Committees, and the Therapeutic Selection Committee (TSC), which is the subject of this manuscript. The NACTN brings unique elements to the SCI field. The Network's stability is not restricted to a single clinical trial. Network members have diverse expertise and include experts in clinical care, clinical trial design and methodology, pharmacology, preclinical and clinical research, and advanced rehabilitation techniques. Frequent systematic communication is assigned a high value, as is democratic process, fairness and efficiency of decision making, and resource allocation. This article focuses on how decision making occurs within the TSC to rank alternative therapeutics according to 2 main variables: quality of the preclinical data set, and fit with the Network's aims and capabilities. This selection process is important because if the Network's resources are committed to a therapeutic, alternatives cannot be pursued. A proposed methodology includes a multicriteria decision analysis that uses a Multi-Attribute Global Inference of Quality matrix to quantify the process. To rank therapeutics, the TSC uses a series of consensus steps designed to reduce individual and group bias and limit subjectivity. Given the difficulties encountered by industry in completing clinical trials in SCI, stable collaborative not-for-profit consortia, such as the NACTN, may be essential to clinical progress in SCI. The evolution of the NACTN also offers substantial opportunity to refine decision making and group dynamics. Making the best possible decisions concerning therapeutics selection for trial testing is a cornerstone of the Network's function.
Penn, Alexandra S.; Knight, Christopher J. K.; Lloyd, David J. B.; Avitabile, Daniele; Kok, Kasper; Schiller, Frank; Woodward, Amy; Druckman, Angela; Basson, Lauren
2013-01-01
Fuzzy Cognitive Mapping (FCM) is a widely used participatory modelling methodology in which stakeholders collaboratively develop a ‘cognitive map’ (a weighted, directed graph), representing the perceived causal structure of their system. This can be directly transformed by a workshop facilitator into simple mathematical models to be interrogated by participants by the end of the session. Such simple models provide thinking tools which can be used for discussion and exploration of complex issues, as well as sense checking the implications of suggested causal links. They increase stakeholder motivation and understanding of whole systems approaches, but cannot be separated from an intersubjective participatory context. Standard FCM methodologies make simplifying assumptions, which may strongly influence results, presenting particular challenges and opportunities. We report on a participatory process, involving local companies and organisations, focussing on the development of a bio-based economy in the Humber region. The initial cognitive map generated consisted of factors considered key for the development of the regional bio-based economy and their directional, weighted, causal interconnections. A verification and scenario generation procedure, to check the structure of the map and suggest modifications, was carried out with a second session. Participants agreed on updates to the original map and described two alternate potential causal structures. In a novel analysis all map structures were tested using two standard methodologies usually used independently: linear and sigmoidal FCMs, demonstrating some significantly different results alongside some broad similarities. We suggest a development of FCM methodology involving a sensitivity analysis with different mappings and discuss the use of this technique in the context of our case study. Using the results and analysis of our process, we discuss the limitations and benefits of the FCM methodology in this case and in general. We conclude by proposing an extended FCM methodology, including multiple functional mappings within one participant-constructed graph. PMID:24244303
Braun, M Miles
2013-10-01
Study of complementary and alternative medicine's mind and body interventions (CAM-MABI) is hindered not only by the inability to mask participants and their teachers to the study intervention but also by the major practical hurdles of long-term study of practices that can be lifelong. Two other important methodological problems are that study of newly trained practitioners cannot directly address long-term practice, and that long-term practitioners likely self-select in ways that make finding appropriate controls (or a comparison group) challenging. The temporary practice pause then resumption study design (TPPR) introduced here is a new tool that extends the withdrawal study design, established in the field of drug evaluation, to the field of CAM-MABI. With the exception of the inability to mask, TPPR can address the other methodological problems noted above. Of great interest to investigators will likely be measures in practitioners of CAM-MABI that change with temporary pausing of CAM-MABI practice, followed by return of the measures to pre-pause levels with resumption of practice; this would suggest a link of the practice to measured changes. Such findings using this tool may enhance our insight into fundamental biological processes, leading to beneficial practical applications.
Coastal zone management with stochastic multi-criteria analysis.
Félix, A; Baquerizo, A; Santiago, J M; Losada, M A
2012-12-15
The methodology for coastal management proposed in this study takes into account the physical processes of the coastal system and the stochastic nature of forcing agents. Simulation techniques are used to assess the uncertainty in the performance of a set of predefined management strategies based on different criteria representing the main concerns of interest groups. This statistical information as well as the distribution function that characterizes the uncertainty regarding the preferences of the decision makers is fed into a stochastic multi-criteria acceptability analysis that provides the probability of alternatives obtaining certain ranks and also calculates the preferences of a typical decision maker who supports an alternative. This methodology was applied as a management solution for Playa Granada in the Guadalfeo River Delta (Granada, Spain), where the construction of a dam in the river basin is causing severe erosion. The analysis of shoreline evolution took into account the coupled action of atmosphere, ocean, and land agents and their intrinsic stochastic character. This study considered five different management strategies. The criteria selected for the analysis were the economic benefits for three interest groups: (i) indirect beneficiaries of tourist activities; (ii) beach homeowners; and (iii) the administration. The strategies were ranked according to their effectiveness, and the relative importance given to each criterion was obtained. Copyright © 2012 Elsevier Ltd. All rights reserved.
Fragoulakis, Vasilios; Mitropoulou, Christina; van Schaik, Ron H; Maniadakis, Nikolaos; Patrinos, George P
2016-05-01
Genomic Medicine aims to improve therapeutic interventions and diagnostics, the quality of life of patients, but also to rationalize healthcare costs. To reach this goal, careful assessment and identification of evidence gaps for public health genomics priorities are required so that a more efficient healthcare environment is created. Here, we propose a public health genomics-driven approach to adjust the classical healthcare decision making process with an alternative methodological approach of cost-effectiveness analysis, which is particularly helpful for genomic medicine interventions. By combining classical cost-effectiveness analysis with budget constraints, social preferences, and patient ethics, we demonstrate the application of this model, the Genome Economics Model (GEM), based on a previously reported genome-guided intervention from a developing country environment. The model and the attendant rationale provide a practical guide by which all major healthcare stakeholders could ensure the sustainability of funding for genome-guided interventions, their adoption and coverage by health insurance funds, and prioritization of Genomic Medicine research, development, and innovation, given the restriction of budgets, particularly in developing countries and low-income healthcare settings in developed countries. The implications of the GEM for the policy makers interested in Genomic Medicine and new health technology and innovation assessment are also discussed.
Conceptual and Preliminary Design of a Low-Cost Precision Aerial Delivery System
2016-06-01
test results. It includes an analysis of the failure modes encountered during flight experimentation , methodology used for conducting coordinate...and experimentation . Additionally, the current and desired end state of the research is addressed. Finally, this chapter outlines the methodology ...preliminary design phases are utilized to investigate and develop a potentially low-cost alternative to existing systems. Using an Agile methodology
Prioritization methodology for chemical replacement
NASA Technical Reports Server (NTRS)
Goldberg, Ben; Cruit, Wendy; Schutzenhofer, Scott
1995-01-01
This methodology serves to define a system for effective prioritization of efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi quantitative approach derived from quality function deployment techniques (QFD Matrix). QFD is a conceptual map that provides a method of transforming customer wants and needs into quantitative engineering terms. This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives.
NASA Technical Reports Server (NTRS)
Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt
1993-01-01
This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semiquantitative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.
Role of large scale energy systems models in R and D planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lamontagne, J.
1980-11-01
Long-term energy policy deals with the problem of finite supplies of convenient energy sources becoming more costly as they are depleted. The development of alternative technologies to provide new sources of energy and extend the lives of current ones is an attractive option available to government. Thus, one aspect of long-term energy policy involves investment in R and D. The importance of the problems addressed by R and D to the future of society (especially with regard to energy) dictates adoption of a cogent approach to resource allocation and to the designation of priorities for R and D. It ismore » hoped that energy systems models when properly used can provide useful inputs to this process. The influence of model results on energy policy makers who are not knowledgable about flaws or uncertainties in the models, errors in assumptions in model inputs which can result in faulty forecasts, the overall usefulness of energy system models, and model limitations are discussed. It is suggested that the large scale energy systems models currently used for assessing a broad spectrum of policy issues need to be replaced with reasonably simple models capable of dealing with uncertainty in a straightforward manner, and their methodologies and the meaning of their results should be transparent, especially to those removed from the modeling process. Energy models should be clearly related to specific issues. Methodologies should be clearly related to specific decisions, and should allow adjustments to be easily made for alternative assumptions and for additional knowledge gained during the evolution of the energy system. (LCL)« less
Uranga, Jon; Arrizabalaga, Haritz; Boyra, Guillermo; Hernandez, Maria Carmen; Goñi, Nicolas; Arregui, Igor; Fernandes, Jose A; Yurramendi, Yosu; Santiago, Josu
2017-01-01
This study presents a methodology for the automated analysis of commercial medium-range sonar signals for detecting presence/absence of bluefin tuna (Tunnus thynnus) in the Bay of Biscay. The approach uses image processing techniques to analyze sonar screenshots. For each sonar image we extracted measurable regions and analyzed their characteristics. Scientific data was used to classify each region into a class ("tuna" or "no-tuna") and build a dataset to train and evaluate classification models by using supervised learning. The methodology performed well when validated with commercial sonar screenshots, and has the potential to automatically analyze high volumes of data at a low cost. This represents a first milestone towards the development of acoustic, fishery-independent indices of abundance for bluefin tuna in the Bay of Biscay. Future research lines and additional alternatives to inform stock assessments are also discussed.
E-Learning as an Opportunity for the Public Administration
NASA Astrophysics Data System (ADS)
Casagranda, Milena; Colazzo, Luigi; Molinari, Andrea; Tomasini, Sara
In this paper we will describe the results of a learning project in the Public Administration, highlighting the methodological approach based on a blended training model in a context that has never experienced this type of activities. The observations contained in the paper will be focused on the evaluation results of this experience and the redesign elements in term of alternation between the classroom and distance training, methodologies, the value and use of the e-learning platform and learning evaluation. The elements that emerge will also provide the basis for the design of future teaching actions for this context (in which at this moment we are involved). The objective is to identify a "learning model", related also to the use of technological tools that are able to support lifelong learning and to define dynamics and process relating to facilitating learning activities of teachers and tutors.
Toward greener dialysis: a case study to illustrate and encourage the salvage of reject water.
Connor, Andrew; Milne, Steve; Owen, Andrew; Boyle, Gerard; Mortimer, Frances; Stevens, Paul
2010-06-01
Climate change is now considered to be a major global public health concern. However, the very provision of health care itself has a significant impact upon the environment. Action must be taken to reduce this impact. Water is a precious and finite natural resource. Vast quantities of high-grade water are required to provide haemodialysis. The reverse osmosis systems used in the purification process reject approximately two-thirds of the water presented to them. Therefore, around 250 litres of 'reject water' result from the production of the dialysate required for one treatment. This good quality reject water is lost-to-drain in the vast majority of centres worldwide. Simple methodologies exist to recycle this water for alternative purposes. We describe here a case study of the only UK renal service we know to have implemented such water-saving methodologies. We outline the benefits in terms of financial and environmental savings.
Uranga, Jon; Arrizabalaga, Haritz; Boyra, Guillermo; Hernandez, Maria Carmen; Goñi, Nicolas; Arregui, Igor; Fernandes, Jose A.; Yurramendi, Yosu; Santiago, Josu
2017-01-01
This study presents a methodology for the automated analysis of commercial medium-range sonar signals for detecting presence/absence of bluefin tuna (Tunnus thynnus) in the Bay of Biscay. The approach uses image processing techniques to analyze sonar screenshots. For each sonar image we extracted measurable regions and analyzed their characteristics. Scientific data was used to classify each region into a class (“tuna” or “no-tuna”) and build a dataset to train and evaluate classification models by using supervised learning. The methodology performed well when validated with commercial sonar screenshots, and has the potential to automatically analyze high volumes of data at a low cost. This represents a first milestone towards the development of acoustic, fishery-independent indices of abundance for bluefin tuna in the Bay of Biscay. Future research lines and additional alternatives to inform stock assessments are also discussed. PMID:28152032
Fraga, Eric S; Ng, Melvin
2015-01-01
Recent developments in catalysts have enhanced the potential for the utilisation of carbon dioxide as a chemical feedstock. Using the appropriate energy efficient catalyst enables a range of chemical pathways leading to desirable products. In doing so, CO2 provides an economically and environmentally beneficial source of C1 feedstock, while improving the issues relating to security of supply that are associated with fossil-based feedstocks. However, the dependence on catalysts brings other supply chains into consideration, supply chains that may also have security of supply issues. The choice of chemical pathways for specific products will therefore entail an assessment not only of economic factors but also the security of supply issues for the catalysts. This is a multi-criteria decision making problem. In this paper, we present a modified 4A framework based on the framework suggested by the Asian Pacific Energy Research centre for macro-economic applications. The 4A methodology is named after the criteria used to compare alternatives: availability, acceptability, applicability and affordability. We have adapted this framework for the consideration of alternative chemical reaction processes using a micro-economic outlook. Data from a number of sources were collected and used to quantify each of the 4A criteria. A graphical representation of the assessments is used to support the decision maker in comparing alternatives. The framework not only allows for the comparison of processes but also highlights current limitations in the CCU processes. The framework presented can be used by a variety of stakeholders, including regulators, investors, and process industries, with the aim of identifying promising routes within a broader multi-criteria decision making process.
DOT National Transportation Integrated Search
2006-08-01
The overall purpose of this research project is to conduct a feasibility study and development of a general methodology to determine the impacts on multi-modal and system efficiency of alternative freight security measures. The methodology to be exam...
DOT National Transportation Integrated Search
2013-11-01
The Highway Capacity Manual (HCM) has had a delay-based level of service methodology for signalized intersections since 1985. : The 2010 HCM has revised the method for calculating delay. This happened concurrent with such jurisdictions as NYC reviewi...
Methodological Issues in Trials of Complementary and Alternative Medicine Interventions
Sikorskii, Alla; Wyatt, Gwen; Victorson, David; Faulkner, Gwen; Rahbar, Mohammad Hossein
2010-01-01
Background Complementary and alternative medicine (CAM) use is widespread among cancer patients. Information on safety and efficacy of CAM therapies is needed for both patients and health care providers. Well-designed randomized clinical trials (RCTs) of CAM therapy interventions can inform both clinical research and practice. Objectives To review important issues that affect the design of RCTs for CAM interventions. Methods Using the methods component of the Consolidated Standards for Reporting Trials (CONSORT) as a guiding framework, and a National Cancer Institute-funded reflexology study as an exemplar, methodological issues related to participants, intervention, objectives, outcomes, sample size, randomization, blinding, and statistical methods were reviewed. Discussion Trials of CAM interventions designed and implemented according to appropriate methodological standards will facilitate the needed scientific rigor in CAM research. Interventions in CAM can be tested using proposed methodology, and the results of testing will inform nursing practice in providing safe and effective supportive care and improving the well-being of patients. PMID:19918155
Cottet, P; d'Hollander, A; Cahana, A; Van Gessel, E; Tassaux, D
2013-10-01
In the healthcare domain, different analytic tools focused on accidents appeared to be poorly adapted to sub-accidental issues. Improving local management and intra-institutional communication with simpler methods, allowing rapid and uncomplicated meta-reporting, could be an attractive alternative. A process-centered structure derived from the industrial domain - DEPOSE(E) - was selected and modified for its use in the healthcare domain. The seven exclusive meta-categories defined - Patient, Equipment, Process, Actor, Supplies, work Room and Organization- constitute 7CARECAT™. A collection of 536 "improvement" reports from a tertiary hospital Post anesthesia care unit (PACU) was used and four meta-categorization rules edited prior to the analysis. Both the relevance of the metacategories and of the rules were tested to build a meta-reporting methodology. The distribution of these categories was analyzed with a χ 2 test. Five hundred and ninety independent facts were collected out of the 536 reports. The frequencies of the categories are: Organization 44%, Actor 37%, Patient 11%, Process 3%, work Room 3%, Equipment 1% and Supplies 1%, with a p-value <0.005 (χ 2). During the analysis, three more rules were edited. The reproducibility, tested randomly on 200 reports, showed a <2% error rate. This meta-reporting methodology, developed with the 7CARECAT™ structure and using a reduced number of operational rules, has successfully produced a stable and consistent classification of sub-accidental events voluntarily reported. This model represents a relevant tool to exchange meta-informations important for local and transversal communication in healthcare institutions. It could be used as a promising tool to improve quality and risk management. Copyright © 2013. Published by Elsevier SAS.
Narrative inquiry: Locating Aboriginal epistemology in a relational methodology.
Barton, Sylvia S
2004-03-01
This methodology utilizes narrative analysis and the elicitation of life stories as understood through dimensions of interaction, continuity, and situation. It is congruent with Aboriginal epistemology formulated by oral narratives through representation, connection, storytelling and art. Needed for culturally competent scholarship is an experience of research whereby inquiry into epiphanies, ritual, routines, metaphors and everyday experience creates a process of reflexive thinking for multiple ways of knowing. Based on the sharing of perspectives, narrative inquiry allows for experimentation into creating new forms of knowledge by contextualizing diabetes from the experience of a researcher overlapped with experiences of participants--a reflective practice in itself. The aim of this paper is to present narrative inquiry as a relational methodology and to analyse critically its appropriateness as an innovative research approach for exploring Aboriginal people's experience living with diabetes. Narrative inquiry represents an alternative culture of research for nursing science to generate understanding and explanation of Aboriginal people's 'diabetic self' stories, and to coax open a window for co-constructing a narrative about diabetes as a chronic illness. The ability to adapt a methodology for use in a cultural context, preserve the perspectives of Aboriginal peoples, maintain the holistic nature of social problems, and value co-participation in respectful ways are strengths of an inquiry partial to a responsive and embodied scholarship.
Sketching Designs Using the Five Design-Sheet Methodology.
Roberts, Jonathan C; Headleand, Chris; Ritsos, Panagiotis D
2016-01-01
Sketching designs has been shown to be a useful way of planning and considering alternative solutions. The use of lo-fidelity prototyping, especially paper-based sketching, can save time, money and converge to better solutions more quickly. However, this design process is often viewed to be too informal. Consequently users do not know how to manage their thoughts and ideas (to first think divergently, to then finally converge on a suitable solution). We present the Five Design Sheet (FdS) methodology. The methodology enables users to create information visualization interfaces through lo-fidelity methods. Users sketch and plan their ideas, helping them express different possibilities, think through these ideas to consider their potential effectiveness as solutions to the task (sheet 1); they create three principle designs (sheets 2,3 and 4); before converging on a final realization design that can then be implemented (sheet 5). In this article, we present (i) a review of the use of sketching as a planning method for visualization and the benefits of sketching, (ii) a detailed description of the Five Design Sheet (FdS) methodology, and (iii) an evaluation of the FdS using the System Usability Scale, along with a case-study of its use in industry and experience of its use in teaching.
Optimizing The DSSC Fabrication Process Using Lean Six Sigma
NASA Astrophysics Data System (ADS)
Fauss, Brian
Alternative energy technologies must become more cost effective to achieve grid parity with fossil fuels. Dye sensitized solar cells (DSSCs) are an innovative third generation photovoltaic technology, which is demonstrating tremendous potential to become a revolutionary technology due to recent breakthroughs in cost of fabrication. The study here focused on quality improvement measures undertaken to improve fabrication of DSSCs and enhance process efficiency and effectiveness. Several quality improvement methods were implemented to optimize the seven step individual DSSC fabrication processes. Lean Manufacturing's 5S method successfully increased efficiency in all of the processes. Six Sigma's DMAIC methodology was used to identify and eliminate each of the root causes of defects in the critical titanium dioxide deposition process. These optimizations resulted with the following significant improvements in the production process: 1. fabrication time of the DSSCs was reduced by 54 %; 2. fabrication procedures were improved to the extent that all critical defects in the process were eliminated; 3. the quantity of functioning DSSCs fabricated was increased from 17 % to 90 %.
Lunar lander and return propulsion system trade study
NASA Technical Reports Server (NTRS)
Hurlbert, Eric A.; Moreland, Robert; Sanders, Gerald B.; Robertson, Edward A.; Amidei, David; Mulholland, John
1993-01-01
This trade study was initiated at NASA/JSC in May 1992 to develop and evaluate main propulsion system alternatives to the reference First Lunar Outpost (FLO) lander and return-stage transportation system concept. Thirteen alternative configurations were developed to explore the impacts of various combinations of return stage propellants, using either pressure or pump-fed propulsion systems and various staging options. Besides two-stage vehicle concepts, the merits of single-stage and stage-and-a-half options were also assessed in combination with high-performance liquid oxygen and liquid hydrogen propellants. Configurations using an integrated modular cryogenic engine were developed to assess potential improvements in packaging efficiency, mass performance, and system reliability compared to non-modular cryogenic designs. The selection process to evaluate the various designs was the analytic hierarchy process. The trade study showed that a pressure-fed MMH/N2O4 return stage and RL10-based lander stage is the best option for a 1999 launch. While results of this study are tailored to FLO needs, the design date, criteria, and selection methodology are applicable to the design of other crewed lunar landing and return vehicles.
Slice-thickness evaluation in CT and MRI: an alternative computerised procedure.
Acri, G; Tripepi, M G; Causa, F; Testagrossa, B; Novario, R; Vermiglio, G
2012-04-01
The efficient use of computed tomography (CT) and magnetic resonance imaging (MRI) equipment necessitates establishing adequate quality-control (QC) procedures. In particular, the accuracy of slice thickness (ST) requires scan exploration of phantoms containing test objects (plane, cone or spiral). To simplify such procedures, a novel phantom and a computerised LabView-based procedure have been devised, enabling determination of full width at half maximum (FWHM) in real time. The phantom consists of a polymethyl methacrylate (PMMA) box, diagonally crossed by a PMMA septum dividing the box into two sections. The phantom images were acquired and processed using the LabView-based procedure. The LabView (LV) results were compared with those obtained by processing the same phantom images with commercial software, and the Fisher exact test (F test) was conducted on the resulting data sets to validate the proposed methodology. In all cases, there was no statistically significant variation between the two different procedures and the LV procedure, which can therefore be proposed as a valuable alternative to other commonly used procedures and be reliably used on any CT and MRI scanner.
Functional Implications of Novel Human Acid Sphingomyelinase Splice Variants
Rhein, Cosima; Tripal, Philipp; Seebahn, Angela; Konrad, Alice; Kramer, Marcel; Nagel, Christine; Kemper, Jonas; Bode, Jens; Mühle, Christiane; Gulbins, Erich; Reichel, Martin; Becker, Cord-Michael; Kornhuber, Johannes
2012-01-01
Background Acid sphingomyelinase (ASM) hydrolyses sphingomyelin and generates the lipid messenger ceramide, which mediates a variety of stress-related cellular processes. The pathological effects of dysregulated ASM activity are evident in several human diseases and indicate an important functional role for ASM regulation. We investigated alternative splicing as a possible mechanism for regulating cellular ASM activity. Methodology/Principal Findings We identified three novel ASM splice variants in human cells, termed ASM-5, -6 and -7, which lack portions of the catalytic- and/or carboxy-terminal domains in comparison to full-length ASM-1. Differential expression patterns in primary blood cells indicated that ASM splicing might be subject to regulatory processes. The newly identified ASM splice variants were catalytically inactive in biochemical in vitro assays, but they decreased the relative cellular ceramide content in overexpression studies and exerted a dominant-negative effect on ASM activity in physiological cell models. Conclusions/Significance These findings indicate that alternative splicing of ASM is of functional significance for the cellular stress response, possibly representing a mechanism for maintaining constant levels of cellular ASM enzyme activity. PMID:22558155
ERIC Educational Resources Information Center
Elwood, Bryan C.
This report provides a procedure by which the educational administrator can select from available alternatives the best method for transporting students and can evaluate at intervals the success or failure of the method selected. The report outlines the methodology used to analyze the problem, defines the range of alternatives called the…
Bioconversion of lignocellulosic biomass to xylitol: An overview.
Venkateswar Rao, Linga; Goli, Jyosthna Khanna; Gentela, Jahnavi; Koti, Sravanthi
2016-08-01
Lignocellulosic wastes include agricultural and forest residues which are most promising alternative energy sources and serve as potential low cost raw materials that can be exploited to produce xylitol. The strong physical and chemical construction of lignocelluloses is a major constraint for the recovery of xylose. The large scale production of xylitol is attained by nickel catalyzed chemical process that is based on xylose hydrogenation, that requires purified xylose as raw substrate and the process requires high temperature and pressure that remains to be cost intensive and energy consuming. Therefore, there is a necessity to develop an integrated process for biotechnological conversion of lignocelluloses to xylitol and make the process economical. The present review confers about the pretreatment strategies that facilitate cellulose and hemicellulose acquiescent for hydrolysis. There is also an emphasis on various detoxification and fermentation methodologies including genetic engineering strategies for the efficient conversion of xylose to xylitol. Copyright © 2016 Elsevier Ltd. All rights reserved.
Hörmeyer, Ina; Renner, Gregor
2013-09-01
For individuals with complex communication needs, one of the most frequent communicative strategies is the co-construction of meaning with familiar partners. This preliminary single-case study gives insight into a special sequential pattern of co-construction processes - the search sequence - particularly in relation to the processes of confirming and denying meanings proposed by familiar interaction partners. Five different conversations between an adult with cerebral palsy and complex communication needs and two familiar co-participants were videotaped and analyzed using the methodology of conversation analysis (CA). The study revealed that confirmations and denials are not simply two alternative actions, but that several possibilities to realize confirmations and denials exist that differ in their frequency and that have different consequences for the sequential context. This study of confirmations and denials demonstrates that co-construction processes are more complex than have previously been documented.
Thermosonication and optimization of stingless bee honey processing.
Chong, K Y; Chin, N L; Yusof, Y A
2017-10-01
The effects of thermosonication on the quality of a stingless bee honey, the Kelulut, were studied using processing temperature from 45 to 90 ℃ and processing time from 30 to 120 minutes. Physicochemical properties including water activity, moisture content, color intensity, viscosity, hydroxymethylfurfural content, total phenolic content, and radical scavenging activity were determined. Thermosonication reduced the water activity and moisture content by 7.9% and 16.6%, respectively, compared to 3.5% and 6.9% for conventional heating. For thermosonicated honey, color intensity increased by 68.2%, viscosity increased by 275.0%, total phenolic content increased by 58.1%, and radical scavenging activity increased by 63.0% when compared to its raw form. The increase of hydroxymethylfurfural to 62.46 mg/kg was still within the limits of international standards. Optimized thermosonication conditions using response surface methodology were predicted at 90 ℃ for 111 minutes. Thermosonication was revealed as an effective alternative technique for honey processing.
The use of elemental sulfur as an alternative feedstock for polymeric materials
NASA Astrophysics Data System (ADS)
Chung, Woo Jin; Griebel, Jared J.; Kim, Eui Tae; Yoon, Hyunsik; Simmonds, Adam G.; Ji, Hyun Jun; Dirlam, Philip T.; Glass, Richard S.; Wie, Jeong Jae; Nguyen, Ngoc A.; Guralnick, Brett W.; Park, Jungjin; Somogyi, Árpád; Theato, Patrick; Mackay, Michael E.; Sung, Yung-Eun; Char, Kookheon; Pyun, Jeffrey
2013-06-01
An excess of elemental sulfur is generated annually from hydrodesulfurization in petroleum refining processes; however, it has a limited number of uses, of which one example is the production of sulfuric acid. Despite this excess, the development of synthetic and processing methods to convert elemental sulfur into useful chemical substances has not been investigated widely. Here we report a facile method (termed ‘inverse vulcanization’) to prepare chemically stable and processable polymeric materials through the direct copolymerization of elemental sulfur with vinylic monomers. This methodology enabled the modification of sulfur into processable copolymer forms with tunable thermomechanical properties, which leads to well-defined sulfur-rich micropatterned films created by imprint lithography. We also demonstrate that these copolymers exhibit comparable electrochemical properties to elemental sulfur and could serve as the active material in Li-S batteries, exhibiting high specific capacity (823 mA h g-1 at 100 cycles) and enhanced capacity retention.
Xia, Ting; Zhang, Ying; Crabb, Shona; Shah, Pushan
2013-01-01
It has been reported that motor vehicle emissions contribute nearly a quarter of world energy-related greenhouse gases and cause nonnegligible air pollution primarily in urban areas. Reducing car use and increasing ecofriendly alternative transport, such as public and active transport, are efficient approaches to mitigate harmful environmental impacts caused by a large amount of vehicle use. Besides the environmental benefits of promoting alternative transport, it can also induce other health and economic benefits. At present, a number of studies have been conducted to evaluate cobenefits from greenhouse gas mitigation policies. However, relatively few have focused specifically on the transport sector. A comprehensive understanding of the multiple benefits of alternative transport could assist with policy making in the areas of transport, health, and environment. However, there is no straightforward method which could estimate cobenefits effect at one time. In this paper, the links between vehicle emissions and air quality, as well as the health and economic benefits from alternative transport use, are considered, and methodological issues relating to the modelling of these cobenefits are discussed.
Williams, Calum; Rughoobur, Girish; Flewitt, Andrew J; Wilkinson, Timothy D
2016-11-10
A single-step fabrication method is presented for ultra-thin, linearly variable optical bandpass filters (LVBFs) based on a metal-insulator-metal arrangement using modified evaporation deposition techniques. This alternate process methodology offers reduced complexity and cost in comparison to conventional techniques for fabricating LVBFs. We are able to achieve linear variation of insulator thickness across a sample, by adjusting the geometrical parameters of a typical physical vapor deposition process. We demonstrate LVBFs with spectral selectivity from 400 to 850 nm based on Ag (25 nm) and MgF2 (75-250 nm). Maximum spectral transmittance is measured at ∼70% with a Q-factor of ∼20.
This study will provide a general methodology for integrating threshold information from multiple species ecological metrics, allow for prediction of changes of alternative stable states, and provide a risk assessment tool that can be applied to adaptive management. The integr...
Beyond Needs Analysis: Soft Systems Methodology for Meaningful Collaboration in EAP Course Design
ERIC Educational Resources Information Center
Tajino, Akira; James, Robert; Kijima, Kyoichi
2005-01-01
Designing an EAP course requires collaboration among various concerned stakeholders, including students, subject teachers, institutional administrators and EAP teachers themselves. While needs analysis is often considered fundamental to EAP, alternative research methodologies may be required to facilitate meaningful collaboration between these…
Road-corridor planning in the EIA procedure in Spain. A review of case studies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Loro, Manuel, E-mail: manuel.loro@upm.es; Transport Research Centre; Centro de investigación del transporte, TRANSyT-UPM, ETSI Caminos, Canales y Puertos, Universidad Politécnica de Madrid, Prof. Aranguren s/n, 28040 Madrid
The assessment of different alternatives in road-corridor planning must be based on a number of well-defined territorial variables that serve as decision making criteria, and this requires a high-quality preliminary environmental assessment study. In Spain the formal specifications for the technical requirements stipulate the constraints that must be considered in the early stages of defining road corridors, but not how they should be analyzed and ranked. As part of the feasibility study of a new road definition, the most common methodology is to establish different levels of Territorial Carrying Capacity (TCC) in the study area in order to summarize themore » territorial variables on thematic maps and to ease the tracing process of road-corridor layout alternatives. This paper explores the variables used in 22 road-construction projects conducted by the Ministry of Public Works that were subject to the Spanish EIA regulation and published between 2006 and 2008. The aim was to evaluate the quality of the methods applied and the homogeneity and suitability of the variables used for defining the TCC. The variables were clustered into physical, environmental, land-use and cultural constraints for the purpose of comparing the TCC values assigned in the studies reviewed. We found the average quality of the studies to be generally acceptable in terms of the justification of the methodology, the weighting and classification of the variables, and the creation of a synthesis map. Nevertheless, the methods for assessing the TCC are not sufficiently standardized; there is a lack of uniformity in the cartographic information sources and methodologies for the TCC valuation. -- Highlights: • We explore 22 road-corridor planning studies subjected to the Spanish EIA regulation. • We analyze the variables selected for defining territorial carrying capacity. • The quality of the studies is acceptable (methodology, variable weighting, mapping). • There is heterogeneity in the methods for territorial carrying capacity valuation.« less
Integrated model-based retargeting and optical proximity correction
NASA Astrophysics Data System (ADS)
Agarwal, Kanak B.; Banerjee, Shayak
2011-04-01
Conventional resolution enhancement techniques (RET) are becoming increasingly inadequate at addressing the challenges of subwavelength lithography. In particular, features show high sensitivity to process variation in low-k1 lithography. Process variation aware RETs such as process-window OPC are becoming increasingly important to guarantee high lithographic yield, but such techniques suffer from high runtime impact. An alternative to PWOPC is to perform retargeting, which is a rule-assisted modification of target layout shapes to improve their process window. However, rule-based retargeting is not a scalable technique since rules cannot cover the entire search space of two-dimensional shape configurations, especially with technology scaling. In this paper, we propose to integrate the processes of retargeting and optical proximity correction (OPC). We utilize the normalized image log slope (NILS) metric, which is available at no extra computational cost during OPC. We use NILS to guide dynamic target modification between iterations of OPC. We utilize the NILS tagging capabilities of Calibre TCL scripting to identify fragments with low NILS. We then perform NILS binning to assign different magnitude of retargeting to different NILS bins. NILS is determined both for width, to identify regions of pinching, and space, to locate regions of potential bridging. We develop an integrated flow for 1x metal lines (M1) which exhibits lesser lithographic hotspots compared to a flow with just OPC and no retargeting. We also observe cases where hotspots that existed in the rule-based retargeting flow are fixed using our methodology. We finally also demonstrate that such a retargeting methodology does not significantly alter design properties by electrically simulating a latch layout before and after retargeting. We observe less than 1% impact on latch Clk-Q and D-Q delays post-retargeting, which makes this methodology an attractive one for use in improving shape process windows without perturbing designed values.
Towards Autonomous Modular UAV Missions: The Detection, Geo-Location and Landing Paradigm
Kyristsis, Sarantis; Antonopoulos, Angelos; Chanialakis, Theofilos; Stefanakis, Emmanouel; Linardos, Christos; Tripolitsiotis, Achilles; Partsinevelos, Panagiotis
2016-01-01
Nowadays, various unmanned aerial vehicle (UAV) applications become increasingly demanding since they require real-time, autonomous and intelligent functions. Towards this end, in the present study, a fully autonomous UAV scenario is implemented, including the tasks of area scanning, target recognition, geo-location, monitoring, following and finally landing on a high speed moving platform. The underlying methodology includes AprilTag target identification through Graphics Processing Unit (GPU) parallelized processing, image processing and several optimized locations and approach algorithms employing gimbal movement, Global Navigation Satellite System (GNSS) readings and UAV navigation. For the experimentation, a commercial and a custom made quad-copter prototype were used, portraying a high and a low-computational embedded platform alternative. Among the successful targeting and follow procedures, it is shown that the landing approach can be successfully performed even under high platform speeds. PMID:27827883
Towards Autonomous Modular UAV Missions: The Detection, Geo-Location and Landing Paradigm.
Kyristsis, Sarantis; Antonopoulos, Angelos; Chanialakis, Theofilos; Stefanakis, Emmanouel; Linardos, Christos; Tripolitsiotis, Achilles; Partsinevelos, Panagiotis
2016-11-03
Nowadays, various unmanned aerial vehicle (UAV) applications become increasingly demanding since they require real-time, autonomous and intelligent functions. Towards this end, in the present study, a fully autonomous UAV scenario is implemented, including the tasks of area scanning, target recognition, geo-location, monitoring, following and finally landing on a high speed moving platform. The underlying methodology includes AprilTag target identification through Graphics Processing Unit (GPU) parallelized processing, image processing and several optimized locations and approach algorithms employing gimbal movement, Global Navigation Satellite System (GNSS) readings and UAV navigation. For the experimentation, a commercial and a custom made quad-copter prototype were used, portraying a high and a low-computational embedded platform alternative. Among the successful targeting and follow procedures, it is shown that the landing approach can be successfully performed even under high platform speeds.
A methodology to modify land uses in a transit oriented development scenario.
Sahu, Akshay
2018-05-01
Developing nations are adopting transit oriented development (TOD) strategies to decongest their transportation systems. These strategies are often adopted after the preparation of land use plans. The goal of this study was to build a methodology to modify these land uses using soft computing. This can help to achieve alternate land use plans relevant to TOD. The methodology incorporates TOD characteristics and objectives. Global TOD parameters (density, diversity, and distance to transit) were studied. Expert opinions gave weights and ranges for the parameters in an Indian TOD scenario. Rules to allocate land use was developed. Objective functions were defined. Four objectives were used. First was to maximize employment density, residential density and percent of mix land use. Second was to shape density and diversity with respect to distance. Third was to minimize degree of land use change, and fourth was to increase compactness of the land use allocation. The methodology was applied to two sectors of Naya Raipur, the new planned administrative capital of the state of Chhattisgarh, India. The city has implemented TOD in the form of Bus rapid transit system (BRTS) over an existing land use. Thousand random plans were generated through the methodology. Top 30 plans were selected as parent population for modifications through genetic algorithm (GA). Alternate plans were generated at the end of GA cycle. The best alternate plan was compared with successful BRTS and TOD land uses for its merits and demerits. It was also compared with the initial land use plan for empirical validation. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Goodwin, Graham. C.; Medioli, Adrian. M.
2013-08-01
Model predictive control has been a major success story in process control. More recently, the methodology has been used in other contexts, including automotive engine control, power electronics and telecommunications. Most applications focus on set-point tracking and use single-sequence optimisation. Here we consider an alternative class of problems motivated by the scheduling of emergency vehicles. Here disturbances are the dominant feature. We develop a novel closed-loop model predictive control strategy aimed at this class of problems. We motivate, and illustrate, the ideas via the problem of fluid deployment of ambulance resources.
An experiment-based comparative study of fuzzy logic control
NASA Technical Reports Server (NTRS)
Berenji, Hamid R.; Chen, Yung-Yaw; Lee, Chuen-Chein; Murugesan, S.; Jang, Jyh-Shing
1989-01-01
An approach is presented to the control of a dynamic physical system through the use of approximate reasoning. The approach has been implemented in a program named POLE, and the authors have successfully built a prototype hardware system to solve the cartpole balancing problem in real-time. The approach provides a complementary alternative to the conventional analytical control methodology and is of substantial use when a precise mathematical model of the process being controlled is not available. A set of criteria for comparing controllers based on approximate reasoning and those based on conventional control schemes is furnished.
Documentation of volume 3 of the 1978 Energy Information Administration annual report to congress
NASA Astrophysics Data System (ADS)
1980-02-01
In a preliminary overview of the projection process, the relationship between energy prices, supply, and demand is addressed. Topics treated in detail include a description of energy economic interactions, assumptions regarding world oil prices, and energy modeling in the long term beyond 1995. Subsequent sections present the general approach and methodology underlying the forecasts, and define and describe the alternative projection series and their associated assumptions. Short term forecasting, midterm forecasting, long term forecasting of petroleum, coal, and gas supplies are included. The role of nuclear power as an energy source is also discussed.
Multi-objective game-theory models for conflict analysis in reservoir watershed management.
Lee, Chih-Sheng
2012-05-01
This study focuses on the development of a multi-objective game-theory model (MOGM) for balancing economic and environmental concerns in reservoir watershed management and for assistance in decision. Game theory is used as an alternative tool for analyzing strategic interaction between economic development (land use and development) and environmental protection (water-quality protection and eutrophication control). Geographic information system is used to concisely illustrate and calculate the areas of various land use types. The MOGM methodology is illustrated in a case study of multi-objective watershed management in the Tseng-Wen reservoir, Taiwan. The innovation and advantages of MOGM can be seen in the results, which balance economic and environmental concerns in watershed management and which can be interpreted easily by decision makers. For comparison, the decision-making process using conventional multi-objective method to produce many alternatives was found to be more difficult. Copyright © 2012 Elsevier Ltd. All rights reserved.
Assembling Appliances Standards from a Basket of Functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siderious, Hans-Paul; Meier, Alan
2014-08-11
Rapid innovation in product design challenges the current methodology for setting standards and labels, especially for electronics, software and networking. Major problems include defining the product, measuring its energy consumption, and choosing the appropriate metric and level for the standard. Most governments have tried to solve these problems by defining ever more specific product subcategories, along with their corresponding test methods and metrics. An alternative approach would treat each energy-using product as something that delivers a basket of functions. Then separate standards would be constructed for the individual functions that can be defined, tested, and evaluated. Case studies of thermostats,more » displays and network equipment are presented to illustrate the problems with the classical approach for setting standards and indicate the merits and drawbacks of the alternative. The functional approach appears best suited to products whose primary purpose is processing information and that have multiple functions.« less
Energy conservation and the transportation sector
NASA Technical Reports Server (NTRS)
1975-01-01
The present status of the energy implications of the transportation systems in the United States was illustrated, with primary emphasis on the technologies and methods for achieving a substantial reduction in the associated energy price (approximately 25% of the nation's energy is consumed directly in the operation of these systems). These technologies may be classified as follows: (1) improvement of system efficiency (system operations or technological), (2) substitution for scarce energy resources (electrification, alternate fuels, use of man power, recycling), (3) curtailment of end use (managed population growth rate, education of citizenry, alternatives to personal transportation, improved urban planning, reduced travel incentives). Examples and illustrations were given. Thirty-four actions were chosen on the basis of a preliminary filtering process with the objective of: (1) demonstrating a methodological approach to arrive at logical and consistent conservation action packages, (2) recommending a viable and supportable specific set of actions.
Bonadio, Federica; Margot, Pierre; Delémont, Olivier; Esseiva, Pierre
2008-11-20
Headspace solid-phase microextraction (HS-SPME) is assessed as an alternative to liquid-liquid extraction (LLE) currently used for 3,4-methylenedioxymethampethamine (MDMA) profiling. Both methods were compared evaluating their performance in discriminating and classifying samples. For this purpose 62 different seizures were analysed using both extraction techniques followed by gas chromatography-mass spectroscopy (GC-MS). A previously validated method provided data for HS-SPME, whereas LLE data were collected applying a harmonized methodology developed and used in the European project CHAMP. After suitable pre-treatment, similarities between sample pairs were studied using the Pearson correlation. Both methods enable to distinguish between samples coming from the same pre-tabletting batches and samples coming from different pre-tabletting batches. This finding emphasizes the use of HS-SPME as an effective alternative to LLE, with additional advantages such as sample preparation and a solvent-free process.
Escalante-Aburto, Anayansi; Ramírez-Wong, Benjamín; Torres-Chávez, Patricia Isabel; López-Cervantes, Jaime; Figueroa-Cárdenas, Juan de Dios; Barrón-Hoyos, Jesús Manuel; Morales-Rosas, Ignacio; Ponce-García, Néstor; Gutiérrez-Dorado, Roberto
2014-12-15
Extrusion is an alternative technology for the production of nixtamalized products. The aim of this study was to obtain an expanded nixtamalized snack with whole blue corn and using the extrusion process, to preserve the highest possible total anthocyanin content, intense blue/purple coloration (color b) and the highest expansion index. A central composite experimental design was used. The extrusion process factors were: feed moisture (FM, 15%-23%), calcium hydroxide concentration (CHC, 0%-0.25%) and final extruder temperature (T, 110-150 °C). The chemical and physical properties evaluated in the extrudates were moisture content (MC, %), total anthocyanins (TA, mg·kg(-1)), pH, color (L, a, b) and expansion index (EI). ANOVA and surface response methodology were applied to evaluate the effects of the extrusion factors. FM and T significantly affected the response variables. An optimization step was performed by overlaying three contour plots to predict the best combination region. The extrudates were obtained under the following optimum factors: FM (%) = 16.94, CHC (%) = 0.095 and T (°C) = 141.89. The predicted extrusion processing factors were highly accurate, yielding an expanded nixtamalized snack with 158.87 mg·kg(-1) TA (estimated: 160 mg·kg(-1)), an EI of 3.19 (estimated: 2.66), and color parameter b of -0.44 (estimated: 0.10).
Prospects of pyrolysis oil from plastic waste as fuel for diesel engines: A review
NASA Astrophysics Data System (ADS)
Mangesh, V. L.; Padmanabhan, S.; Ganesan, S.; PrabhudevRahul, D.; Reddy, T. Dinesh Kumar
2017-05-01
The purpose ofthis study is to review the existing literature about chemical recycling of plastic waste and its potential as fuel for diesel engines. This is a review covering on the field of converting waste plastics into liquid hydrocarbon fuels for diesel engines. Disposal and recycling of waste plastics have become an incremental problem and environmental threat with increasing demand for plastics. One of the effective measures is by converting waste plastic into combustible hydrocarbon liquid as an alternative fuel for running diesel engines. Continued research efforts have been taken by researchers to convert waste plastic in to combustible pyrolysis oil as alternate fuel for diesel engines. An existing literature focuses on the study of chemical structure of the waste plastic pyrolysis compared with diesel oil. Converting waste plastics into fuel oil by different catalysts in catalytic pyrolysis process also reviewed in this paper. The methodology with subsequent hydro treating and hydrocracking of waste plastic pyrolysis oil can reduce unsaturated hydrocarbon bonds which would improve the combustion performance in diesel engines as an alternate fuel.
A routinely applied atmospheric dispersion model was modified to evaluate alternative modeling techniques which allowed for more detailed source data, onsite meteorological data, and several dispersion methodologies. These were evaluated with hourly SO2 concentrations measured at...
Single-Case Experimental Designs: A Systematic Review of Published Research and Current Standards
ERIC Educational Resources Information Center
Smith, Justin D.
2012-01-01
This article systematically reviews the research design and methodological characteristics of single-case experimental design (SCED) research published in peer-reviewed journals between 2000 and 2010. SCEDs provide researchers with a flexible and viable alternative to group designs with large sample sizes. However, methodological challenges have…
Evaluating the Role of Intermediaries in the Electronic Value Chain.
ERIC Educational Resources Information Center
Janssen, Marijn; Sol, Henk G.
2000-01-01
Presents a business engineering methodology that supports the identification of electronic intermediary roles in the electronic value chain. The goal of this methodology is to give stakeholders insight into their current, and possible alternative, situations by means of visualization, to evaluate the added value of business models using…
A Comparison of Two Methods of Determining Interrater Reliability
ERIC Educational Resources Information Center
Fleming, Judith A.; Taylor, Janeen McCracken; Carran, Deborah
2004-01-01
This article offers an alternative methodology for practitioners and researchers to use in establishing interrater reliability for testing purposes. The majority of studies on interrater reliability use a traditional methodology where by two raters are compared using a Pearson product-moment correlation. This traditional method of estimating…
Beyond Composite Scores and Cronbach's Alpha: Advancing Methodological Rigor in Recreation Research
ERIC Educational Resources Information Center
Gagnon, Ryan J.; Stone, Garrett A.; Garst, Barry A.
2017-01-01
Critically examining common statistical approaches and their strengths and weaknesses is an important step in advancing recreation and leisure sciences. To continue this critical examination and to inform methodological decision making, this study compared three approaches to determine how alternative approaches may result in contradictory…
Innovative and Alternative Technology Assessment Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
1980-02-01
This four chapter, six appendix manual presents the procedures and methodology as well as the baseline costs and energy information necessary for the analysis and evaluation of innovative and alternative technology applications submitted for federal grant assistance under the innovative and alternative technology provisions of the Clean Water Act of 1977. The manual clarifies and interprets the intent of Congress and the Environmental Protection Agency in carrying out the mandates of the innovative and alternative provisions of the Clean Water Act of 1977. [DJE 2005
Towards a Lakatosian Analysis of the Piagetian and Alternative Conceptions Research Programs.
ERIC Educational Resources Information Center
Gilbert, John K.; Swift, David J.
1985-01-01
Lakatos's methodology of scientific research programs is summarized and discussed for Piagetian schools and alternative conceptions movement. Commonalities/differences between these two rival programs are presented along with fundamental assumptions, auxiliary hypotheses, and research policy. Suggests that research findings should not be merely…
Promoting energy efficiency through improved electricity pricing: A mid-project report
NASA Astrophysics Data System (ADS)
Action, J. P.; Kohler, D. F.; Mitchell, B. M.; Park, R. E.
1982-03-01
Five related areas of electricity demand analysis under alternative rate forms were studied. Adjustments by large commercial and industrial customers are examined. Residential demand under time of day (TOD) pricing is examined. A methodology for evaluating alternative rate structures is developed and applied.
An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications.
d'Acierno, Antonio; Esposito, Massimo; De Pietro, Giuseppe
2013-01-01
The diagnosis of many diseases can be often formulated as a decision problem; uncertainty affects these problems so that many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus to improve the quality of the whole process. Fuzzy logic, a well established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on the top of fuzzy knowledge extracted from data. We carefully refine and formalize our methodology that includes six stages, where the first three stages work with crisp rules, whereas the last three ones are employed on fuzzy models. Its strength relies on its generality and modularity since it supports the integration of alternative techniques in each of its stages. The methodology is designed and implemented in the form of a modular and portable software architecture according to a component-based approach. The architecture is deeply described and a summary inspection of the main components in terms of UML diagrams is outlined as well. A first implementation of the architecture has been then realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. The results prove the feasibility of the whole methodology implemented in terms of the architecture proposed.
An extensible six-step methodology to automatically generate fuzzy DSSs for diagnostic applications
2013-01-01
Background The diagnosis of many diseases can be often formulated as a decision problem; uncertainty affects these problems so that many computerized Diagnostic Decision Support Systems (in the following, DDSSs) have been developed to aid the physician in interpreting clinical data and thus to improve the quality of the whole process. Fuzzy logic, a well established attempt at the formalization and mechanization of human capabilities in reasoning and deciding with noisy information, can be profitably used. Recently, we informally proposed a general methodology to automatically build DDSSs on the top of fuzzy knowledge extracted from data. Methods We carefully refine and formalize our methodology that includes six stages, where the first three stages work with crisp rules, whereas the last three ones are employed on fuzzy models. Its strength relies on its generality and modularity since it supports the integration of alternative techniques in each of its stages. Results The methodology is designed and implemented in the form of a modular and portable software architecture according to a component-based approach. The architecture is deeply described and a summary inspection of the main components in terms of UML diagrams is outlined as well. A first implementation of the architecture has been then realized in Java following the object-oriented paradigm and used to instantiate a DDSS example aimed at accurately diagnosing breast masses as a proof of concept. Conclusions The results prove the feasibility of the whole methodology implemented in terms of the architecture proposed. PMID:23368970
Hayati, Elyas; Majnounian, Baris; Abdi, Ehsan; Sessions, John; Makhdoum, Majid
2013-02-01
Changes in forest landscapes resulting from road construction have increased remarkably in the last few years. On the other hand, the sustainable management of forest resources can only be achieved through a well-organized road network. In order to minimize the environmental impacts of forest roads, forest road managers must design the road network efficiently and environmentally as well. Efficient planning methodologies can assist forest road managers in considering the technical, economic, and environmental factors that affect forest road planning. This paper describes a three-stage methodology using the Delphi method for selecting the important criteria, the Analytic Hierarchy Process for obtaining the relative importance of the criteria, and finally, a spatial multi-criteria evaluation in a geographic information system (GIS) environment for identifying the lowest-impact road network alternative. Results of the Delphi method revealed that ground slope, lithology, distance from stream network, distance from faults, landslide susceptibility, erosion susceptibility, geology, and soil texture are the most important criteria for forest road planning in the study area. The suitability map for road planning was then obtained by combining the fuzzy map layers of these criteria with respect to their weights. Nine road network alternatives were designed using PEGGER, an ArcView GIS extension, and finally, their values were extracted from the suitability map. Results showed that the methodology was useful for identifying road that met environmental and cost considerations. Based on this work, we suggest future work in forest road planning using multi-criteria evaluation and decision making be considered in other regions and that the road planning criteria identified in this study may be useful.
Diversification of the muscle proteome through alternative splicing.
Nakka, Kiran; Ghigna, Claudia; Gabellini, Davide; Dilworth, F Jeffrey
2018-03-06
Skeletal muscles express a highly specialized proteome that allows the metabolism of energy sources to mediate myofiber contraction. This muscle-specific proteome is partially derived through the muscle-specific transcription of a subset of genes. Surprisingly, RNA sequencing technologies have also revealed a significant role for muscle-specific alternative splicing in generating protein isoforms that give specialized function to the muscle proteome. In this review, we discuss the current knowledge with respect to the mechanisms that allow pre-mRNA transcripts to undergo muscle-specific alternative splicing while identifying some of the key trans-acting splicing factors essential to the process. The importance of specific splicing events to specialized muscle function is presented along with examples in which dysregulated splicing contributes to myopathies. Though there is now an appreciation that alternative splicing is a major contributor to proteome diversification, the emergence of improved "targeted" proteomic methodologies for detection of specific protein isoforms will soon allow us to better appreciate the extent to which alternative splicing modifies the activity of proteins (and their ability to interact with other proteins) in the skeletal muscle. In addition, we highlight a continued need to better explore the signaling pathways that contribute to the temporal control of trans-acting splicing factor activity to ensure specific protein isoforms are expressed in the proper cellular context. An understanding of the signal-dependent and signal-independent events driving muscle-specific alternative splicing has the potential to provide us with novel therapeutic strategies to treat different myopathies.
Meeting Report: Alternatives for Developmental Neurotoxicity Testing
Lein, Pamela; Locke, Paul; Goldberg, Alan
2007-01-01
Developmental neurotoxicity testing (DNT) is perceived by many stakeholders to be an area in critical need of alternatives to current animal testing protocols and guidelines. To address this need, the Johns Hopkins Center for Alternatives to Animal Testing (CAAT), the U.S. Environmental Protection Agency, and the National Toxicology Program are collaborating in a program called TestSmart DNT, the goals of which are to: (a) develop alternative methodologies for identifying and prioritizing chemicals and exposures that may cause developmental neurotoxicity in humans; (b) develop the policies for incorporating DNT alternatives into regulatory decision making; and (c) identify opportunities for reducing, refining, or replacing the use of animals in DNT. The first TestSmart DNT workshop was an open registration meeting held 13–15 March 2006 in Reston, Virginia. The primary objective was to bring together stakeholders (test developers, test users, regulators, and advocates for children’s health, animal welfare, and environmental health) and individuals representing diverse disciplines (developmental neurobiology, toxicology, policy, and regulatory science) from around the world to share information and concerns relating to the science and policy of DNT. Individual presentations are available at the CAAT TestSmart website. This report provides a synthesis of workgroup discussions and recommendations for future directions and priorities, which include initiating a systematic evaluation of alternative models and technologies, developing a framework for the creation of an open database to catalog DNT data, and devising a strategy for harmonizing the validation process across international jurisdictional borders. PMID:17520065
Meeting report: alternatives for developmental neurotoxicity testing.
Lein, Pamela; Locke, Paul; Goldberg, Alan
2007-05-01
Developmental neurotoxicity testing (DNT) is perceived by many stakeholders to be an area in critical need of alternatives to current animal testing protocols and guidelines. To address this need, the Johns Hopkins Center for Alternatives to Animal Testing (CAAT), the U.S. Environmental Protection Agency, and the National Toxicology Program are collaborating in a program called TestSmart DNT, the goals of which are to: (a) develop alternative methodologies for identifying and prioritizing chemicals and exposures that may cause developmental neurotoxicity in humans; (b) develop the policies for incorporating DNT alternatives into regulatory decision making; and (c) identify opportunities for reducing, refining, or replacing the use of animals in DNT. The first TestSmart DNT workshop was an open registration meeting held 13-15 March 2006 in Reston, Virginia. The primary objective was to bring together stakeholders (test developers, test users, regulators, and advocates for children's health, animal welfare, and environmental health) and individuals representing diverse disciplines (developmental neurobiology, toxicology, policy, and regulatory science) from around the world to share information and concerns relating to the science and policy of DNT. Individual presentations are available at the CAAT TestSmart website. This report provides a synthesis of workgroup discussions and recommendations for future directions and priorities, which include initiating a systematic evaluation of alternative models and technologies, developing a framework for the creation of an open database to catalog DNT data, and devising a strategy for harmonizing the validation process across international jurisdictional borders.
Grounded theory: a methodological spiral from positivism to postmodernism.
Mills, Jane; Chapman, Ysanne; Bonner, Ann; Francis, Karen
2007-04-01
Our aim in this paper is to explain a methodological/methods package devised to incorporate situational and social world mapping with frame analysis, based on a grounded theory study of Australian rural nurses' experiences of mentoring. Situational analysis, as conceived by Adele Clarke, shifts the research methodology of grounded theory from being located within a postpositivist paradigm to a postmodern paradigm. Clarke uses three types of maps during this process: situational, social world and positional, in combination with discourse analysis. During our grounded theory study, the process of concurrent interview data generation and analysis incorporated situational and social world mapping techniques. An outcome of this was our increased awareness of how outside actors influenced participants in their constructions of mentoring. In our attempts to use Clarke's methodological package, however, it became apparent that our constructivist beliefs about human agency could not be reconciled with the postmodern project of discourse analysis. We then turned to the literature on symbolic interactionism and adopted frame analysis as a method to examine the literature on rural nursing and mentoring as secondary form of data. While we found situational and social world mapping very useful, we were less successful in using positional maps. In retrospect, we would argue that collective action framing provides an alternative to analysing such positions in the literature. This is particularly so for researchers who locate themselves within a constructivist paradigm, and who are therefore unwilling to reject the notion of human agency and the ability of individuals to shape their world in some way. Our example of using this package of situational and social worlds mapping with frame analysis is intended to assist other researchers to locate participants more transparently in the social worlds that they negotiate in their everyday practice.
ERIC Educational Resources Information Center
Breault, Holly
2017-01-01
The purpose of this study was to investigate the effect of the HELPS Program on the reading fluency skills of secondary level students attending an alternative education program using single case design methodology. Participants in this study included one 8th grade student and two 9th grade students attending an alternative education program in…
Estimating Soil Hydraulic Parameters using Gradient Based Approach
NASA Astrophysics Data System (ADS)
Rai, P. K.; Tripathi, S.
2017-12-01
The conventional way of estimating parameters of a differential equation is to minimize the error between the observations and their estimates. The estimates are produced from forward solution (numerical or analytical) of differential equation assuming a set of parameters. Parameter estimation using the conventional approach requires high computational cost, setting-up of initial and boundary conditions, and formation of difference equations in case the forward solution is obtained numerically. Gaussian process based approaches like Gaussian Process Ordinary Differential Equation (GPODE) and Adaptive Gradient Matching (AGM) have been developed to estimate the parameters of Ordinary Differential Equations without explicitly solving them. Claims have been made that these approaches can straightforwardly be extended to Partial Differential Equations; however, it has been never demonstrated. This study extends AGM approach to PDEs and applies it for estimating parameters of Richards equation. Unlike the conventional approach, the AGM approach does not require setting-up of initial and boundary conditions explicitly, which is often difficult in real world application of Richards equation. The developed methodology was applied to synthetic soil moisture data. It was seen that the proposed methodology can estimate the soil hydraulic parameters correctly and can be a potential alternative to the conventional method.
Levidow, Les; Lindgaard-Jørgensen, Palle; Nilsson, Asa; Skenhall, Sara Alongi; Assimacopoulos, Dionysis
2014-01-01
The well-known eco-efficiency concept helps to assess the economic value and resource burdens of potential improvements by comparison with the baseline situation. But eco-efficiency assessments have generally focused on a specific site, while neglecting wider effects, for example, through interactions between water users and wastewater treatment (WWT) providers. To address the methodological gap, the EcoWater project has developed a method and online tools for meso-level analysis of the entire water-service value chain. This study investigated improvement options in two large manufacturing companies which have significant potential for eco-efficiency gains. They have been considering investment in extra processes which can lower resource burdens from inputs and wastewater, as well as internalising WWT processes. In developing its methodology, the EcoWater project obtained the necessary information from many agents, involved them in the meso-level assessment and facilitated their discussion on alternative options. Prior discussions with stakeholders stimulated their attendance at a workshop to discuss a comparative eco-efficiency assessment for whole-system improvement. Stakeholders expressed interest in jointly extending the EcoWater method to more options and in discussing investment strategies. In such ways, optimal solutions will depend on stakeholders overcoming fragmentation by sharing responsibility and knowledge.
Argumentation in Science Education: A Model-based Framework
NASA Astrophysics Data System (ADS)
Böttcher, Florian; Meisert, Anke
2011-02-01
The goal of this article is threefold: First, the theoretical background for a model-based framework of argumentation to describe and evaluate argumentative processes in science education is presented. Based on the general model-based perspective in cognitive science and the philosophy of science, it is proposed to understand arguments as reasons for the appropriateness of a theoretical model which explains a certain phenomenon. Argumentation is considered to be the process of the critical evaluation of such a model if necessary in relation to alternative models. Secondly, some methodological details are exemplified for the use of a model-based analysis in the concrete classroom context. Third, the application of the approach in comparison with other analytical models will be presented to demonstrate the explicatory power and depth of the model-based perspective. Primarily, the framework of Toulmin to structurally analyse arguments is contrasted with the approach presented here. It will be demonstrated how common methodological and theoretical problems in the context of Toulmin's framework can be overcome through a model-based perspective. Additionally, a second more complex argumentative sequence will also be analysed according to the invented analytical scheme to give a broader impression of its potential in practical use.
Agzenai, Yahya; Pozuelo, Javier; Sanz, Javier; Perez, Ignacio; Baselga, Juan
2015-01-01
In an effort to give a global view of this field of research, in this mini-review we highlight the most recent publications and patents focusing on modified asphalt pavements that contain certain reinforcing nanoparticles which impart desirable thermal, electrical and mechanical properties. In response to the increasing cost of asphalt binder and road maintenance, there is a need to look for alternative technologies and new asphalt composites, able to self-repair, for preserving and renewing the existing pavements. First, we will focus on the self-healing property of asphalt, the evidences that support that healing takes place immediately after the contact between the faces of a crack, and how the amount of healing can be measured in both the laboratory and the field. Next we review the hypothetical mechanisms of healing to understand the material behaviour and establish models to quantify the damage-healing process. Thereafter, we outline different technologies, nanotechnologies and methodologies used for self-healing paying particular attention to embedded micro-capsules, new nano-materials like carbon nanotubes and nano-fibres, ionomers, and microwave and induction heating processes.
Comparing Alternative Instruments to Measure Service Quality in Higher Education
ERIC Educational Resources Information Center
Brochado, Ana
2009-01-01
Purpose: The purpose of this paper is to examine the performance of five alternative measures of service quality in the high education sector--service quality (SERVQUAL), importance-weighted SERVQUAL, service performance (SERVPERF), importance-weighted SERVPERF, and higher education performance (HEdPERF). Design/methodology/approach: Data were…
Propensity Score Analysis: An Alternative Statistical Approach for HRD Researchers
ERIC Educational Resources Information Center
Keiffer, Greggory L.; Lane, Forrest C.
2016-01-01
Purpose: This paper aims to introduce matching in propensity score analysis (PSA) as an alternative statistical approach for researchers looking to make causal inferences using intact groups. Design/methodology/approach: An illustrative example demonstrated the varying results of analysis of variance, analysis of covariance and PSA on a heuristic…
Laso, J; Margallo, M; Celaya, J; Fullana, P; Bala, A; Gazulla, C; Irabien, A; Aldaco, R
2016-08-01
The anchovy canning industry has high importance in the Cantabria Region (North Spain) from economic, social and touristic points of view. The Cantabrian canned anchovy is world-renowned owing to its handmade and traditional manufacture. The canning process generates huge amounts of several food wastes, whose suitable management can contribute to benefits for both the environment and the economy, closing the loop of the product life cycle. Life cycle assessment methodology was used in this work to assess the environmental performance of two waste management alternatives: Head and spine valorisation to produce fishmeal and fish oil; and anchovy meat valorisation to produce anchovy paste. Fuel oil production has been a hotspot of the valorisation of heads and spines, so several improvements should be applied. With respect to anchovy meat valorisation, the production of polypropylene and glass for packaging was the least environmentally friendly aspect of the process. Furthermore, the environmental characterisation of anchovy waste valorisation was compared with incineration and landfilling alternatives. In both cases, the valorisation management options were the best owing to the avoided burdens associated with the processes. Therefore, it is possible to contribute to the circular economy in the Cantabrian canned anchovy industry. © The Author(s) 2016.
NASA Astrophysics Data System (ADS)
Ziemińska-Stolarska, Aleksandra; Barecka, Magda; Zbiciński, Ireneusz
2017-10-01
Abundant use of natural resources is doubtlessly one of the greatest challenges of sustainable development. Process alternatives, which enable sustainable manufacturing of valuable products from more accessible resources, are consequently required. One of examples of limited resources is Indium, currently broadly used for tin doped indium oxide (ITO) for production of transparent conductive films (TCO) in electronics industry. Therefore, candidates for Indium replacement, which would offer as good performance as the industrial state-of-the-art technology based on ITO are widely studied. However, the environmental impact of new layers remains unknown. Hence, this paper studies the environmental effect of ITO replacement by zinc oxide (ZnO) by means life cycle assessment (LCA) methodology. The analysis enables to quantify the environmental impact over the entire period of life cycle of products—during manufacturing, use phase and waste generation. The analysis was based on experimental data for deposition process. Further, analysis of different impact categories was performed in order to determine specific environmental effects related to technology change. What results from the analysis, is that ZnO is a robust alternative material for ITO replacement regarding environmental load and energy efficiency of deposition process which is also crucial for sustainable TCO layer production.
[Epistemological/methodological contributions to the fortification of an emancipatory con(science)].
Ferreira, Marcelo José Monteiro; Rigotto, Raquel Maria
2014-10-01
This article conducts a critical and reflective analysis into the paths of elaboration, sistematization and communication of the results of research in conjunction with colleges, social movements and individuals in the territory under scrutiny. For this, the article embraces as the core analytical theme the process of shared production of knowledge, both in the epistemological-methodological field and with respect to its social destination. The case study was adopted as the methodology, preceded by the use of focused groups and in-depth interviews as technique. To analyze the qualitative material discourse analysis was adopted in line with the assumptions of in-depth hermeneutics. The results are presented in two stages: Firstly, the new possibilities for a paradigmatic reorientation are discussed from the permanent and procedural interlocution with the empirical field and it's different contexts and authors. Secondly, it analyzes in the praxiological dimension, the distinct ways of appropriation of knowledge produced in dialogue with the social movements and the individuals in the territory under scrutiny. It concludes by highlighting alternative and innovative paths to an edifying academic practice. which stresses solidarity and is sensitive to the vulnerable population and its requests.
NASA Technical Reports Server (NTRS)
Chen, Xiaoqin; Tamma, Kumar K.; Sha, Desong
1993-01-01
The present paper describes a new explicit virtual-pulse time integral methodology for nonlinear structural dynamics problems. The purpose of the paper is to provide the theoretical basis of the methodology and to demonstrate applicability of the proposed formulations to nonlinear dynamic structures. Different from the existing numerical methods such as direct time integrations or mode superposition techniques, the proposed methodology offers new perspectives and methodology of development, and possesses several unique and attractive computational characteristics. The methodology is tested and compared with the implicit Newmark method (trapezoidal rule) through a nonlinear softening and hardening spring dynamic models. The numerical results indicate that the proposed explicit virtual-pulse time integral methodology is an excellent alternative for solving general nonlinear dynamic problems.
Application of cellulose nanofibers to remove water-based flexographic inks from wastewaters.
Balea, Ana; Monte, M Concepción; de la Fuente, Elena; Negro, Carlos; Blanco, Ángeles
2017-02-01
Water-based or flexographic inks in paper and plastic industries are more environmentally favourable than organic solvent-based inks. However, their use also creates new challenges because they remain dissolved in water and alter the recycling process. Conventional deinking technologies such as flotation processes do not effectively remove them. Adsorption, coagulation/flocculation, biological and membrane processes are either expensive or have negative health impacts, making the development of alternative methods necessary. Cellulose nanofibers (CNF) are biodegradable, and their structural and mechanical properties are useful for wastewater treatment. TEMPO-oxidised CNF have been evaluated for the decolourisation of wastewaters that contained copper phthalocyanine blue, carbon black and diarlyide yellow pigments. CNF in combination with a cationic polyacrylamide (cPAM) has also been tested. Jar-test methodology was used to evaluate the efficiency of the different treatments and cationic/anionic demand, turbidity and ink concentration in waters were measured. Results show that dual-component system for ink removal has a high potential as an alternative bio-based adsorbent for the removal of water-based inks. In addition, experiments varying CNF and cPAM concentrations were performed to optimise the ink-removal process. Ink concentration reductions of 100%, 87.5% and 83.3% were achieved for copper phthalocyanine blue, carbon black and diarlyide yellow pigments, respectively. Flocculation studies carried out show the decolourisation mechanism during the dual-component treatment of wastewaters containing water-based inks.
Inquiry-Based Learning for Older People at a University in Spain
ERIC Educational Resources Information Center
Martorell, Ingrid; Medrano, Marc; Sole, Cristian; Vila, Neus; Cabeza, Luisa F.
2009-01-01
With the increasing number of older people in the world and their interest in education, universities play an important role in providing effective learning methodologies. This paper presents a new instructional methodology implementing inquiry-based learning (IBL) in two courses focused on alternative energies in the Program for Older People at…
An Alternative Methodology for Creating Parallel Test Forms Using the IRT Information Function.
ERIC Educational Resources Information Center
Ackerman, Terry A.
The purpose of this paper is to report results on the development of a new computer-assisted methodology for creating parallel test forms using the item response theory (IRT) information function. Recently, several researchers have approached test construction from a mathematical programming perspective. However, these procedures require…
Development of a multi-criteria evaluation system to assess growing pig welfare.
Martín, P; Traulsen, I; Buxadé, C; Krieter, J
2017-03-01
The aim of this paper was to present an alternative multi-criteria evaluation model to assess animal welfare on farms based on the Welfare Quality® (WQ) project, using an example of welfare assessment of growing pigs. The WQ assessment protocol follows a three-step aggregation process. Measures are aggregated into criteria, criteria into principles and principles into an overall assessment. This study focussed on the first step of the aggregation. Multi-attribute utility theory (MAUT) was used to produce a value of welfare for each criterion. The utility functions and the aggregation function were constructed in two separated steps. The Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH) method was used for utility function determination and the Choquet Integral (CI) was used as an aggregation operator. The WQ decision-makers' preferences were fitted in order to construct the utility functions and to determine the CI parameters. The methods were tested with generated data sets for farms of growing pigs. Using the MAUT, similar results were obtained to the ones obtained applying the WQ protocol aggregation methods. It can be concluded that due to the use of an interactive approach such as MACBETH, this alternative methodology is more transparent and more flexible than the methodology proposed by WQ, which allows the possibility to modify the model according, for instance, to new scientific knowledge.
Ramales-Valderrama, Rosa Adriana; Vázquez-Durán, Alma; Méndez-Albores, Abraham
2016-01-01
Mycotoxin adsorption onto biomaterials is considered as a promising alternative for decontamination without harmful chemicals. In this research, the adsorption of B-aflatoxins (AFB1 and AFB2) using Pyracantha koidzumii biomasses (leaves, berries and the mixture of leaves/berries) from aqueous solutions was explored. The biosorbent was used at 0.5% (w/v) in samples spiked with 100 ng/mL of B-aflatoxin standards and incubated at 40 °C for up to 24 h. A standard biosorption methodology was employed and aflatoxins were quantified by an immunoaffinity column and UPLC methodologies. The biosorbent-aflatoxin interaction mechanism was investigated from a combination of zeta potential (ζ), Fourier transform infrared spectroscopy (FTIR) and scanning electron microscopy (SEM). The highest aflatoxin uptakes were 86% and 82% at 6 h using leaves and the mixture of leaves/berries biomasses, respectively. A moderate biosorption of 46% was attained when using berries biomass. From kinetic studies, the biosorption process is described using the first order adsorption model. Evidence from FTIR spectra suggests the participation of hydroxyl, amine, carboxyl, amide, phosphate and ketone groups in the biosorption and the mechanism was proposed to be dominated by the electrostatic interaction between the negatively charged functional groups and the positively charged aflatoxin molecules. Biosorption by P. koidzumii biomasses has been demonstrated to be an alternative to conventional systems for B-aflatoxins removal. PMID:27420096
Barbezat, Isabelle; Willener, Rita; Jenni, Giovanna; Hürlimann, Barbara; Geese, Franziska; Spichiger, Elisabeth
2017-07-01
Background: People with an indwelling urinary catheter often suffer from complications and health care professionals are regularly confronted with questions about catheter management. Clinical guidelines are widely accepted to promote evidence-based practice. In the literature, the adaptation of a guideline is described as a valid alternative to the development of a new one. Aim: To translate a guideline for the care for adults with an indwelling urinary catheter in the acute and long term care setting as well as for home care. To adapt the guideline to the Swiss context. Method: In a systematic and pragmatic process, clinical questions were identified, guidelines were searched and evaluated regarding clinical relevance and quality. After each step, the next steps were defined. Results: An English guideline was translated, adapted to the local context and supplemented. The adapted guideline was reviewed by experts, adapted again and approved. After 34 months and an investment of a total of 145 man working days, a guideline for the care for people with an indwelling urinary catheter is available for both institutions. Conclusions: Translation and adaptation of a guideline was a valuable alternative to the development of a new one; nevertheless, the efforts necessary should not be underestimated. For such a project, sufficient professional and methodological resources should be made available to achieve efficient guideline work by a constant team.
Ramales-Valderrama, Rosa Adriana; Vázquez-Durán, Alma; Méndez-Albores, Abraham
2016-07-13
Mycotoxin adsorption onto biomaterials is considered as a promising alternative for decontamination without harmful chemicals. In this research, the adsorption of B-aflatoxins (AFB₁ and AFB₂) using Pyracantha koidzumii biomasses (leaves, berries and the mixture of leaves/berries) from aqueous solutions was explored. The biosorbent was used at 0.5% (w/v) in samples spiked with 100 ng/mL of B-aflatoxin standards and incubated at 40 °C for up to 24 h. A standard biosorption methodology was employed and aflatoxins were quantified by an immunoaffinity column and UPLC methodologies. The biosorbent-aflatoxin interaction mechanism was investigated from a combination of zeta potential (ζ), Fourier transform infrared spectroscopy (FTIR) and scanning electron microscopy (SEM). The highest aflatoxin uptakes were 86% and 82% at 6 h using leaves and the mixture of leaves/berries biomasses, respectively. A moderate biosorption of 46% was attained when using berries biomass. From kinetic studies, the biosorption process is described using the first order adsorption model. Evidence from FTIR spectra suggests the participation of hydroxyl, amine, carboxyl, amide, phosphate and ketone groups in the biosorption and the mechanism was proposed to be dominated by the electrostatic interaction between the negatively charged functional groups and the positively charged aflatoxin molecules. Biosorption by P. koidzumii biomasses has been demonstrated to be an alternative to conventional systems for B-aflatoxins removal.
Validation and calibration of structural models that combine information from multiple sources.
Dahabreh, Issa J; Wong, John B; Trikalinos, Thomas A
2017-02-01
Mathematical models that attempt to capture structural relationships between their components and combine information from multiple sources are increasingly used in medicine. Areas covered: We provide an overview of methods for model validation and calibration and survey studies comparing alternative approaches. Expert commentary: Model validation entails a confrontation of models with data, background knowledge, and other models, and can inform judgments about model credibility. Calibration involves selecting parameter values to improve the agreement of model outputs with data. When the goal of modeling is quantitative inference on the effects of interventions or forecasting, calibration can be viewed as estimation. This view clarifies issues related to parameter identifiability and facilitates formal model validation and the examination of consistency among different sources of information. In contrast, when the goal of modeling is the generation of qualitative insights about the modeled phenomenon, calibration is a rather informal process for selecting inputs that result in model behavior that roughly reproduces select aspects of the modeled phenomenon and cannot be equated to an estimation procedure. Current empirical research on validation and calibration methods consists primarily of methodological appraisals or case-studies of alternative techniques and cannot address the numerous complex and multifaceted methodological decisions that modelers must make. Further research is needed on different approaches for developing and validating complex models that combine evidence from multiple sources.
NASA Technical Reports Server (NTRS)
Kirby, Michelle R.
2002-01-01
The TIES method is a forecasting environment whereby the decision-maker has the ability to easily assess and trade-off the impact of various technologies without sophisticated and time-consuming mathematical formulations. TIES provides a methodical approach where technically feasible alternatives can be identified with accuracy and speed to reduce design cycle time, and subsequently, life cycle costs, and was achieved through the use of various probabilistic methods, such as Response Surface Methodology and Monte Carlo Simulations. Furthermore, structured and systematic techniques are utilized from other fields to identify possible concepts and evaluation criteria by which comparisons can be made. This objective is achieved by employing the use of Morphological Matrices and Multi-Attribute Decision Making techniques. Through the execution of each step, a family of design alternatives for a given set of customer requirements can be identified and assessed subjectively or objectively. This methodology allows for more information (knowledge) to be brought into the earlier phases of the design process and will have direct implications on the affordability of the system. The increased knowledge allows for optimum allocation of company resources and quantitative justification for program decisions. Finally, the TIES method provided novel results and quantitative justification to facilitate decision making in the early stages of design so as to produce affordable and quality products.
Root Source Analysis/ValuStream[Trade Mark] - A Methodology for Identifying and Managing Risks
NASA Technical Reports Server (NTRS)
Brown, Richard Lee
2008-01-01
Root Source Analysis (RoSA) is a systems engineering methodology that has been developed at NASA over the past five years. It is designed to reduce costs, schedule, and technical risks by systematically examining critical assumptions and the state of the knowledge needed to bring to fruition the products that satisfy mission-driven requirements, as defined for each element of the Work (or Product) Breakdown Structure (WBS or PBS). This methodology is sometimes referred to as the ValuStream method, as inherent in the process is the linking and prioritizing of uncertainties arising from knowledge shortfalls directly to the customer's mission driven requirements. RoSA and ValuStream are synonymous terms. RoSA is not simply an alternate or improved method for identifying risks. It represents a paradigm shift. The emphasis is placed on identifying very specific knowledge shortfalls and assumptions that are the root sources of the risk (the why), rather than on assessing the WBS product(s) themselves (the what). In so doing RoSA looks forward to anticipate, identify, and prioritize knowledge shortfalls and assumptions that are likely to create significant uncertainties/ risks (as compared to Root Cause Analysis, which is most often used to look back to discover what was not known, or was assumed, that caused the failure). Experience indicates that RoSA, with its primary focus on assumptions and the state of the underlying knowledge needed to define, design, build, verify, and operate the products, can identify critical risks that historically have been missed by the usual approaches (i.e., design review process and classical risk identification methods). Further, the methodology answers four critical questions for decision makers and risk managers: 1. What s been included? 2. What's been left out? 3. How has it been validated? 4. Has the real source of the uncertainty/ risk been identified, i.e., is the perceived problem the real problem? Users of the RoSA methodology have characterized it as a true bottoms up risk assessment.
NASA Astrophysics Data System (ADS)
Domercant, Jean Charles
The combination of today's national security environment and mandated acquisition policies makes it necessary for military systems to interoperate with each other to greater degrees. This growing interdependency results in complex Systems-of-Systems (SoS) that only continue to grow in complexity to meet evolving capability needs. Thus, timely and affordable acquisition becomes more difficult, especially in the face of mounting budgetary pressures. To counter this, architecting principles must be applied to SoS design. The research objective is to develop an Architecture Real Options Complexity-Based Valuation Methodology (ARC-VM) suitable for acquisition-level decision making, where there is a stated desire for more informed tradeoffs between cost, schedule, and performance during the early phases of design. First, a framework is introduced to measure architecture complexity as it directly relates to military SoS. Development of the framework draws upon a diverse set of disciplines, including Complexity Science, software architecting, measurement theory, and utility theory. Next, a Real Options based valuation strategy is developed using techniques established for financial stock options that have recently been adapted for use in business and engineering decisions. The derived complexity measure provides architects with an objective measure of complexity that focuses on relevant complex system attributes. These attributes are related to the organization and distribution of SoS functionality and the sharing and processing of resources. The use of Real Options provides the necessary conceptual and visual framework to quantifiably and traceably combine measured architecture complexity, time-valued performance levels, as well as programmatic risks and uncertainties. An example suppression of enemy air defenses (SEAD) capability demonstrates the development and usefulness of the resulting architecture complexity & Real Options based valuation methodology. Different portfolios of candidate system types are used to generate an array of architecture alternatives that are then evaluated using an engagement model. This performance data is combined with both measured architecture complexity and programmatic data to assign an acquisition value to each alternative. This proves useful when selecting alternatives most likely to meet current and future capability needs.
2014-01-01
Background Striking a balance between the degree of model complexity and parameter identifiability, while still producing biologically feasible simulations using modelling is a major challenge in computational biology. While these two elements of model development are closely coupled, parameter fitting from measured data and analysis of model mechanisms have traditionally been performed separately and sequentially. This process produces potential mismatches between model and data complexities that can compromise the ability of computational frameworks to reveal mechanistic insights or predict new behaviour. In this study we address this issue by presenting a generic framework for combined model parameterisation, comparison of model alternatives and analysis of model mechanisms. Results The presented methodology is based on a combination of multivariate metamodelling (statistical approximation of the input–output relationships of deterministic models) and a systematic zooming into biologically feasible regions of the parameter space by iterative generation of new experimental designs and look-up of simulations in the proximity of the measured data. The parameter fitting pipeline includes an implicit sensitivity analysis and analysis of parameter identifiability, making it suitable for testing hypotheses for model reduction. Using this approach, under-constrained model parameters, as well as the coupling between parameters within the model are identified. The methodology is demonstrated by refitting the parameters of a published model of cardiac cellular mechanics using a combination of measured data and synthetic data from an alternative model of the same system. Using this approach, reduced models with simplified expressions for the tropomyosin/crossbridge kinetics were found by identification of model components that can be omitted without affecting the fit to the parameterising data. Our analysis revealed that model parameters could be constrained to a standard deviation of on average 15% of the mean values over the succeeding parameter sets. Conclusions Our results indicate that the presented approach is effective for comparing model alternatives and reducing models to the minimum complexity replicating measured data. We therefore believe that this approach has significant potential for reparameterising existing frameworks, for identification of redundant model components of large biophysical models and to increase their predictive capacity. PMID:24886522
DOE Office of Scientific and Technical Information (OSTI.GOV)
Banar, Mufide; Cokaygil, Zerrin; Ozkan, Aysun
Life cycle assessment (LCA) methodology was used to determine the optimum municipal solid waste (MSW) management strategy for Eskisehir city. Eskisehir is one of the developing cities of Turkey where a total of approximately 750 tons/day of waste is generated. An effective MSW management system is needed in this city since the generated MSW is dumped in an unregulated dumping site that has no liner, no biogas capture, etc. Therefore, five different scenarios were developed as alternatives to the current waste management system. Collection and transportation of waste, a material recovery facility (MRF), recycling, composting, incineration and landfilling processes weremore » considered in these scenarios. SimaPro7 libraries were used to obtain background data for the life cycle inventory. One ton of municipal solid waste of Eskisehir was selected as the functional unit. The alternative scenarios were compared through the CML 2000 method and these comparisons were carried out from the abiotic depletion, global warming, human toxicity, acidification, eutrophication and photochemical ozone depletion points of view. According to the comparisons and sensitivity analysis, composting scenario, S3, is the more environmentally preferable alternative. In this study waste management alternatives were investigated only on an environmental point of view. For that reason, it might be supported with other decision-making tools that consider the economic and social effects of solid waste management.« less
A combined ANP-delphi approach to evaluate sustainable tourism
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garcia-Melon, Monica, E-mail: mgarciam@dpi.upv.es; Gomez-Navarro, Tomas, E-mail: tgomez@dpi.upv.es; Acuna-Dutra, Silvia, E-mail: sacuna@unime.edu.ve
The evaluation of sustainable tourism strategies promoted by National Parks (NP) related stakeholders is a key concern for NP managers. To help them in their strategic evaluation procedures, in this paper we propose a methodology based on the Analytic Network Process and a Delphi-type judgment-ensuring procedure. The approach aims at involving stakeholders in a participatory and consensus-building process. The methodology was applied to Los Roques NP in Venezuela. The problem included three sustainable tourism strategies defined by the stakeholders: eco-efficient resorts, eco-friendly leisure activities and ecological transportation systems. Representatives of eight stakeholders participated in the methodology. 13 sustainability criteria weremore » selected. Results provide some important insights into the overall philosophy and underlying participants' conception of what sustainable development of Los Roques NP means. This conception is broadly shared by stakeholders as they coincided in the weights of most of the criteria, which were assigned individually through the questionnaire. It is particularly noteworthy that tourists and environmentalists almost fully match in their assessments of criteria but not of the alternatives. Moreover, there is a great agreement in the final assessment. This suggests that the regular contact among the different stakeholders, i.e. tourists with inhabitants, authorities with environmentalists, tour operators with representatives of the ministry, etc. has led to a common understanding of the opportunities and threats for the NP. They all agreed that the procedure enhances participation and transparency and it is a necessary source of information and support for their decisions.« less
PRA (Probabilistic Risk Assessments) Participation versus Validation
NASA Technical Reports Server (NTRS)
DeMott, Diana; Banke, Richard
2013-01-01
Probabilistic Risk Assessments (PRAs) are performed for projects or programs where the consequences of failure are highly undesirable. PRAs primarily address the level of risk those projects or programs posed during operations. PRAs are often developed after the design has been completed. Design and operational details used to develop models include approved and accepted design information regarding equipment, components, systems and failure data. This methodology basically validates the risk parameters of the project or system design. For high risk or high dollar projects, using PRA methodologies during the design process provides new opportunities to influence the design early in the project life cycle to identify, eliminate or mitigate potential risks. Identifying risk drivers before the design has been set allows the design engineers to understand the inherent risk of their current design and consider potential risk mitigation changes. This can become an iterative process where the PRA model can be used to determine if the mitigation technique is effective in reducing risk. This can result in more efficient and cost effective design changes. PRA methodology can be used to assess the risk of design alternatives and can demonstrate how major design changes or program modifications impact the overall program or project risk. PRA has been used for the last two decades to validate risk predictions and acceptability. Providing risk information which can positively influence final system and equipment design the PRA tool can also participate in design development, providing a safe and cost effective product.
Joseph, Adrian; Goldrick, Stephen; Mollet, Michael; Turner, Richard; Bender, Jean; Gruber, David; Farid, Suzanne S; Titchener-Hooker, Nigel
2017-05-01
Continuous disk-stack centrifugation is typically used for the removal of cells and cellular debris from mammalian cell culture broths at manufacturing-scale. The use of scale-down methods to characterise disk-stack centrifugation performance enables substantial reductions in material requirements and allows a much wider design space to be tested than is currently possible at pilot-scale. The process of scaling down centrifugation has historically been challenging due to the difficulties in mimicking the Energy Dissipation Rates (EDRs) in typical machines. This paper describes an alternative and easy-to-assemble automated capillary-based methodology to generate levels of EDRs consistent with those found in a continuous disk-stack centrifuge. Variations in EDR were achieved through changes in capillary internal diameter and the flow rate of operation through the capillary. The EDRs found to match the levels of shear in the feed zone of a pilot-scale centrifuge using the experimental method developed in this paper (2.4×10 5 W/Kg) are consistent with those obtained through previously published computational fluid dynamic (CFD) studies (2.0×10 5 W/Kg). Furthermore, this methodology can be incorporated into existing scale-down methods to model the process performance of continuous disk-stack centrifuges. This was demonstrated through the characterisation of culture hold time, culture temperature and EDRs on centrate quality. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Technical Reports Server (NTRS)
Wheatley, Thomas E.; Michaloski, John L.; Lumia, Ronald
1989-01-01
Analysis of a robot control system leads to a broad range of processing requirements. One fundamental requirement of a robot control system is the necessity of a microcomputer system in order to provide sufficient processing capability.The use of multiple processors in a parallel architecture is beneficial for a number of reasons, including better cost performance, modular growth, increased reliability through replication, and flexibility for testing alternate control strategies via different partitioning. A survey of the progression from low level control synchronizing primitives to higher level communication tools is presented. The system communication and control mechanisms of existing robot control systems are compared to the hierarchical control model. The impact of this design methodology on the current robot control systems is explored.
Applying the compound Poisson process model to the reporting of injury-related mortality rates.
Kegler, Scott R
2007-02-16
Injury-related mortality rate estimates are often analyzed under the assumption that case counts follow a Poisson distribution. Certain types of injury incidents occasionally involve multiple fatalities, however, resulting in dependencies between cases that are not reflected in the simple Poisson model and which can affect even basic statistical analyses. This paper explores the compound Poisson process model as an alternative, emphasizing adjustments to some commonly used interval estimators for population-based rates and rate ratios. The adjusted estimators involve relatively simple closed-form computations, which in the absence of multiple-case incidents reduce to familiar estimators based on the simpler Poisson model. Summary data from the National Violent Death Reporting System are referenced in several examples demonstrating application of the proposed methodology.
Biotechnological production of gluconic acid: future implications.
Singh, Om V; Kumar, Raj
2007-06-01
Gluconic acid (GA) is a multifunctional carbonic acid regarded as a bulk chemical in the food, feed, beverage, textile, pharmaceutical, and construction industries. The favored production process is submerged fermentation by Aspergillus niger utilizing glucose as a major carbohydrate source, which accompanied product yield of 98%. However, use of GA and its derivatives is currently restricted because of high prices: about US$ 1.20-8.50/kg. Advancements in biotechnology such as screening of microorganisms, immobilization techniques, and modifications in fermentation process for continuous fermentation, including genetic engineering programmes, could lead to cost-effective production of GA. Among alternative carbohydrate sources, sugarcane molasses, grape must show highest GA yield of 95.8%, and banana must may assist reducing the overall cost of GA production. These methodologies would open new markets and increase applications of GA.
Myoelectric signal processing for control of powered limb prostheses.
Parker, P; Englehart, K; Hudgins, B
2006-12-01
Progress in myoelectric control technology has over the years been incremental, due in part to the alternating focus of the R&D between control methodology and device hardware. The technology has over the past 50 years or so moved from single muscle control of a single prosthesis function to muscle group activity control of multifunction prostheses. Central to these changes have been developments in the means of extracting information from the myoelectric signal. This paper gives an overview of the myoelectric signal processing challenge, a brief look at the challenge from an historical perspective, the state-of-the-art in myoelectric signal processing for prosthesis control, and an indication of where this field is heading. The paper demonstrates that considerable progress has been made in providing clients with useful and reliable myoelectric communication channels, and that exciting work and developments are on the horizon.
Modeling of solar polygeneration plant
NASA Astrophysics Data System (ADS)
Leiva, Roberto; Escobar, Rodrigo; Cardemil, José
2017-06-01
In this work, a exergoeconomic analysis of the joint production of electricity, fresh water, cooling and process heat for a simulated concentrated solar power (CSP) based on parabolic trough collector (PTC) with thermal energy storage (TES) and backup energy system (BS), a multi-effect distillation (MED) module, a refrigeration absorption module, and process heat module is carried out. Polygeneration plant is simulated in northern Chile in Crucero with a yearly total DNI of 3,389 kWh/m2/year. The methodology includes designing and modeling a polygeneration plant and applying exergoeconomic evaluations and calculating levelized cost. Solar polygeneration plant is simulated hourly, in a typical meteorological year, for different solar multiple and hour of storage. This study reveals that the total exergy cost rate of products (sum of exergy cost rate of electricity, water, cooling and heat process) is an alternative method to optimize a solar polygeneration plant.
Lifetime costing of the body-in-white: Steel vs. aluminum
NASA Astrophysics Data System (ADS)
Han, Helen N.; Clark, Joel P.
1995-05-01
In order to make informed material choice decisions and to derive the maximum benefit from the use of alternative materials, the automobile producer must understand the full range of costs and benefits for each material. It is becoming clear that the conventional cost-benefit analysis structure currently used by the automotive industry must be broadened to include nontraditional costs such as the environmental externalities associated with the use of existing and potential automotive technologies. This article develops a methodology for comparing the costs and benefits associated with the use of alternative materials in automotive applications by focusing on steel and aluminum in the unibody body-in-white. Authors' Note: This is the first of two articles documenting a methodology for evaluating the lifetime monetary and environmental costs of alternative materials in automotive applications. This article addresses the traditional money costs while a subsequent paper, which is planned for the August issue, will address the environmental externalities.
Large scale nonlinear programming for the optimization of spacecraft trajectories
NASA Astrophysics Data System (ADS)
Arrieta-Camacho, Juan Jose
Despite the availability of high fidelity mathematical models, the computation of accurate optimal spacecraft trajectories has never been an easy task. While simplified models of spacecraft motion can provide useful estimates on energy requirements, sizing, and cost; the actual launch window and maneuver scheduling must rely on more accurate representations. We propose an alternative for the computation of optimal transfers that uses an accurate representation of the spacecraft dynamics. Like other methodologies for trajectory optimization, this alternative is able to consider all major disturbances. In contrast, it can handle explicitly equality and inequality constraints throughout the trajectory; it requires neither the derivation of costate equations nor the identification of the constrained arcs. The alternative consist of two steps: (1) discretizing the dynamic model using high-order collocation at Radau points, which displays numerical advantages, and (2) solution to the resulting Nonlinear Programming (NLP) problem using an interior point method, which does not suffer from the performance bottleneck associated with identifying the active set, as required by sequential quadratic programming methods; in this way the methodology exploits the availability of sound numerical methods, and next generation NLP solvers. In practice the methodology is versatile; it can be applied to a variety of aerospace problems like homing, guidance, and aircraft collision avoidance; the methodology is particularly well suited for low-thrust spacecraft trajectory optimization. Examples are presented which consider the optimization of a low-thrust orbit transfer subject to the main disturbances due to Earth's gravity field together with Lunar and Solar attraction. Other example considers the optimization of a multiple asteroid rendezvous problem. In both cases, the ability of our proposed methodology to consider non-standard objective functions and constraints is illustrated. Future research directions are identified, involving the automatic scheduling and optimization of trajectory correction maneuvers. The sensitivity information provided by the methodology is expected to be invaluable in such research pursuit. The collocation scheme and nonlinear programming algorithm presented in this work, complement other existing methodologies by providing reliable and efficient numerical methods able to handle large scale, nonlinear dynamic models.
Harrison, David; Coughlin, Conor; Hogan, Dylan; Edwards, Deborah A; Smith, Benjamin C
2018-01-01
The present paper describes a methodology for evaluating impacts of Superfund remedial alternatives on the regional economy in the context of a broader sustainability evaluation. Although economic impact methodology is well established, some applications to Superfund remedial evaluation have created confusion because of seemingly contradictory results. This confusion arises from failure to be explicit about 2 opposing impacts of remediation expenditures: 1) positive regional impacts of spending additional money in the region and 2) negative regional impacts of the need to pay for the expenditures (and thus forgo other expenditures in the region). The present paper provides a template for economic impact assessment that takes both positive and negative impacts into account, thus providing comprehensive estimates of net impacts. The paper also provides a strategy for identifying and estimating major uncertainties in the net impacts. The recommended methodology was applied at the Portland Harbor Superfund Site, located along the Lower Willamette River in Portland, Oregon, USA. The US Environmental Protection Agency (USEPA) developed remedial alternatives that it estimated would cost up to several billion dollars, with construction durations possibly lasting decades. The economic study estimated regional economic impacts-measured in terms of gross regional product (GRP), personal income, population, and employment-for 5 of the USEPA alternatives relative to the "no further action" alternative. Integr Environ Assess Manag 2018;14:32-42. © 2017 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC). © 2017 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of Society of Environmental Toxicology & Chemistry (SETAC).
Emerging technologies for the changing global market
NASA Technical Reports Server (NTRS)
Cruit, Wendy; Schutzenhofer, Scott; Goldberg, Ben; Everhart, Kurt
1993-01-01
This project served to define an appropriate methodology for effective prioritization of technology efforts required to develop replacement technologies mandated by imposed and forecast legislation. The methodology used is a semi-quantative approach derived from quality function deployment techniques (QFD Matrix). This methodology aims to weight the full environmental, cost, safety, reliability, and programmatic implications of replacement technology development to allow appropriate identification of viable candidates and programmatic alternatives. The results will be implemented as a guideline for consideration for current NASA propulsion systems.
An Alternative Approach for Nonlinear Latent Variable Models
ERIC Educational Resources Information Center
Mooijaart, Ab; Bentler, Peter M.
2010-01-01
In the last decades there has been an increasing interest in nonlinear latent variable models. Since the seminal paper of Kenny and Judd, several methods have been proposed for dealing with these kinds of models. This article introduces an alternative approach. The methodology involves fitting some third-order moments in addition to the means and…
An Alternative Study of Transfer of Learning in Clinical Evaluation.
ERIC Educational Resources Information Center
Patel, Vimla; Cranton, Patricia A.
The use of an alternative methodology to study transfer of learning in clinical instruction during medical school was investigated. The environment in which clinical instruction takes place was examined, after which hypotheses were proposed and tested in a quasi-experimental design. The first phase of the study, an ethnographic analysis of the…
A Review of Self-Report and Alternative Approaches in the Measurement of Student Motivation
ERIC Educational Resources Information Center
Fulmer, Sara M.; Frijters, Jan C.
2009-01-01
Within psychological and educational research, self-report methodology dominates the study of student motivation. The present review argues that the scope of motivation research can be expanded by incorporating a wider range of methodologies and measurement tools. Several authors have suggested that current study of motivation is overly reliant on…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-12
... a post-preliminary analysis in which we altered the cost-of- production methodology from that which... scope of the order is dispositive. Alternative Cost Methodology In our Preliminary Results we relied on... Results, 75 FR at 12516), and we compared the home-market prices to POR costs for the cost-of-production...
ERIC Educational Resources Information Center
Stephen, Timothy D.
2011-01-01
The problem of how to rank academic journals in the communication field (human interaction, mass communication, speech, and rhetoric) is one of practical importance to scholars, university administrators, and librarians, yet there is no methodology that covers the field's journals comprehensively and objectively. This article reports a new ranking…
Translating Oral Health-Related Quality of Life Measures: Are There Alternative Methodologies?
ERIC Educational Resources Information Center
Brondani, Mario; He, Sarah
2013-01-01
Translating existing sociodental indicators to another language involves a rigorous methodology, which can be costly. Free-of-charge online translator tools are available, but have not been evaluated in the context of research involving quality of life measures. To explore the value of using online translator tools to develop oral health-related…
Archaeology as a social science.
Smith, Michael E; Feinman, Gary M; Drennan, Robert D; Earle, Timothy; Morris, Ian
2012-05-15
Because of advances in methods and theory, archaeology now addresses issues central to debates in the social sciences in a far more sophisticated manner than ever before. Coupled with methodological innovations, multiscalar archaeological studies around the world have produced a wealth of new data that provide a unique perspective on long-term changes in human societies, as they document variation in human behavior and institutions before the modern era. We illustrate these points with three examples: changes in human settlements, the roles of markets and states in deep history, and changes in standards of living. Alternative pathways toward complexity suggest how common processes may operate under contrasting ecologies, populations, and economic integration.
Challenges and breakthroughs in recent research on self-assembly
Ariga, Katsuhiko; Hill, Jonathan P; Lee, Michael V; Vinu, Ajayan; Charvet, Richard; Acharya, Somobrata
2008-01-01
The controlled fabrication of nanometer-scale objects is without doubt one of the central issues in current science and technology. However, existing fabrication techniques suffer from several disadvantages including size-restrictions and a general paucity of applicable materials. Because of this, the development of alternative approaches based on supramolecular self-assembly processes is anticipated as a breakthrough methodology. This review article aims to comprehensively summarize the salient aspects of self-assembly through the introduction of the recent challenges and breakthroughs in three categories: (i) types of self-assembly in bulk media; (ii) types of components for self-assembly in bulk media; and (iii) self-assembly at interfaces. PMID:27877935
NASA Astrophysics Data System (ADS)
Nash, A. E., III
2017-12-01
The most common approaches to identifying the most effective mission design to maximize science return from a potential set of competing alternative design approaches are often inefficient and inaccurate. Recently, Team-X at the Jet Propulsion Laboratory undertook an effort to improve both the speed and quality of science - measurement - mission design trade studies. We will report on the methodology & processes employed and their effectiveness in trade study speed and quality. Our results indicate that facilitated subject matter expert peers are the keys to speed and quality improvements in the effectiveness of science - measurement - mission design trade studies.
Ribosome profiling reveals the what, when, where and how of protein synthesis.
Brar, Gloria A; Weissman, Jonathan S
2015-11-01
Ribosome profiling, which involves the deep sequencing of ribosome-protected mRNA fragments, is a powerful tool for globally monitoring translation in vivo. The method has facilitated discovery of the regulation of gene expression underlying diverse and complex biological processes, of important aspects of the mechanism of protein synthesis, and even of new proteins, by providing a systematic approach for experimental annotation of coding regions. Here, we introduce the methodology of ribosome profiling and discuss examples in which this approach has been a key factor in guiding biological discovery, including its prominent role in identifying thousands of novel translated short open reading frames and alternative translation products.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cole, Charles R.; Bergeron, Marcel P.; Wurstner, Signe K.
2001-05-31
This report describes a new initiative to strengthen the technical defensibility of predictions made with the Hanford site-wide groundwater flow and transport model. The focus is on characterizing major uncertainties in the current model. PNNL will develop and implement a calibration approach and methodology that can be used to evaluate alternative conceptual models of the Hanford aquifer system. The calibration process will involve a three-dimensional transient inverse calibration of each numerical model to historical observations of hydraulic and water quality impacts to the unconfined aquifer system from Hanford operations since the mid-1940s.
Archaeology as a social science
Smith, Michael E.; Feinman, Gary M.; Drennan, Robert D.; Earle, Timothy; Morris, Ian
2012-01-01
Because of advances in methods and theory, archaeology now addresses issues central to debates in the social sciences in a far more sophisticated manner than ever before. Coupled with methodological innovations, multiscalar archaeological studies around the world have produced a wealth of new data that provide a unique perspective on long-term changes in human societies, as they document variation in human behavior and institutions before the modern era. We illustrate these points with three examples: changes in human settlements, the roles of markets and states in deep history, and changes in standards of living. Alternative pathways toward complexity suggest how common processes may operate under contrasting ecologies, populations, and economic integration. PMID:22547811
Performance-costs evaluation for urban storm drainage.
Baptista, M; Barraud, S; Alfakih, E; Nascimento, N; Fernandes, W; Moura, P; Castro, L
2005-01-01
The design process of urban stormwater systems incorporating BMPs involves more complexity unlike the design of classic drainage systems for which just the technique of pipes is likely to be used. This paper presents a simple decision aid methodology and an associated software (AvDren) concerning urban stormwater systems, devoted to the evaluation and the comparison of drainage scenarios using BMPs according to different technical, sanitary, social environmental and economical aspects. This kind of tool is particularly interesting so as to help the decision makers to select the appropriate alternative and to plan the investments especially for developing countries, with important sanitary problems and severe budget restrictions.
On the performance of metrics to predict quality in point cloud representations
NASA Astrophysics Data System (ADS)
Alexiou, Evangelos; Ebrahimi, Touradj
2017-09-01
Point clouds are a promising alternative for immersive representation of visual contents. Recently, an increased interest has been observed in the acquisition, processing and rendering of this modality. Although subjective and objective evaluations are critical in order to assess the visual quality of media content, they still remain open problems for point cloud representation. In this paper we focus our efforts on subjective quality assessment of point cloud geometry, subject to typical types of impairments such as noise corruption and compression-like distortions. In particular, we propose a subjective methodology that is closer to real-life scenarios of point cloud visualization. The performance of the state-of-the-art objective metrics is assessed by considering the subjective scores as the ground truth. Moreover, we investigate the impact of adopting different test methodologies by comparing them. Advantages and drawbacks of every approach are reported, based on statistical analysis. The results and conclusions of this work provide useful insights that could be considered in future experimentation.
Nagy, Gergely György; Várvölgyi, Csaba; Balogh, Zoltán; Orosi, Piroska; Paragh, György
2013-01-06
The incidence of Clostridium difficile associated enteral disease shows dramatic increase worldwide, with appallingly high treatment costs, mortality figures, recurrence rates and treatment refractoriness. It is not surprising, that there is significant interest in the development and introduction of alternative therapeutic strategies. Among these only stool transplantation (or faecal bacteriotherapy) is gaining international acceptance due to its excellent cure rate (≈92%), low recurrence rate (≈6%), safety and cost-effectiveness. Unfortunately faecal transplantation is not available for most patients, although based on promising international results, its introduction into the routine clinical practice is well justified and widely expected. The authors would like to facilitate this process, by presenting a detailed faecal transplantation protocol prepared in their Institution based on the available literature and clinical rationality. Officially accepted national methodological guidelines will need to be issued in the future, founded on the expert opinion of relevant professional societies and upcoming advances in this field.
NASA Astrophysics Data System (ADS)
Avilova, I. P.; Krutilova, M. O.
2018-01-01
Economic growth is the main determinant of the trend to increased greenhouse gas (GHG) emission. Therefore, the reduction of emission and stabilization of GHG levels in the atmosphere become an urgent task to avoid the worst predicted consequences of climate change. GHG emissions in construction industry take a significant part of industrial GHG emission and are expected to consistently increase. The problem could be successfully solved with a help of both economical and organizational restrictions, based on enhanced algorithms of calculation and amercement of environmental harm in building industry. This study aims to quantify of GHG emission caused by different constructive schemes of RC framework in concrete casting. The result shows that proposed methodology allows to make a comparative analysis of alternative projects in residential housing, taking into account an environmental damage, caused by construction process. The study was carried out in the framework of the Program of flagship university development on the base of Belgorod State Technological University named after V.G. Shoukhov
Polysaccharide extraction from Sphallerocarpus gracilis roots by response surface methodology.
Ma, Tingting; Sun, Xiangyu; Tian, Chengrui; Luo, Jiyang; Zheng, Cuiping; Zhan, Jicheng
2016-07-01
The extraction process of Sphallerocarpus gracilis root polysaccharides (SGRP) was optimized using response surface methodology with two methods [hot-water extraction (HWE) and ultrasonic-assisted extraction (UAE)]. The antioxidant activities of SGRP were determined, and the structural features of the untreated materials (HWE residue and UAE residue) and the extracted polysaccharides were compared by scanning electron microscopy. Results showed that the optimal UAE conditions were extraction temperature of 81°C, extraction time of 1.7h, liquid-solid ratio of 17ml/g, ultrasonic power of 300W and three extraction cycles. The optimal HWE conditions were 93°C extraction temperature, 3.6h extraction time, 21ml/g liquid-solid ratio and three extraction cycles. UAE offered a higher extraction yield with a shorter time, lower temperature and a lower solvent consumption compared with HWE, and the extracted polysaccharides possessed a higher antioxidant capacity. Therefore, UAE could be used as an alternative to conventional HWE for SGRP extraction. Copyright © 2016 Elsevier B.V. All rights reserved.
Piccolomini, Angelica A; Fiabon, Alex; Borrotti, Matteo; De Lucrezia, Davide
2017-01-01
We optimized the heterologous expression of trans-isoprenyl diphosphate synthase (IDS), the key enzyme involved in the biosynthesis of trans-polyisoprene. trans-Polyisoprene is a particularly valuable compound due to its superior stiffness, excellent insulation, and low thermal expansion coefficient. Currently, trans-polyisoprene is mainly produced through chemical synthesis and no biotechnological processes have been established so far for its large-scale production. In this work, we employed D-optimal design and response surface methodology to optimize the expression of thermophilic enzymes IDS from Thermococcus kodakaraensis. The design of experiment took into account of six factors (preinduction cell density, inducer concentration, postinduction temperature, salt concentration, alternative carbon source, and protein inhibitor) and seven culture media (LB, NZCYM, TB, M9, Ec, Ac, and EDAVIS) at five different pH points. By screening only 109 experimental points, we were able to improve IDS production by 48% in close-batch fermentation. © 2015 International Union of Biochemistry and Molecular Biology, Inc.
Evaluation of stormwater harvesting sites using multi criteria decision methodology
NASA Astrophysics Data System (ADS)
Inamdar, P. M.; Sharma, A. K.; Cook, Stephen; Perera, B. J. C.
2018-07-01
Selection of suitable urban stormwater harvesting sites and associated project planning are often complex due to spatial, temporal, economic, environmental and social factors, and related various other variables. This paper is aimed at developing a comprehensive methodology framework for evaluating of stormwater harvesting sites in urban areas using Multi Criteria Decision Analysis (MCDA). At the first phase, framework selects potential stormwater harvesting (SWH) sites using spatial characteristics in a GIS environment. In second phase, MCDA methodology is used for evaluating and ranking of SWH sites in multi-objective and multi-stakeholder environment. The paper briefly describes first phase of framework and focuses chiefly on the second phase of framework. The application of the methodology is also demonstrated over a case study comprising of the local government area, City of Melbourne (CoM), Australia for the benefit of wider water professionals engaged in this area. Nine performance measures (PMs) were identified to characterise the objectives and system performance related to the eight alternative SWH sites for the demonstration of the application of developed methodology. To reflect the stakeholder interests in the current study, four stakeholder participant groups were identified, namely, water authorities (WA), academics (AC), consultants (CS), and councils (CL). The decision analysis methodology broadly consisted of deriving PROMETHEE II rankings of eight alternative SWH sites in the CoM case study, under two distinct group decision making scenarios. The major innovation of this work is the development and application of comprehensive methodology framework that assists in the selection of potential sites for SWH, and facilitates the ranking in multi-objective and multi-stakeholder environment. It is expected that the proposed methodology will assist the water professionals and managers with better knowledge that will reduce the subjectivity in the selection and evaluation of SWH sites.
NASA Technical Reports Server (NTRS)
Borden, C. S.; Volkmer, K.; Cochrane, E. H.; Lawson, A. C.
1984-01-01
A simple methodology to estimate photovoltaic system size and life-cycle costs in stand-alone applications is presented. It is designed to assist engineers at Government agencies in determining the feasibility of using small stand-alone photovoltaic systems to supply ac or dc power to the load. Photovoltaic system design considerations are presented as well as the equations for sizing the flat-plate array and the battery storage to meet the required load. Cost effectiveness of a candidate photovoltaic system is based on comparison with the life-cycle cost of alternative systems. Examples of alternative systems addressed are batteries, diesel generators, the utility grid, and other renewable energy systems.
Methodologies and Tools for Tuning Parallel Programs: 80% Art, 20% Science, and 10% Luck
NASA Technical Reports Server (NTRS)
Yan, Jerry C.; Bailey, David (Technical Monitor)
1996-01-01
The need for computing power has forced a migration from serial computation on a single processor to parallel processing on multiprocessors. However, without effective means to monitor (and analyze) program execution, tuning the performance of parallel programs becomes exponentially difficult as program complexity and machine size increase. In the past few years, the ubiquitous introduction of performance tuning tools from various supercomputer vendors (Intel's ParAide, TMC's PRISM, CRI's Apprentice, and Convex's CXtrace) seems to indicate the maturity of performance instrumentation/monitor/tuning technologies and vendors'/customers' recognition of their importance. However, a few important questions remain: What kind of performance bottlenecks can these tools detect (or correct)? How time consuming is the performance tuning process? What are some important technical issues that remain to be tackled in this area? This workshop reviews the fundamental concepts involved in analyzing and improving the performance of parallel and heterogeneous message-passing programs. Several alternative strategies will be contrasted, and for each we will describe how currently available tuning tools (e.g. AIMS, ParAide, PRISM, Apprentice, CXtrace, ATExpert, Pablo, IPS-2) can be used to facilitate the process. We will characterize the effectiveness of the tools and methodologies based on actual user experiences at NASA Ames Research Center. Finally, we will discuss their limitations and outline recent approaches taken by vendors and the research community to address them.
State of the art for the biosorption process--a review.
Michalak, Izabela; Chojnacka, Katarzyna; Witek-Krowiak, Anna
2013-07-01
In recent years, biosorption process has become an economic and eco-friendly alternative treatment technology in the water and wastewater industry. In this light, a number of biosorbents were developed and are successfully employed for treating various pollutants including metals, dyes, phenols, fluoride, and pharmaceuticals in solutions (aqueous/oil). However, still there are few technical barriers in the biosorption process that impede its commercialization and thus to overcome these problems there has been a steadily growing interest in this research field. This resulted in large numbers of publications and patents each year. This review reports the state of the art in biosorption research. In this review, we provide a compendium of know-how in laboratory methodology, mathematical modeling of equilibrium and kinetics, identification of the biosorption mechanism. Various mathematical models of biosorption were discussed: the process in packed-bed column arrangement, as well as by suspended biomass. Particular attention was paid to patents in biosorption and pilot-scale systems. In addition, we provided future aspects in biosorption research.
Modeling and comparative assessment of municipal solid waste gasification for energy production
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arafat, Hassan A., E-mail: harafat@masdar.ac.ae; Jijakli, Kenan
Highlights: • Study developed a methodology for the evaluation of gasification for MSW treatment. • Study was conducted comparatively for USA, UAE, and Thailand. • Study applies a thermodynamic model (Gibbs free energy minimization) using the Gasify software. • The energy efficiency of the process and the compatibility with different waste streams was studied. - Abstract: Gasification is the thermochemical conversion of organic feedstocks mainly into combustible syngas (CO and H{sub 2}) along with other constituents. It has been widely used to convert coal into gaseous energy carriers but only has been recently looked at as a process for producingmore » energy from biomass. This study explores the potential of gasification for energy production and treatment of municipal solid waste (MSW). It relies on adapting the theory governing the chemistry and kinetics of the gasification process to the use of MSW as a feedstock to the process. It also relies on an equilibrium kinetics and thermodynamics solver tool (Gasify®) in the process of modeling gasification of MSW. The effect of process temperature variation on gasifying MSW was explored and the results were compared to incineration as an alternative to gasification of MSW. Also, the assessment was performed comparatively for gasification of MSW in the United Arab Emirates, USA, and Thailand, presenting a spectrum of socioeconomic settings with varying MSW compositions in order to explore the effect of MSW composition variance on the products of gasification. All in all, this study provides an insight into the potential of gasification for the treatment of MSW and as a waste to energy alternative to incineration.« less
Statistical and engineering methods for model enhancement
NASA Astrophysics Data System (ADS)
Chang, Chia-Jung
Models which describe the performance of physical process are essential for quality prediction, experimental planning, process control and optimization. Engineering models developed based on the underlying physics/mechanics of the process such as analytic models or finite element models are widely used to capture the deterministic trend of the process. However, there usually exists stochastic randomness in the system which may introduce the discrepancy between physics-based model predictions and observations in reality. Alternatively, statistical models can be used to develop models to obtain predictions purely based on the data generated from the process. However, such models tend to perform poorly when predictions are made away from the observed data points. This dissertation contributes to model enhancement research by integrating physics-based model and statistical model to mitigate the individual drawbacks and provide models with better accuracy by combining the strengths of both models. The proposed model enhancement methodologies including the following two streams: (1) data-driven enhancement approach and (2) engineering-driven enhancement approach. Through these efforts, more adequate models are obtained, which leads to better performance in system forecasting, process monitoring and decision optimization. Among different data-driven enhancement approaches, Gaussian Process (GP) model provides a powerful methodology for calibrating a physical model in the presence of model uncertainties. However, if the data contain systematic experimental errors, the GP model can lead to an unnecessarily complex adjustment of the physical model. In Chapter 2, we proposed a novel enhancement procedure, named as “Minimal Adjustment”, which brings the physical model closer to the data by making minimal changes to it. This is achieved by approximating the GP model by a linear regression model and then applying a simultaneous variable selection of the model and experimental bias terms. Two real examples and simulations are presented to demonstrate the advantages of the proposed approach. Different from enhancing the model based on data-driven perspective, an alternative approach is to focus on adjusting the model by incorporating the additional domain or engineering knowledge when available. This often leads to models that are very simple and easy to interpret. The concepts of engineering-driven enhancement are carried out through two applications to demonstrate the proposed methodologies. In the first application where polymer composite quality is focused, nanoparticle dispersion has been identified as a crucial factor affecting the mechanical properties. Transmission Electron Microscopy (TEM) images are commonly used to represent nanoparticle dispersion without further quantifications on its characteristics. In Chapter 3, we developed the engineering-driven nonhomogeneous Poisson random field modeling strategy to characterize nanoparticle dispersion status of nanocomposite polymer, which quantitatively represents the nanomaterial quality presented through image data. The model parameters are estimated through the Bayesian MCMC technique to overcome the challenge of limited amount of accessible data due to the time consuming sampling schemes. The second application is to calibrate the engineering-driven force models of laser-assisted micro milling (LAMM) process statistically, which facilitates a systematic understanding and optimization of targeted processes. In Chapter 4, the force prediction interval has been derived by incorporating the variability in the runout parameters as well as the variability in the measured cutting forces. The experimental results indicate that the model predicts the cutting force profile with good accuracy using a 95% confidence interval. To conclude, this dissertation is the research drawing attention to model enhancement, which has considerable impacts on modeling, design, and optimization of various processes and systems. The fundamental methodologies of model enhancement are developed and further applied to various applications. These research activities developed engineering compliant models for adequate system predictions based on observational data with complex variable relationships and uncertainty, which facilitate process planning, monitoring, and real-time control.
Transshipment site selection using the AHP and TOPSIS approaches under fuzzy environment.
Onüt, Semih; Soner, Selin
2008-01-01
Site selection is an important issue in waste management. Selection of the appropriate solid waste site requires consideration of multiple alternative solutions and evaluation criteria because of system complexity. Evaluation procedures involve several objectives, and it is often necessary to compromise among possibly conflicting tangible and intangible factors. For these reasons, multiple criteria decision-making (MCDM) has been found to be a useful approach to solve this kind of problem. Different MCDM models have been applied to solve this problem. But most of them are basically mathematical and ignore qualitative and often subjective considerations. It is easier for a decision-maker to describe a value for an alternative by using linguistic terms. In the fuzzy-based method, the rating of each alternative is described using linguistic terms, which can also be expressed as triangular fuzzy numbers. Furthermore, there have not been any studies focused on the site selection in waste management using both fuzzy TOPSIS (technique for order preference by similarity to ideal solution) and AHP (analytical hierarchy process) techniques. In this paper, a fuzzy TOPSIS based methodology is applied to solve the solid waste transshipment site selection problem in Istanbul, Turkey. The criteria weights are calculated by using the AHP.
Product design enhancement using apparent usability and affective quality.
Seva, Rosemary R; Gosiaco, Katherine Grace T; Santos, Ma Crea Eurice D; Pangilinan, Denise Mae L
2011-03-01
In this study, apparent usability and affective quality were integrated in a design framework called the Usability Perception and Emotion Enhancement Model (UPEEM). The UPEEM was validated using structural equation modeling (SEM). The methodology consists of four phases namely product selection, attribute identification, design alternative generation, and design alternative evaluation. The first stage involved the selection of a product that highly involves the consumer. In the attribute identification stage, design elements of the product were identified. The possible values of these elements were also determined for use in the experimentation process. Design of experiments was used to identify how the attributes will be varied in the design alternative stage and which of the attributes significantly contribute to affective quality, apparent usability, and desirability in the design evaluation stage. Results suggest that product attributes related to form are relevant in eliciting intense affect and perception of usability in mobile phones especially those directly related to functionality and aesthetics. This study considered only four product attributes among so many due to the constraints of the research design employed. Attributes related to aesthetic perception of a product enhance apparent usability such as those related to dimensional ratios. Copyright © 2010 Elsevier Ltd and The Ergonomics Society. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Onuet, Semih; Soner, Selin
Site selection is an important issue in waste management. Selection of the appropriate solid waste site requires consideration of multiple alternative solutions and evaluation criteria because of system complexity. Evaluation procedures involve several objectives, and it is often necessary to compromise among possibly conflicting tangible and intangible factors. For these reasons, multiple criteria decision-making (MCDM) has been found to be a useful approach to solve this kind of problem. Different MCDM models have been applied to solve this problem. But most of them are basically mathematical and ignore qualitative and often subjective considerations. It is easier for a decision-maker tomore » describe a value for an alternative by using linguistic terms. In the fuzzy-based method, the rating of each alternative is described using linguistic terms, which can also be expressed as triangular fuzzy numbers. Furthermore, there have not been any studies focused on the site selection in waste management using both fuzzy TOPSIS (technique for order preference by similarity to ideal solution) and AHP (analytical hierarchy process) techniques. In this paper, a fuzzy TOPSIS based methodology is applied to solve the solid waste transshipment site selection problem in Istanbul, Turkey. The criteria weights are calculated by using the AHP.« less
Determination of struvite crystallization mechanisms in urine using turbidity measurement.
Triger, Aurélien; Pic, Jean-Stéphane; Cabassud, Corinne
2012-11-15
Sanitation improvement in developing countries could be achieved through wastewater treatment processes. Nowadays alternative concepts such as urine separate collection are being developed. These processes would be an efficient way to reduce pollution of wastewater while recovering nutrients, especially phosphorus, which are lost in current wastewater treatment methods. The precipitation of struvite (MgNH(4)PO(4)∙6H(2)O) from urine is an efficient process yielding more than 98% phosphorus recovery with very high reaction rates. The work presented here aims to determine the kinetics and mechanisms of struvite precipitation in order to supply data for the design of efficient urine treatment processes. A methodology coupling the resolution of the population balance equation to turbidity measurement was developed, and batch experiments with synthetic and real urine were performed. The main mechanisms of struvite crystallization were identified as crystal growth and nucleation. A satisfactory approximation of the volumetric crystal size distribution was obtained. The study has shown the low influence on the crystallization process of natural organic matter contained in real urine. It has also highlighted the impact of operational parameters. Mixing conditions can create segregation and attrition which influence the nucleation rate, resulting in a change in crystals number, size, and thus final crystal size distribution (CSD). Moreover urine storage conditions can impact urea hydrolysis and lead to spontaneous struvite precipitation in the stock solution also influencing the final CSD. A few limits of the applied methodology and of the proposed modelling, due to these phenomena and to the turbidity measurement, are also discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.
Siegfried, Nandi; Narasimhan, Manjulaa; Kennedy, Caitlin E; Welbourn, Alice; Yuvraj, Anandi
2017-09-01
In March 2016, WHO reviewed evidence to develop global recommendations on the sexual and reproductive health and rights (SRHR) of women living with HIV. Systematic reviews and a global survey of women living with HIV informed the guideline development decision-making process. New recommendations covered abortion, Caesarean section, safe disclosure, and empowerment and self-efficacy interventions. Identification of key research gaps is part of the WHO guidelines development process, but consistent methods to do so are lacking. Our method aimed to ensure consistency and comprised the systematic application of a framework based on GRADE (Grading of Recommendations, Assessment, Development and Evaluation) to the process. The framework incorporates the strength and quality rating of recommendations and the priorities reported by women in the survey to inform research prioritisation. For each gap, we also articulated: (1) the most appropriate and robust study design to answer the question; (2) alternative pragmatic designs if the ideal design is not feasible; and (3) the methodological challenges facing researchers through identifying potential biases. We found 12 research gaps and identified five appropriate study designs to address the related questions: (1) Cross-sectional surveys; (2) Qualitative interview-driven studies; (3) Registries; (4) Randomised controlled trials; and (5) Medical record audit. Methodological challenges included selection, recruitment, misclassification, measurement and contextual biases, and confounding. In conclusion, a framework based on GRADE can provide a systematic approach to identifying research gaps from a WHO guideline. Incorporation of the priorities of women living with HIV into the framework systematically ensures that women living with HIV can shape future policy decisions affecting their lives. Implementation science and participatory research are appropriate over-arching approaches to enhance uptake of interventions and to ensure inclusion of women living with HIV at all stages of the research process.
NASA Astrophysics Data System (ADS)
Gautam, Girish Dutt; Pandey, Arun Kumar
2018-03-01
Kevlar is the most popular aramid fiber and most commonly used in different technologically advanced industries for various applications. But the precise cutting of Kevlar composite laminates is a difficult task. The conventional cutting methods face various defects such as delamination, burr formation, fiber pullout with poor surface quality and their mechanical performance is greatly affected by these defects. The laser beam machining may be an alternative of the conventional cutting processes due to its non-contact nature, requirement of low specific energy with higher production rate. But this process also faces some problems that may be minimized by operating the machine at optimum parameters levels. This research paper examines the effective utilization of the Nd:YAG laser cutting system on difficult-to-cut Kevlar-29 composite laminates. The objective of the proposed work is to find the optimum process parameters settings for getting the minimum kerf deviations at both sides. The experiments have been conducted on Kevlar-29 composite laminates having thickness 1.25 mm by using Box-Benkhen design with two center points. The experimental data have been used for the optimization by using the proposed methodology. For the optimization, Teaching learning Algorithm based approach has been employed to obtain the minimum kerf deviation at bottom and top sides. A self coded Matlab program has been developed by using the proposed methodology and this program has been used for the optimization. Finally, the confirmation tests have been performed to compare the experimental and optimum results obtained by the proposed methodology. The comparison results show that the machining performance in the laser beam cutting process has been remarkably improved through proposed approach. Finally, the influence of different laser cutting parameters such as lamp current, pulse frequency, pulse width, compressed air pressure and cutting speed on top kerf deviation and bottom kerf deviation during the Nd:YAG laser cutting of Kevlar-29 laminates have been discussed.
NASA Astrophysics Data System (ADS)
Copping, A. E.; Blake, K.; Zdanski, L.
2011-12-01
As marine and hydrokinetic (MHK) energy development projects progress towards early deployments in the U.S., the process of determining the risks to aquatic animals, habitats, and ecosystem processes from these engineered systems continues to be a significant barrier to efficient siting and permitting. Understanding the risk of MHK installations requires that the two elements of risk - consequence and probability - be evaluated. However, standard risk assessment methodologies are not easily applied to MHK interactions with marine and riverine environment as there are few data that describe the interaction of stressors (MHK devices, anchors, foundations, mooring lines and power cables) and receptors (aquatic animals, habitats and ecosystem processes). The number of possible combinations and permutations of stressors and receptors in MHK systems is large: there are many different technologies designed to harvest energy from the tides, waves and flowing rivers; each device is planned for a specific waterbody that supports an endemic ecosystem of animals and habitats, tied together by specific physical and chemical processes. With few appropriate analogue industries in the oceans and rivers, little information on the effects of these technologies on the living world is available. Similarly, without robust data sets of interactions, mathematical probability models are difficult to apply. Pacific Northwest National Laboratory scientists are working with MHK developers, researchers, engineers, and regulators to rank the consequences of planned MHK projects on living systems, and exploring alternative methodologies to estimate probabilities of these encounters. This paper will present the results of ERES, the Environmental Risk Evaluation System, which has been used to rank consequences for major animal groups and habitats for five MHK projects that are in advanced stages of development and/or early commercial deployment. Probability analyses have been performed for high priority stressor/receptor interactions where data are adaptable from other industries. In addition, a methodology for evaluating the probability of encounter, and therefore risk, to an endangered marine mammal from tidal turbine blades will be presented.
Gallart, F; Llorens, P; Latron, J; Cid, N; Rieradevall, M; Prat, N
2016-09-15
Hydrological data for assessing the regime of temporary rivers are often non-existent or scarce. The scarcity of flow data makes impossible to characterize the hydrological regime of temporary streams and, in consequence, to select the correct periods and methods to determine their ecological status. This is why the TREHS software is being developed, in the framework of the LIFE Trivers project. It will help managers to implement adequately the European Water Framework Directive in this kind of water body. TREHS, using the methodology described in Gallart et al. (2012), defines six transient 'aquatic states', based on hydrological conditions representing different mesohabitats, for a given reach at a particular moment. Because of its qualitative nature, this approach allows using alternative methodologies to assess the regime of temporary rivers when there are no observed flow data. These methods, based on interviews and high-resolution aerial photographs, were tested for estimating the aquatic regime of temporary rivers. All the gauging stations (13) belonging to the Catalan Internal Catchments (NE Spain) with recurrent zero-flow periods were selected to validate this methodology. On the one hand, non-structured interviews were conducted with inhabitants of villages near the gauging stations. On the other hand, the historical series of available orthophotographs were examined. Flow records measured at the gauging stations were used to validate the alternative methods. Flow permanence in the reaches was estimated reasonably by the interviews and adequately by aerial photographs, when compared with the values estimated using daily flows. The degree of seasonality was assessed only roughly by the interviews. The recurrence of disconnected pools was not detected by flow records but was estimated with some divergences by the two methods. The combination of the two alternative methods allows substituting or complementing flow records, to be updated in the future through monitoring by professionals and citizens. Copyright © 2016 Elsevier B.V. All rights reserved.
Amy L. Sheaffer; Jay Beaman; Joseph T. O' Leary; Rebecca L. Williams; Doran M. Mason
2001-01-01
Sampling for research in recreation settings in an ongoing challenge. Often certain groups of users are more likely to be sampled. It is important in measuring public support for resource conservation and in understanding use of natural resources for recreation to evaluate issues of bias in survey methodologies. Important methodological issues emerged from a statewide...
Air Force Energy Program Policy Memorandum
2009-06-16
Critical Asset Prioritization Methodology ( CAPM ) tool Manage costs. 3.4.2.5. Metrics Percentage of alternative/renewable fuel used for aviation fuel...supporting critical assets residing on military installations Field the Critical Asset Prioritization Methodology ( CAPM ) tool by Spring 2008. This CAPM ...Increase the number of flexible fuel systems • Identify/develop privately financed/operated energy production on Air Bases • Field the Critical
Do we need methodological theory to do qualitative research?
Avis, Mark
2003-09-01
Positivism is frequently used to stand for the epistemological assumption that empirical science based on principles of verificationism, objectivity, and reproducibility is the foundation of all genuine knowledge. Qualitative researchers sometimes feel obliged to provide methodological alternatives to positivism that recognize their different ethical, ontological, and epistemological commitments and have provided three theories: phenomenology, grounded theory, and ethnography. The author argues that positivism was a doomed attempt to define empirical foundations for knowledge through a rigorous separation of theory and evidence; offers a pragmatic, coherent view of knowledge; and suggests that rigorous, rational empirical investigation does not need methodological theory. Therefore, qualitative methodological theory is unnecessary and counterproductive because it hinders critical reflection on the relation between methodological theory and empirical evidence.
Optimization and analysis of NF3 in situ chamber cleaning plasmas
NASA Astrophysics Data System (ADS)
Ji, Bing; Yang, James H.; Badowski, Peter R.; Karwacki, Eugene J.
2004-04-01
We report on the optimization and analysis of a dilute NF3 in situ plasma-enhanced chemical vapor deposition chamber cleaning plasma for an Applied Materials P-5000 DxL chamber. Using design of experiments methodology, we identified and optimized operating conditions within the following process space: 10-15 mol % NF3 diluted with helium, 200-400 sccm NF3 flow rate, 2.5-3.5 Torr chamber pressure, and 950 W rf power. Optical emission spectroscopy and Fourier transform infrared spectroscopy were used to endpoint the cleaning processes and to quantify plasma effluent emissions, respectively. The results demonstrate that dilute NF3-based in situ chamber cleaning can be a viable alternative to perfluorocarbon-based in situ cleans with added benefits. The relationship between chamber clean time and fluorine atom density in the plasma is also investigated.
Soni-removal of nucleic acids from inclusion bodies.
Neerathilingam, Muniasamy; Mysore, Sumukh; Gandham, Sai Hari A
2014-05-23
Inclusion bodies (IBs) are commonly formed in Escherichia coli due to over expression of recombinant proteins in non-native state. Isolation, denaturation and refolding of these IBs is generally performed to obtain functional protein. However, during this process IBs tend to form non-specific interactions with sheared nucleic acids from the genome, thus getting carried over into downstream processes. This may hinder the refolding of IBs into their native state. To circumvent this, we demonstrate a methodology termed soni-removal which involves disruption of nucleic acid-inclusion body interaction using sonication; followed by solvent based separation. As opposed to conventional techniques that use enzymes and column-based separations, soni-removal is a cost effective alternative for complete elimination of buried and/or strongly bound short nucleic acid contaminants from IBs. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
Cross-correlation of point series using a new method
NASA Technical Reports Server (NTRS)
Strothers, Richard B.
1994-01-01
Traditional methods of cross-correlation of two time series do not apply to point time series. Here, a new method, devised specifically for point series, utilizes a correlation measure that is based in the rms difference (or, alternatively, the median absolute difference) between nearest neightbors in overlapped segments of the two series. Error estimates for the observed locations of the points, as well as a systematic shift of one series with respect to the other to accommodate a constant, but unknown, lead or lag, are easily incorporated into the analysis using Monte Carlo techniques. A methodological restriction adopted here is that one series be treated as a template series against which the other, called the target series, is cross-correlated. To estimate a significance level for the correlation measure, the adopted alternative (null) hypothesis is that the target series arises from a homogeneous Poisson process. The new method is applied to cross-correlating the times of the greatest geomagnetic storms with the times of maximum in the undecennial solar activity cycle.
Environmental economics of lignin derived transport fuels.
Obydenkova, Svetlana V; Kouris, Panos D; Hensen, Emiel J M; Heeres, Hero J; Boot, Michael D
2017-11-01
This paper explores the environmental and economic aspects of fast pyrolytic conversion of lignin, obtained from 2G ethanol plants, to transport fuels for both the marine and automotive markets. Various scenarios are explored, pertaining to aggregation of lignin from several sites, alternative energy carries to replace lignin, transport modalities, and allocation methodology. The results highlight two critical factors that ultimately determine the economic and/or environmental fuel viability. The first factor, the logistics scheme, exhibited the disadvantage of the centralized approach, owing to prohibitively expensive transportation costs of the low energy-dense lignin. Life cycle analysis (LCA) displayed the second critical factor related to alternative energy carrier selection. Natural gas (NG) chosen over additional biomass boosts well-to-wheel greenhouse gas emissions (WTW GHG) to a level incompatible with the reduction targets set by the U.S. renewable fuel standard (RFS). Adversely, the process' economics revealed higher profits vs. fossil energy carrier. Copyright © 2017 The Author(s). Published by Elsevier Ltd.. All rights reserved.
Recent advances in 193-nm single-layer photoresists based on alternating copolymers of cycloolefins
NASA Astrophysics Data System (ADS)
Houlihan, Francis M.; Wallow, Thomas I.; Timko, Allen G.; Neria, E.; Hutton, Richard S.; Cirelli, Raymond A.; Nalamasu, Omkaram; Reichmanis, Elsa
1997-07-01
We report on our recent investigations on the formulation and processing of 193 nm single layer photoresists based on alternating copolymers of cycloolefins with maleic anhydride. Resists formulated with cycloolefin copolymers are compatible with 0.262 N tetramethylammonium developers, have excellent adhesion, sensitivity, etch resistance and thermal flow properties. The effect of polymer structure and composition, dissolution inhibitor structure and loading as well as the effect of the photoacid generator on the resist dissolution properties was investigated. Based on the results high contrast formulations were evaluated on a GCA XLS (NA equals 0.53, 4X reduction optics) deep-UV stepper to exhibit 0.27 micrometer L/S pair resolution with excellent photosensitivity. Based on the dissolution properties and a spectroscopic examination of the resist, we have designed materials that show less than 0.17 micrometer L/S pair resolution with 193 nm exposures. In this paper, the formulation methodology is detailed and the most recent results upon both with 248 and 193 nm irradiation are described.
Coghetto, Chaline Caren; Brinques, Graziela Brusch; Ayub, Marco Antônio Záchia
2016-12-01
Probiotic products are dietary supplements containing live microorganisms producing beneficial health effects on the host by improving intestinal balance and nutrient absorption. Among probiotic microorganisms, those classified as lactic acid bacteria are of major importance to the food and feed industries. Probiotic cells can be produced using alternative carbon and nitrogen sources, such as agroindustrial residues, at the same time contributing to reduce process costs. On the other hand, the survival of probiotic cells in formulated food products, as well as in the host gut, is an essential nutritional aspect concerning health benefits. Therefore, several cell microencapsulation techniques have been investigated as a way to improve cell viability and survival under adverse environmental conditions, such as the gastrointestinal milieu of hosts. In this review, different aspects of probiotic cells and technologies of their related products are discussed, including formulation of culture media, and aspects of cell microencapsulation techniques required to improve their survival in the host.
Connelly, L; Price, J
1996-04-01
Alcoholic Wernicke's encephalopathy has been commonplace in Australia for many years and, as this syndrome is attributed to a deficiency in the diet, it should be preventable. This study employs conventional cost-effectiveness methodology to compare the economic efficiency of several thiamin-supplementation alternatives that have been proposed for the prevention of Wernicke's encephalopathy. A series of rankings of these measures is derived from an estimated cost per case averted for each of the alternatives studied. These rankings identify the least cost-effective thiamin-supplementation alternative as that of enriching bread-making flour with thiamin.
Assessment of a satellite power system and six alternative technologies
NASA Technical Reports Server (NTRS)
Wolsko, T.; Whitfield, R.; Samsa, M.; Habegger, L. S.; Levine, E.; Tanzman, E.
1981-01-01
The satellite power system is assessed in comparison to six alternative technologies. The alternatives are: central-station terrestrial photovoltaic systems, conventional coal-fired power plants, coal-gasification/combined-cycle power plants, light water reactor power plants, liquid-metal fast-breeder reactors, and fusion. The comparison is made regarding issues of cost and performance, health and safety, environmental effects, resources, socio-economic factors, and institutional issues. The criteria for selecting the issues and the alternative technologies are given, and the methodology of the comparison is discussed. Brief descriptions of each of the technologies considered are included.
González-García, Sara; García Lozano, Raúl; Moreira, M Teresa; Gabarrell, Xavier; Rieradevall i Pons, Joan; Feijoo, Gumersindo; Murphy, Richard J
2012-06-01
The environmental profile of a set of wood furniture was carried out to define the best design criteria for its eco-design. A baby cot convertible into a bed, a study desk and a bedside table were the objects of study. Two quantitative and qualitative environmental approaches were combined in order to propose improvement alternatives: Life Cycle Assessment (LCA) and Design for Environment (DfE). In the first case Life Cycle Assessment (LCA) was applied to identify the hot spots in the product system. As a next step, LCA information was used in eco-briefing to determine several improvement alternatives. A wood products company located in Catalonia (NE Spain) was assessed in detail, dividing the process into three stages: assembly, finishing and packaging. Ten impact categories were considered in the LCA study: abiotic depletion, acidification, eutrophication, global warming, ozone layer depletion, human toxicity, fresh water aquatic ecotoxicity, marine aquatic ecotoxicity, terrestrial ecotoxicity and photochemical oxidant formation. Two processes can be considered the key environmental factors: the production of the wooden boards and electricity, with contributions of 45-68% and 14-33% respectively depending on the impact categories. Subsequently, several improvement alternatives were proposed in the eco-design process (DfE) to achieve reductions in a short-medium period of time in the environmental impact. These eco-design strategies could reduce the environmental profile of the setup by 14%. The correct methodological adaptation of the concept of eco-briefing, as a tool for communication among environmental technicians and designers, the simplification of the analytical tool used and the LCA, could facilitate the environmental analysis of a product. The results obtained provide information that can help the furniture sector to improve their environmental performance. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
DeMott, Diana; Fuqua, Bryan; Wilson, Paul
2013-01-01
Once a project obtains approval, decision makers have to consider a variety of alternative paths for completing the project and meeting the project objectives. How decisions are made involves a variety of elements including: cost, experience, current technology, ideologies, politics, future needs and desires, capabilities, manpower, timing, available information, and for many ventures management needs to assess the elements of risk versus reward. The use of high level Probabilistic Risk Assessment (PRA) Models during conceptual design phases provides management with additional information during the decision making process regarding the risk potential for proposed operations and design prototypes. The methodology can be used as a tool to: 1) allow trade studies to compare alternatives based on risk, 2) determine which elements (equipment, process or operational parameters) drives the risk, and 3) provide information to mitigate or eliminate risks early in the conceptual design to lower costs. Creating system models using conceptual design proposals and generic key systems based on what is known today can provide an understanding of the magnitudes of proposed systems and operational risks and facilitates trade study comparisons early in the decision making process. Identifying the "best" way to achieve the desired results is difficult, and generally occurs based on limited information. PRA provides a tool for decision makers to explore how some decisions will affect risk before the project is committed to that path, which can ultimately save time and money.
NASA Astrophysics Data System (ADS)
Qaddus, Muhammad Kamil
The gap between estimated and actual savings in energy efficiency and conservation (EE&C) projects or programs forms the problem statement for the scope of public and government buildings. This gap has been analyzed first on impact and then on process-level. On the impact-level, the methodology leads to categorization of the gap as 'Realization Gap'. It then views the categorization of gap within the context of past and current narratives linked to realization gap. On process-level, the methodology leads to further analysis of realization gap on process evaluation basis. The process evaluation criterion, a product of this basis is then applied to two different programs (DESEU and NYC ACE) linked to the scope of this thesis. Utilizing the synergies of impact and process level analysis, it offers proposals on program development and its structure using our process evaluation criterion. Innovative financing and benefits distribution structure is thus developed and will remain part of the proposal. Restricted Stakeholder Crowd Financing and Risk-Free Incentivized return are the products of proposed financing and benefit distribution structure respectively. These products are then complimented by proposing an alternative approach in estimating EE&C savings. The approach advocates estimation based on range-allocation rather than currently utilized unique estimated savings approach. The Way Ahead section thus explores synergy between financial and engineering ranges of energy savings as a multi-discipline approach for future research. Moreover, it provides the proposed program structure with risk aversion and incentive allocation while dealing with uncertainty. This set of new approaches are believed to better fill the realization gap between estimated and actual energy efficiency savings.
NASA Astrophysics Data System (ADS)
Revollo Sarmiento, G. N.; Cipolletti, M. P.; Perillo, M. M.; Delrieux, C. A.; Perillo, Gerardo M. E.
2016-03-01
Tidal flats generally exhibit ponds of diverse size, shape, orientation and origin. Studying the genesis, evolution, stability and erosive mechanisms of these geographic features is critical to understand the dynamics of coastal wetlands. However, monitoring these locations through direct access is hard and expensive, not always feasible, and environmentally damaging. Processing remote sensing images is a natural alternative for the extraction of qualitative and quantitative data due to their non-invasive nature. In this work, a robust methodology for automatic classification of ponds and tidal creeks in tidal flats using Google Earth images is proposed. The applicability of our method is tested in nine zones with different morphological settings. Each zone is processed by a segmentation stage, where ponds and tidal creeks are identified. Next, each geographical feature is measured and a set of shape descriptors is calculated. This dataset, together with a-priori classification of each geographical feature, is used to define a regression model, which allows an extensive automatic classification of large volumes of data discriminating ponds and tidal creeks against other various geographical features. In all cases, we identified and automatically classified different geographic features with an average accuracy over 90% (89.7% in the worst case, and 99.4% in the best case). These results show the feasibility of using freely available Google Earth imagery for the automatic identification and classification of complex geographical features. Also, the presented methodology may be easily applied in other wetlands of the world and perhaps employing other remote sensing imagery.
Richardson-Klavehn, A; Gardiner, J M
1998-05-01
Depth-of-processing effects on incidental perceptual memory tests could reflect (a) contamination by voluntary retrieval, (b) sensitivity of involuntary retrieval to prior conceptual processing, or (c) a deficit in lexical processing during graphemic study tasks that affects involuntary retrieval. The authors devised an extension of incidental test methodology--making conjunctive predictions about response times as well as response proportions--to discriminate among these alternatives. They used graphemic, phonemic, and semantic study tasks, and a word-stem completion test with incidental, intentional, and inclusion instructions. Semantic study processing was superior to phonemic study processing in the intentional and inclusion tests, but semantic and phonemic study processing produced equal priming in the incidental test, showing that priming was uncontaminated by voluntary retrieval--a conclusion reinforced by the response-time data--and that priming was insensitive to prior conceptual processing. The incidental test nevertheless showed a priming deficit following graphemic study processing, supporting the lexical-processing hypothesis. Adding a lexical decision to the 3 study tasks eliminated the priming deficit following graphemic study processing, but did not influence priming following phonemic and semantic processing. The results provide the first clear evidence that depth-of-processing effects on perceptual priming can reflect lexical processes, rather than voluntary contamination or conceptual processes.
Global-Context Based Salient Region Detection in Nature Images
NASA Astrophysics Data System (ADS)
Bao, Hong; Xu, De; Tang, Yingjun
Visually saliency detection provides an alternative methodology to image description in many applications such as adaptive content delivery and image retrieval. One of the main aims of visual attention in computer vision is to detect and segment the salient regions in an image. In this paper, we employ matrix decomposition to detect salient object in nature images. To efficiently eliminate high contrast noise regions in the background, we integrate global context information into saliency detection. Therefore, the most salient region can be easily selected as the one which is globally most isolated. The proposed approach intrinsically provides an alternative methodology to model attention with low implementation complexity. Experiments show that our approach achieves much better performance than that from the existing state-of-art methods.
Methodology to model the energy and greenhouse gas emissions of electronic software distributions.
Williams, Daniel R; Tang, Yinshan
2012-01-17
A new electronic software distribution (ESD) life cycle analysis (LCA) methodology and model structure were constructed to calculate energy consumption and greenhouse gas (GHG) emissions. In order to counteract the use of high level, top-down modeling efforts, and to increase result accuracy, a focus upon device details and data routes was taken. In order to compare ESD to a relevant physical distribution alternative, physical model boundaries and variables were described. The methodology was compiled from the analysis and operational data of a major online store which provides ESD and physical distribution options. The ESD method included the calculation of power consumption of data center server and networking devices. An in-depth method to calculate server efficiency and utilization was also included to account for virtualization and server efficiency features. Internet transfer power consumption was analyzed taking into account the number of data hops and networking devices used. The power consumed by online browsing and downloading was also factored into the model. The embedded CO(2)e of server and networking devices was proportioned to each ESD process. Three U.K.-based ESD scenarios were analyzed using the model which revealed potential CO(2)e savings of 83% when ESD was used over physical distribution. Results also highlighted the importance of server efficiency and utilization methods.
NASA Astrophysics Data System (ADS)
Dutta, Sekhar Chandra; Chakroborty, Suvonkar; Raychaudhuri, Anusrita
Vibration transmitted to the structure during earthquake may vary in magnitude over a wide range. Design methodology should, therefore, enumerates steps so that structures are able to survive in the event of even severe ground motion. However, on account of economic reason, the strengths can be provided to the structures in such a way that the structure remains in elastic range in low to moderate range earthquake and is allowed to undergo inelastic deformation in severe earthquake without collapse. To implement this design philosophy a rigorous nonlinear dynamic analysis is needed to be performed to estimate the inelastic demands. Furthermore, the same is time consuming and requires expertise to judge the results obtained from the same. In this context, the present paper discusses and demonstrates an alternative simple method known as Pushover method, which can be easily used by practicing engineers bypassing intricate nonlinear dynamic analysis and can be thought of as a substitute of the latter. This method is in the process of development and is increasingly becoming popular for its simplicity. The objective of this paper is to emphasize and demonstrate the basic concept, strength and ease of this state of the art methodology for regular use in design offices in performance based seismic design of structures.
Vernon, Donald D; Bolte, Robert G; Scaife, Eric; Hansen, Kristine W
2005-01-01
Freestanding children's hospitals may lack resources, especially surgical manpower, to meet American College of Surgeons trauma center criteria, and may organize trauma care in alternative ways. At a tertiary care children's hospital, attending trauma surgeons and anesthesiologists took out-of-hospital call and directed initial care for only the most severely injured patients, whereas pediatric emergency physicians directed care for patients with less severe injuries. Survival data were analyzed using TRISS methodology. A total of 903 trauma patients were seen by the system during the period 10/1/96-6/30/01. Median Injury Severity Score was 16, and 508 of patients had Injury Severity Score > or =15. There were 83 deaths, 21 unexpected survivors, and 13 unexpected deaths. TRISS analysis showed that z-score was 4.39 and W-statistic was 3.07. Mortality outcome from trauma in a pediatric hospital using this alternative approach to trauma care was significantly better than predicted by TRISS methodology.
Dollins, Haley E; Bray, Kimberly Krust; Gadbury-Amyot, Cynthia C
2013-10-01
Inequitable access to dental care contributes to oral health disparities. Midlevel dental provider models are utilized across the globe as a way to bridge the gap between preventive and restorative dental professionals and increase access to dental care. The purpose of this study was threefold: to examine lessons learned from the state legislative process related to creation of the hygienist-therapist in a Midwestern state, to improve understanding of the relationship between alternative oral health delivery models and public policy and to inform the development and passage of future policies aimed at addressing the unmet dental needs of the public. This research investigation utilized a qualitative research methodology to examine the process of legislation relating to an alternative oral health delivery model (hygienist-therapist) through the eyes of key stakeholders. Interview data was analyzed and then triangulated with 3 data sources: interviews with key stakeholders, documents and researcher participant field notes. Data analysis resulted in consensus on 3 emergent themes with accompanying categories. The themes that emerged included social justice, partnerships and coalitions, and the legislative process. This qualitative case study suggests that the creation of a new oral health workforce model was a long and arduous process involving multiple stakeholders and negotiation between the parties involved. The creation of this new workforce model was recognized as a necessary step to increasing access to dental care at the state and national level. The research in this case study may serve to inform advocates of access to oral health care as other states pursue their own workforce models.
A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery
ERIC Educational Resources Information Center
Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh
2012-01-01
The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…
Design and Diagnosis Problem Solving with Multifunctional Technical Knowledge Bases
1992-09-29
STRUCTURE METHODOLOGY Design problem solving is a complex activity involving a number of subtasks. and a number of alternative methods potentially available...Conference on Artificial Intelligence. London: The British Computer Society, pp. 621-633. Friedland, P. (1979). Knowledge-based experimental design ...Computing Milieuxl: Management of Computing and Information Systems- -ty,*m man- agement General Terms: Design . Methodology Additional Key Words and Phrases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dowdy, M.W.; Couch, M.D.
A vehicle comparison methodology based on the Otto-Engine Equivalent (OEE) vehicle concept is described. As an illustration of this methodology, the concept is used to make projections of the fuel economy potential of passenger cars using various alternative power systems. Sensitivities of OEE vehicle results to assumptions made in the calculational procedure are discussed. Factors considered include engine torque boundary, rear axle ratio, performance criteria, engine transient response, and transmission shift logic.
ERIC Educational Resources Information Center
Cuadra, Ernesto; Crouch, Luis
Student promotion, repetition, and dropout rates constitute the basic data needed to forecast future enrollment and new resources. Information on student flow is significantly related to policy formulation aimed at improving internal efficiency, because dropping out and grade repetition increase per pupil cost, block access to eligible school-age…
ERIC Educational Resources Information Center
Young, I. Phillip; Fawcett, Paul
2013-01-01
Several teacher models exist for using high-stakes testing outcomes to make continuous employment decisions for principals. These models are reviewed, and specific flaws are noted if these models are retrofitted for principals. To address these flaws, a different methodology is proposed on the basis of actual field data. Specially addressed are…
NASA Astrophysics Data System (ADS)
Vana, Sudha; Uijt de Haag, Maarten
2010-04-01
This paper discusses an alternative ADS-B implementation that uses available provisions (Mode-S, UAT and GPS receivers) and existing GPS algorithms and techniques. This alternative has many advantages over the current ADS-B implementation, especially with respect to integrity of the solution. The paper will describe the methodology, its advantages, simulation results and implementation issues.
ERIC Educational Resources Information Center
Delaney, Jennifer A.; Kearney, Tyler D.
2016-01-01
This study considered the impact of state-level guaranteed tuition programs on alternative student-based revenue streams. It used a quasi-experimental, difference-in-difference methodology with a panel dataset of public four-year institutions from 2000-2012. Illinois' 2004 "Truth-in-Tuition" law was used as the policy of interest and the…
Walker, C; Kaiser, K; Klein, W; Lagadic, L; Peakall, D; Sheffield, S; Soldan, T; Yasuno, M
1998-01-01
There is growing public pressure to minimize the use of vertebrates in ecotoxicity testing; therefore, effective alternatives to toxicity tests causing suffering are being sought. This report discusses alternatives and differs in some respects from the reports of the other three groups because the primary concern is with harmful effects of chemicals at the level of population and above rather than with harmful effects upon individuals. It is concluded that progress toward the objective of minimizing testing that causes suffering would be served by the following initiatives--a clearer definition of goals and strategies when undertaking testing procedures; development of alternative assays, including in vitro test systems, that are based on new technology; development of nondestructive assays for vertebrates (e.g., biomarkers) that do not cause suffering; selection of most appropriate species, strains, and developmental stages for testing procedures (but no additional species for basic testing); better integrated and more flexible testing procedures incorporating biomarker responses, ecophysiological concepts, and ecological end points (progress in this direction depends upon expert judgment). In general, testing procedures could be made more realistic, taking into account problems with mixtures, and with volatile or insoluble chemicals. PMID:9599690
DOE Office of Scientific and Technical Information (OSTI.GOV)
Costa, Mafalda T., E-mail: mafaldatcosta@gmail.com; Carolino, Elisabete, E-mail: lizcarolino@gmail.com; Oliveira, Teresa A., E-mail: teresa.oliveira@uab.pt
In water supply systems with distribution networkthe most critical aspects of control and Monitoring of water quality, which generates crises system, are the effects of cross-contamination originated by the network typology. The classics of control of quality systems through the application of Shewhart charts are generally difficult to manage in real time due to the high number of charts that must be completed and evaluated. As an alternative to the traditional control systems with Shewhart charts, this study aimed to apply a simplified methodology of a monitoring plan quality parameters in a drinking water distribution, by applying Hotelling’s T{sup 2}more » charts and supplemented with Shewhart charts with Bonferroni limits system, whenever instabilities with processes were detected.« less
Speelman, Craig P.; McGann, Marek
2013-01-01
In this paper we voice concerns about the uncritical manner in which the mean is often used as a summary statistic in psychological research. We identify a number of implicit assumptions underlying the use of the mean and argue that the fragility of these assumptions should be more carefully considered. We examine some of the ways in which the potential violation of these assumptions can lead us into significant theoretical and methodological error. Illustrations of alternative models of research already extant within Psychology are used to explore methods of research less mean-dependent and suggest that a critical assessment of the assumptions underlying its use in research play a more explicit role in the process of study design and review. PMID:23888147
Heat conversion alternative petrochemical complexes efficiency
NASA Astrophysics Data System (ADS)
Mrakin, A. N.; Selivanov, A. A.; Morev, A. A.; Batrakov, P. A.; Kulbyakina, A. V.; Sotnikov, D. G.
2017-08-01
The paper presents the energy and economic efficiency calculation results of the petrochemical complexes based upon the sulfur oil shales processing by solid (ash) heat-carrier low-temperature carbonization plants by Galoter technology. The criterion for such enterprises fuel efficiency determining was developed on the base of the exergy methodology taking into account the recurrent publications consolidation. In this case, in supplying the consumers with paving bitumen, motor benzol, thiophene, toluene, 2-methylthiophene, xylene, gas sulfur, complex thermodynamic effectiveness was founded to amount to 53 %, and if ash residue realization is possible then it was founded to be to 70 %. The project economic attractiveness determining studies depending on the feedstock cost, its delivery way and investments amount changing were conducted.
A Methodology to Support Decision Making in Flood Plan Mitigation
NASA Astrophysics Data System (ADS)
Biscarini, C.; di Francesco, S.; Manciola, P.
2009-04-01
The focus of the present document is on specific decision-making aspects of flood risk analysis. A flood is the result of runoff from rainfall in quantities too great to be confined in the low-water channels of streams. Little can be done to prevent a major flood, but we may be able to minimize damage within the flood plain of the river. This broad definition encompasses many possible mitigation measures. Floodplain management considers the integrated view of all engineering, nonstructural, and administrative measures for managing (minimizing) losses due to flooding on a comprehensive scale. The structural measures are the flood-control facilities designed according to flood characteristics and they include reservoirs, diversions, levees or dikes, and channel modifications. Flood-control measures that modify the damage susceptibility of floodplains are usually referred to as nonstructural measures and may require minor engineering works. On the other hand, those measures designed to modify the damage potential of permanent facilities are called non-structural and allow reducing potential damage during a flood event. Technical information is required to support the tasks of problem definition, plan formulation, and plan evaluation. The specific information needed and the related level of detail are dependent on the nature of the problem, the potential solutions, and the sensitivity of the findings to the basic information. Actions performed to set up and lay out the study are preliminary to the detailed analysis. They include: defining the study scope and detail, the field data collection, a review of previous studies and reports, and the assembly of needed maps and surveys. Risk analysis can be viewed as having many components: risk assessment, risk communication and risk management. Risk assessment comprises an analysis of the technical aspects of the problem, risk communication deals with conveying the information and risk management involves the decision process. In the present paper we propose a novel methodology for supporting the priority setting in the assessment of such issues, beyond the typical "expected value" approach. Scientific contribution and management aspects are merged to create a simplified method for plan basin implementation, based on risk and economic analyses. However, the economic evaluation is not the sole criterion for flood-damage reduction plan selection. Among the different criteria that are relevant to the decision process, safety and quality of human life, economic damage, expenses related with the chosen measures and environmental issues should play a fundamental role on the decisions made by the authorities. Some numerical indices, taking in account administrative, technical, economical and risk aspects, are defined and are combined together in a mathematical formula that defines a Priority Index (PI). In particular, the priority index defines a ranking of priority interventions, thus allowing the formulation of the investment plan. The research is mainly focused on the technical factors of risk assessment, providing quantitative and qualitative estimates of possible alternatives, containing measures of the risk associated with those alternatives. Moreover, the issues of risk management are analyzed, in particular with respect to the role of decision making in the presence of risk information. However, a great effort is devoted to make this index easy to be formulated and effective to allow a clear and transparent comparison between the alternatives. Summarizing this document describes a major- steps for incorporation of risk analysis into the decision making process: framing of the problem in terms of risk analysis, application of appropriate tools and techniques to obtain quantified results, use of the quantified results in the choice of structural and non-structural measures. In order to prove the reliability of the proposed methodology and to show how risk-based information can be incorporated into a flood analysis process, its application to some middle italy river basins is presented. The methodology assessment is performed by comparing different scenarios and showing that the optimal decision stems from a feasibility evaluation.
NASA Astrophysics Data System (ADS)
Century, Daisy Nelson
This probing study focused on alternative and traditional assessments, their comparative impacts on students' attitudes and science learning outcomes. Four basic questions were asked: What type of science learning stemming from the instruction can best be assessed by the use of traditional paper-and pencil test? What type of science learning stemming from the instruction can best be assessed by the use of alternative assessment? What are the differences in the types of learning outcomes that can be assessed by the use of paper-pencil test and alternative assessment test? Is there a difference in students' attitude towards learning science when assessment of outcomes is by alternative assessment means compared to traditional means compared to traditional means? A mixed methodology involving quantitative and qualitative techniques was utilized. However, the study was essentially a case study. Quantitative data analysis included content achievement and attitude results, to which non-parametric statistics were applied. Analysis of qualitative data was done as a case study utilizing pre-set protocols resulting in a narrative summary style of report. These outcomes were combined in order to produce conclusions. This study revealed that the traditional method yielded more concrete cognitive content learning than did the alternative assessment. The alternative assessment yielded more psychomotor, cooperative learning and critical thinking skills. In both the alternative and the traditional methods the student's attitudes toward science were positive. There was no significant differences favoring either group. The quantitative findings of no statistically significant differences suggest that at a minimum there is no loss in the use of alternative assessment methods, in this instance, performance testing. Adding the results from the qualitative analysis to this suggests (1) that class groups were more satisfied when alternative methods were employed, and (2) that the two assessment methodologies are complementary to each other, and thus should probably be used together to produce maximum benefit.
NASA Astrophysics Data System (ADS)
Babbar-Sebens, M.; Minsker, B. S.
2006-12-01
In the water resources management field, decision making encompasses many kinds of engineering, social, and economic constraints and objectives. Representing all of these problem dependant criteria through models (analytical or numerical) and various formulations (e.g., objectives, constraints, etc.) within an optimization- simulation system can be a very non-trivial issue. Most models and formulations utilized for discerning desirable traits in a solution can only approximate the decision maker's (DM) true preference criteria, and they often fail to consider important qualitative and incomputable phenomena related to the management problem. In our research, we have proposed novel decision support frameworks that allow DMs to actively participate in the optimization process. The DMs explicitly indicate their true preferences based on their subjective criteria and the results of various simulation models and formulations. The feedback from the DMs is then used to guide the search process towards solutions that are "all-rounders" from the perspective of the DM. The two main research questions explored in this work are: a) Does interaction between the optimization algorithm and a DM assist the system in searching for groundwater monitoring designs that are robust from the DM's perspective?, and b) How can an interactive search process be made more effective when human factors, such as human fatigue and cognitive learning processes, affect the performance of the algorithm? The application of these frameworks on a real-world groundwater long-term monitoring (LTM) case study in Michigan highlighted the following salient advantages: a) in contrast to the non-interactive optimization methodology, the proposed interactive frameworks were able to identify low cost monitoring designs whose interpolation maps respected the expected spatial distribution of the contaminants, b) for many same-cost designs, the interactive methodologies were able to propose multiple alternatives that met the DM's preference criteria, therefore allowing the expert to select among several strong candidate designs depending on her/his LTM budget, c) two of the methodologies - Case-Based Micro Interactive Genetic Algorithm (CBMIGA) and Interactive Genetic Algorithm with Mixed Initiative Interaction (IGAMII) - were also able to assist in controlling human fatigue and adapt to the DM's learning process.
Assessing technical performance at diverse ambulatory care sites.
Osterweis, M; Bryant, E
1978-01-01
The purpose of the large study reported here was to develop and test methods for assessing the quality of health care that would be broadly applicable to diverse ambulatory care organizations for periodic comparative review. Methodological features included the use of an age-sex stratified random sampling scheme, dependence on medical records as the source of data, a fixed study period year, use of Kessner's tracer methodology (including not only acute and chronic diseases but also screening and immunization rates as indicators), and a fixed tracer matrix at all test sites. This combination of methods proved more efficacious in estimating certain parameters for the total patient populations at each site (including utilization patterns, screening, and immunization rates) and the process of care for acute conditions than it did in examining the process of care for the selected chronic condition. It was found that the actual process of care at all three sites for the three acute conditions (streptococcal pharyngitis, urinary tract infection, and iron deficiency anemia) often differed from the expected process in terms of both diagnostic procedures and treatment. For hypertension, the chronic disease tracer, medical records were frequently a deficient data source from which to draw conclusions about the adequacy of treatment. Several aspects of the study methodology were found to be detrimental to between-site comparisons of the process of care for chronic disease management. The use of an age-sex stratified random sampling scheme resulted in the identification of too few cases of hypertension at some sites for analytic purposes, thereby necessitating supplementary sampling by diagnosis. The use of a fixed study period year resulted in an arbitrary starting point in the course of the disease. Furthermore, in light of the diverse sociodemographic characteristics of the patient populations, the use of a fixed matrix of tracer conditions for all test sites is questionable. The discussion centers on these and other problems encountered in attempting to compare technical performance within diverse ambulatory care organizations and provides some guidelines as to the utility of alternative methods for assessing the quality of health care.
Conceptual Chemical Process Design for Sustainability. ...
This chapter examines the sustainable design of chemical processes, with a focus on conceptual design, hierarchical and short-cut methods, and analyses of process sustainability for alternatives. The chapter describes a methodology for incorporating process sustainability analyses throughout the conceptual design. Hierarchical and short-cut decision-making methods will be used to approach sustainability. An example showing a sustainability-based evaluation of chlor-alkali production processes is presented with economic analysis and five pollutants described as emissions. These emissions are analyzed according to their human toxicity potential by ingestion using the Waste Reduction Algorithm and a method based on US Environmental Protection Agency reference doses, with the addition of biodegradation for suitable components. Among the emissions, mercury as an element will not biodegrade, and results show the importance of this pollutant to the potential toxicity results and therefore the sustainability of the process design. The dominance of mercury in determining the long-term toxicity results when energy use is included suggests that all process system evaluations should (re)consider the role of mercury and other non-/slow-degrading pollutants in sustainability analyses. The cycling of nondegrading pollutants through the biosphere suggests the need for a complete analysis based on the economic, environmental, and social aspects of sustainability. Chapter reviews
Polcari, J.
2013-08-16
The signal processing concept of signal-to-noise ratio (SNR), in its role as a performance measure, is recast within the more general context of information theory, leading to a series of useful insights. Establishing generalized SNR (GSNR) as a rigorous information theoretic measure inherent in any set of observations significantly strengthens its quantitative performance pedigree while simultaneously providing a specific definition under general conditions. This directly leads to consideration of the log likelihood ratio (LLR): first, as the simplest possible information-preserving transformation (i.e., signal processing algorithm) and subsequently, as an absolute, comparable measure of information for any specific observation exemplar. Furthermore,more » the information accounting methodology that results permits practical use of both GSNR and LLR as diagnostic scalar performance measurements, directly comparable across alternative system/algorithm designs, applicable at any tap point within any processing string, in a form that is also comparable with the inherent performance bounds due to information conservation.« less
Sánchez, Óscar J; Cardona, Carlos A
2012-01-01
In this work, the hierarchical decomposition methodology was used to conceptually design the production of fuel ethanol from sugarcane. The decomposition of the process into six levels of analysis was carried out. Several options of technological configurations were assessed in each level considering economic and environmental criteria. The most promising alternatives were chosen rejecting the ones with a least favorable performance. Aspen Plus was employed for simulation of each one of the technological configurations studied. Aspen Icarus was used for economic evaluation of each configuration, and WAR algorithm was utilized for calculation of the environmental criterion. The results obtained showed that the most suitable synthesized flowsheet involves the continuous cultivation of Zymomonas mobilis with cane juice as substrate and including cell recycling and the ethanol dehydration by molecular sieves. The proposed strategy demonstrated to be a powerful tool for conceptual design of biotechnological processes considering both techno-economic and environmental indicators. Copyright © 2011 Elsevier Ltd. All rights reserved.
Ruiz-Espinosa, H; Amador-Espejo, G G; Barcenas-Pozos, M E; Angulo-Guerrero, J O; Garcia, H S; Welti-Chanes, J
2013-02-01
Multiple-pass ultrahigh pressure homogenization (UHPH) was used for reducing microbial population of both indigenous spoilage microflora in whole raw milk and a baroresistant pathogen (Staphylococcus aureus) inoculated in whole sterile milk to define pasteurization-like processing conditions. Response surface methodology was followed and multiple response optimization of UHPH operating pressure (OP) (100, 175, 250 MPa) and number of passes (N) (1-5) was conducted through overlaid contour plot analysis. Increasing OP and N had a significant effect (P < 0·05) on microbial reduction of both spoilage microflora and Staph. aureus in milk. Optimized UHPH processes (five 202-MPa passes; four 232-MPa passes) defined a region where a 5-log(10) reduction of total bacterial count of milk and a baroresistant pathogen are attainable, as a requisite parameter for establishing an alternative method of pasteurization. Multiple-pass UHPH optimized conditions might help in producing safe milk without the detrimental effects associated with thermal pasteurization. © 2012 The Society for Applied Microbiology.
Sulis, William H
2017-10-01
Walter Freeman III pioneered the application of nonlinear dynamical systems theories and methodologies in his work on mesoscopic brain dynamics.Sadly, mainstream psychology and psychiatry still cling to linear correlation based data analysis techniques, which threaten to subvert the process of experimentation and theory building. In order to progress, it is necessary to develop tools capable of managing the stochastic complexity of complex biopsychosocial systems, which includes multilevel feedback relationships, nonlinear interactions, chaotic dynamics and adaptability. In addition, however, these systems exhibit intrinsic randomness, non-Gaussian probability distributions, non-stationarity, contextuality, and non-Kolmogorov probabilities, as well as the absence of mean and/or variance and conditional probabilities. These properties and their implications for statistical analysis are discussed. An alternative approach, the Process Algebra approach, is described. It is a generative model, capable of generating non-Kolmogorov probabilities. It has proven useful in addressing fundamental problems in quantum mechanics and in the modeling of developing psychosocial systems.
NASA Astrophysics Data System (ADS)
Kiran Kumar, Kalla; Nagaraju, Dega; Gayathri, S.; Narayanan, S.
2017-05-01
Priority Sequencing Rules provide the guidance for the order in which the jobs are to be processed at a workstation. The application of different priority rules in job shop scheduling gives different order of scheduling. More experimentation needs to be conducted before a final choice is made to know the best priority sequencing rule. Hence, a comprehensive method of selecting the right choice is essential in managerial decision making perspective. This paper considers seven different priority sequencing rules in job shop scheduling. For evaluation and selection of the best priority sequencing rule, a set of eight criteria are considered. The aim of this work is to demonstrate the methodology of evaluating and selecting the best priority sequencing rule by using hybrid multi criteria decision making technique (MCDM), i.e., analytical hierarchy process (AHP) with technique for order preference by similarity to ideal solution (TOPSIS). The criteria weights are calculated by using AHP whereas the relative closeness values of all priority sequencing rules are computed based on TOPSIS with the help of data acquired from the shop floor of a manufacturing firm. Finally, from the findings of this work, the priority sequencing rules are ranked from most important to least important. The comprehensive methodology presented in this paper is very much essential for the management of a workstation to choose the best priority sequencing rule among the available alternatives for processing the jobs with maximum benefit.
NASA Technical Reports Server (NTRS)
2003-01-01
Topics covered include: Real-Time, High-Frequency QRS Electrocardiograph; Software for Improved Extraction of Data From Tape Storage; Radio System for Locating Emergency Workers; Software for Displaying High-Frequency Test Data; Capacitor-Chain Successive-Approximation ADC; Simpler Alternative to an Optimum FQPSK-B Viterbi Receiver; Multilayer Patch Antenna Surrounded by a Metallic Wall; Software To Secure Distributed Propulsion Simulations; Explicit Pore Pressure Material Model in Carbon-Cloth Phenolic; Meshed-Pumpkin Super-Pressure Balloon Design; Corrosion Inhibitors as Penetrant Dyes for Radiography; Transparent Metal-Salt-Filled Polymeric Radiation Shields; Lightweight Energy Absorbers for Blast Containers; Brush-Wheel Samplers for Planetary Exploration; Dry Process for Making Polyimide/ Carbon-and-Boron-Fiber Tape; Relatively Inexpensive Rapid Prototyping of Small Parts; Magnetic Field Would Reduce Electron Backstreaming in Ion Thrusters; Alternative Electrochemical Systems for Ozonation of Water; Interferometer for Measuring Displacement to Within 20 pm; UV-Enhanced IR Raman System for Identifying Biohazards; Prognostics Methodology for Complex Systems; Algorithms for Haptic Rendering of 3D Objects; Modeling and Control of Aerothermoelastic Effects; Processing Digital Imagery to Enhance Perceptions of Realism; Analysis of Designs of Space Laboratories; Shields for Enhanced Protection Against High-Speed Debris; Study of Dislocation-Ordered In(x)Ga(1-x)As/GaAs Quantum Dots; and Tilt-Sensitivity Analysis for Space Telescopes.
A sup-score test for the cure fraction in mixture models for long-term survivors.
Hsu, Wei-Wen; Todem, David; Kim, KyungMann
2016-12-01
The evaluation of cure fractions in oncology research under the well known cure rate model has attracted considerable attention in the literature, but most of the existing testing procedures have relied on restrictive assumptions. A common assumption has been to restrict the cure fraction to a constant under alternatives to homogeneity, thereby neglecting any information from covariates. This article extends the literature by developing a score-based statistic that incorporates covariate information to detect cure fractions, with the existing testing procedure serving as a special case. A complication of this extension, however, is that the implied hypotheses are not typical and standard regularity conditions to conduct the test may not even hold. Using empirical processes arguments, we construct a sup-score test statistic for cure fractions and establish its limiting null distribution as a functional of mixtures of chi-square processes. In practice, we suggest a simple resampling procedure to approximate this limiting distribution. Our simulation results show that the proposed test can greatly improve efficiency over tests that neglect the heterogeneity of the cure fraction under the alternative. The practical utility of the methodology is illustrated using ovarian cancer survival data with long-term follow-up from the surveillance, epidemiology, and end results registry. © 2016, The International Biometric Society.
Methodologies for processing plant material into acceptable food on a small scale
NASA Technical Reports Server (NTRS)
Parks, Thomas R.; Bindon, John N.; Bowles, Anthony J. G.; Golbitz, Peter; Lampi, Rauno A.; Marquardt, Robert F.
1994-01-01
Based on the Controlled Environment Life Support System (CELSS) production of only four crops, wheat, white potatoes, soybeans, and sweet potatoes; a crew size of twelve; a daily planting/harvesting regimen; and zero-gravity conditions, estimates were made on the quantity of food that would need to be grown to provide adequate nutrition; and the corresponding amount of biomass that would result. Projections were made of the various types of products that could be made from these crops, the unit operations that would be involved, and what menu capability these products could provide. Equipment requirements to perform these unit operations were screened to identify commercially available units capable of operating (or being modified to operate) under CELSS/zero-gravity conditions. Concept designs were developed for those equipment needs for which no suitable units were commercially available. Prototypes of selected concept designs were constructed and tested on a laboratory scale, as were selected commercially available units. This report discusses the practical considerations taken into account in the various design alternatives, some of the many product/process factors that relate to equipment development, and automation alternatives. Recommendations are made on both general and specific areas in which it was felt additional investigation would benefit CELSS missions.
Material selection and assembly method of battery pack for compact electric vehicle
NASA Astrophysics Data System (ADS)
Lewchalermwong, N.; Masomtob, M.; Lailuck, V.; Charoenphonphanich, C.
2018-01-01
Battery packs become the key component in electric vehicles (EVs). The main costs of which are battery cells and assembling processes. The battery cell is indeed priced from battery manufacturers while the assembling cost is dependent on battery pack designs. Battery pack designers need overall cost as cheap as possible, but it still requires high performance and more safety. Material selection and assembly method as well as component design are very important to determine the cost-effectiveness of battery modules and battery packs. Therefore, this work presents Decision Matrix, which can aid in the decision-making process of component materials and assembly methods for a battery module design and a battery pack design. The aim of this study is to take the advantage of incorporating Architecture Analysis method into decision matrix methods by capturing best practices for conducting design architecture analysis in full account of key design components critical to ensure efficient and effective development of the designs. The methodology also considers the impacts of choice-alternatives along multiple dimensions. Various alternatives for materials and assembly techniques of battery pack are evaluated, and some sample costs are presented. Due to many components in the battery pack, only seven components which are positive busbar and Z busbar are represented in this paper for using decision matrix methods.
The Otto-engine-equivalent vehicle concept
NASA Technical Reports Server (NTRS)
Dowdy, M. W.; Couch, M. D.
1978-01-01
A vehicle comparison methodology based on the Otto-Engine Equivalent (OEE) vehicle concept is described. As an illustration of this methodology, the concept is used to make projections of the fuel economy potential of passenger cars using various alternative power systems. Sensitivities of OEE vehicle results to assumptions made in the calculational procedure are discussed. Factors considered include engine torque boundary, rear axle ratio, performance criteria, engine transient response, and transmission shift logic.
ERIC Educational Resources Information Center
Abdallah, Mahmoud M. S.; Wegerif, Rupert B.
2014-01-01
This article discusses educational design-based research (DBR) as an emerging paradigm/methodology in educational enquiry that can be used as a mixed-method, problem-oriented research framework, and thus can act as an alternative to other traditional paradigms/methodologies prominent within the Egyptian context of educational enquiry. DBR is often…
What lies behind crop decisions?Coming to terms with revealing farmers' preferences
NASA Astrophysics Data System (ADS)
Gomez, C.; Gutierrez, C.; Pulido-Velazquez, M.; López Nicolás, A.
2016-12-01
The paper offers a fully-fledged applied revealed preference methodology to screen and represent farmers' choices as the solution of an optimal program involving trade-offs among the alternative welfare outcomes of crop decisions such as profits, income security and management easiness. The recursive two-stage method is proposed as an alternative to cope with the methodological problems inherent to common practice positive mathematical program methodologies (PMP). Differently from PMP, in the model proposed in this paper, the non-linear costs that are required for both calibration and smooth adjustment are not at odds with the assumptions of linear Leontief technologies and fixed crop prices and input costs. The method frees the model from ad-hoc assumptions about costs and then recovers the potential of economic analysis as a means to understand the rationale behind observed and forecasted farmers' decisions and then to enhance the potential of the model to support policy making in relevant domains such as agricultural policy, water management, risk management and climate change adaptation. After the introduction, where the methodological drawbacks and challenges are set up, section two presents the theoretical model, section three develops its empirical application and presents its implementation to a Spanish irrigation district and finally section four concludes and makes suggestions for further research.
ERIC Educational Resources Information Center
Baker, Bruce D.
2011-01-01
This article applies the education cost function methodology in order to estimate additional costs associated with black student concentration and with alternative, race-neutral measures of urban poverty. Recent research highlights the continued importance of the role of race in educational outcomes, and how the intersection of peer group effects…
ERIC Educational Resources Information Center
Dickenson, Tammiee S.; Gilmore, Joanna A.; Price, Karen J.; Bennett, Heather L.
2013-01-01
This study evaluated the benefits of item enhancements applied to science-inquiry items for incorporation into an alternate assessment based on modified achievement standards for high school students. Six items were included in the cognitive lab sessions involving both students with and without disabilities. The enhancements (e.g., use of visuals,…
Free Radical Chemistry Enabled by Visible Light-Induced Electron Transfer.
Staveness, Daryl; Bosque, Irene; Stephenson, Corey R J
2016-10-18
Harnessing visible light as the driving force for chemical transformations generally offers a more environmentally friendly alternative compared with classical synthetic methodology. The transition metal-based photocatalysts commonly employed in photoredox catalysis absorb efficiently in the visible spectrum, unlike most organic substrates, allowing for orthogonal excitation. The subsequent excited states are both more reducing and more oxidizing than the ground state catalyst and are competitive with some of the more powerful single-electron oxidants or reductants available to organic chemists yet are simply accessed via irradiation. The benefits of this strategy have proven particularly useful in radical chemistry, a field that traditionally employs rather toxic and hazardous reagents to generate the desired intermediates. In this Account, we discuss our efforts to leverage visible light photoredox catalysis in radical-based bond-forming and bond-cleaving events for which few, if any, environmentally benign alternatives exist. Mechanistic investigations have driven our contributions in this field, for both facilitating desired transformations and offering new, unexpected opportunities. In fact, our total synthesis of (+)-gliocladin C was only possible upon elucidating the propensity for various trialkylamine additives to elicit a dual behavior as both a reductive quencher and a H-atom donor. Importantly, while natural product synthesis was central to our initial motivations to explore these photochemical processes, we have since demonstrated applicability within other subfields of chemistry, and our evaluation of flow technologies demonstrates the potential to translate these results from the bench to pilot scale. Our forays into photoredox catalysis began with fundamental methodology, providing a tin-free reductive dehalogenation that exchanged the gamut of hazardous reagents previously employed for such a transformation for visible light-mediated, ambient temperature conditions. Evolving from this work, a new avenue toward atom transfer radical addition (ATRA) chemistry was developed, enabling dual functionalization of both double and triple bonds. Importantly, we have also expanded our portfolio to target clinically relevant scaffolds. Photoredox catalysis proved effective in generating high value fluorinated alkyl radicals through the use of abundantly available starting materials, providing access to libraries of trifluoromethylated (hetero)arenes as well as intriguing gem-difluoro benzyl motifs via a novel photochemical radical Smiles rearrangement. Finally, we discuss a photochemical strategy toward sustainable lignin processing through selective C-O bond cleavage methodology. The collection of these efforts is meant to highlight the potential for visible light-mediated radical chemistry to impact a variety of industrial sectors.
Test Driven Development of Scientific Models
NASA Technical Reports Server (NTRS)
Clune, Thomas L.
2014-01-01
Test-Driven Development (TDD), a software development process that promises many advantages for developer productivity and software reliability, has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By testimony, many developers find TDD to be addicting after only a few days of exposure, and find it unthinkable to return to previous practices.After a brief overview of the TDD process and my experience in applying the methodology for development activities at Goddard, I will delve more deeply into some of the challenges that are posed by numerical and scientific software as well as tools and implementation approaches that should address those challenges.
Garcia-Allende, P Beatriz; Mirapeix, Jesus; Conde, Olga M; Cobo, Adolfo; Lopez-Higuera, Jose M
2009-01-01
Plasma optical spectroscopy is widely employed in on-line welding diagnostics. The determination of the plasma electron temperature, which is typically selected as the output monitoring parameter, implies the identification of the atomic emission lines. As a consequence, additional processing stages are required with a direct impact on the real time performance of the technique. The line-to-continuum method is a feasible alternative spectroscopic approach and it is particularly interesting in terms of its computational efficiency. However, the monitoring signal highly depends on the chosen emission line. In this paper, a feature selection methodology is proposed to solve the uncertainty regarding the selection of the optimum spectral band, which allows the employment of the line-to-continuum method for on-line welding diagnostics. Field test results have been conducted to demonstrate the feasibility of the solution.
Logic as Marr's Computational Level: Four Case Studies.
Baggio, Giosuè; van Lambalgen, Michiel; Hagoort, Peter
2015-04-01
We sketch four applications of Marr's levels-of-analysis methodology to the relations between logic and experimental data in the cognitive neuroscience of language and reasoning. The first part of the paper illustrates the explanatory power of computational level theories based on logic. We show that a Bayesian treatment of the suppression task in reasoning with conditionals is ruled out by EEG data, supporting instead an analysis based on defeasible logic. Further, we describe how results from an EEG study on temporal prepositions can be reanalyzed using formal semantics, addressing a potential confound. The second part of the article demonstrates the predictive power of logical theories drawing on EEG data on processing progressive constructions and on behavioral data on conditional reasoning in people with autism. Logical theories can constrain processing hypotheses all the way down to neurophysiology, and conversely neuroscience data can guide the selection of alternative computational level models of cognition. Copyright © 2014 Cognitive Science Society, Inc.
Morgenstern, Jon; Naqvi, Nasir H; Debellis, Robert; Breiter, Hans C
2013-06-01
In the last decade, there has been an upsurge of interest in understanding the mechanisms of behavior change (MOBC) and effective behavioral interventions as a strategy to improve addiction-treatment efficacy. However, there remains considerable uncertainty about how treatment research should proceed to address the MOBC issue. In this article, we argue that limitations in the underlying models of addiction that inform behavioral treatment pose an obstacle to elucidating MOBC. We consider how advances in the cognitive neuroscience of addiction offer an alternative conceptual and methodological approach to studying the psychological processes that characterize addiction, and how such advances could inform treatment process research. In addition, we review neuroimaging studies that have tested aspects of neurocognitive theories as a strategy to inform addiction therapies and discuss future directions for transdisciplinary collaborations across cognitive neuroscience and MOBC research. 2013 APA, all rights reserved
Gomariz, María; Blaya, Salvador; Acebal, Pablo; Carretero, Luis
2014-01-01
We theoretically and experimentally analyze the formation of thick Purple Membrane (PM) polyacrylamide (PA) films by means of optical spectroscopy by considering the absorption of bacteriorhodopsin and scattering. We have applied semiclassical quantum mechanical techniques for the calculation of absorption spectra by taking into account the Fano effects on the ground state of bacteriorhodopsin. A model of the formation of PM-polyacrylamide films has been proposed based on the growth of polymeric chains around purple membrane. Experimentally, the temporal evolution of the polymerization process of acrylamide has been studied as function of the pH solution, obtaining a good correspondence to the proposed model. Thus, due to the formation of intermediate bacteriorhodopsin-doped nanogel, by controlling the polymerization process, an alternative methodology for the synthesis of bacteriorhodopsin-doped nanogels can be provided. PMID:25329473
Gomariz, María; Blaya, Salvador; Acebal, Pablo; Carretero, Luis
2014-01-01
We theoretically and experimentally analyze the formation of thick Purple Membrane (PM) polyacrylamide (PA) films by means of optical spectroscopy by considering the absorption of bacteriorhodopsin and scattering. We have applied semiclassical quantum mechanical techniques for the calculation of absorption spectra by taking into account the Fano effects on the ground state of bacteriorhodopsin. A model of the formation of PM-polyacrylamide films has been proposed based on the growth of polymeric chains around purple membrane. Experimentally, the temporal evolution of the polymerization process of acrylamide has been studied as function of the pH solution, obtaining a good correspondence to the proposed model. Thus, due to the formation of intermediate bacteriorhodopsin-doped nanogel, by controlling the polymerization process, an alternative methodology for the synthesis of bacteriorhodopsin-doped nanogels can be provided.
Nessen, Merel A; van der Zwaan, Dennis J; Grevers, Sander; Dalebout, Hans; Staats, Martijn; Kok, Esther; Palmblad, Magnus
2016-05-11
Proteomics methodology has seen increased application in food authentication, including tandem mass spectrometry of targeted species-specific peptides in raw, processed, or mixed food products. We have previously described an alternative principle that uses untargeted data acquisition and spectral library matching, essentially spectral counting, to compare and identify samples without the need for genomic sequence information in food species populations. Here, we present an interlaboratory comparison demonstrating how a method based on this principle performs in a realistic context. We also increasingly challenge the method by using data from different types of mass spectrometers, by trying to distinguish closely related and commercially important flatfish, and by analyzing heavily contaminated samples. The method was found to be robust in different laboratories, and 94-97% of the analyzed samples were correctly identified, including all processed and contaminated samples.
Morgenstern, Jon; Naqvi, Nasir H.; Debellis, Robert; Breiter, Hans C.
2013-01-01
In the last decade, there has been an upsurge of interest in understanding the mechanisms of behavior change (MOBC) and effective behavioral interventions as a strategy to improve addiction-treatment efficacy. However, there remains considerable uncertainty about how treatment research should proceed to address the MOBC issue. In this article, we argue that limitations in the underlying models of addiction that inform behavioral treatment pose an obstacle to elucidating MOBC. We consider how advances in the cognitive neuroscience of addiction offer an alternative conceptual and methodological approach to studying the psychological processes that characterize addiction, and how such advances could inform treatment process research. In addition, we review neuroimaging studies that have tested aspects of neurocognitive theories as a strategy to inform addiction therapies and discuss future directions for transdisciplinary collaborations across cognitive neuroscience and MOBC research. PMID:23586452
New insights into soil temperature time series modeling: linear or nonlinear?
NASA Astrophysics Data System (ADS)
Bonakdari, Hossein; Moeeni, Hamid; Ebtehaj, Isa; Zeynoddin, Mohammad; Mahoammadian, Abdolmajid; Gharabaghi, Bahram
2018-03-01
Soil temperature (ST) is an important dynamic parameter, whose prediction is a major research topic in various fields including agriculture because ST has a critical role in hydrological processes at the soil surface. In this study, a new linear methodology is proposed based on stochastic methods for modeling daily soil temperature (DST). With this approach, the ST series components are determined to carry out modeling and spectral analysis. The results of this process are compared with two linear methods based on seasonal standardization and seasonal differencing in terms of four DST series. The series used in this study were measured at two stations, Champaign and Springfield, at depths of 10 and 20 cm. The results indicate that in all ST series reviewed, the periodic term is the most robust among all components. According to a comparison of the three methods applied to analyze the various series components, it appears that spectral analysis combined with stochastic methods outperformed the seasonal standardization and seasonal differencing methods. In addition to comparing the proposed methodology with linear methods, the ST modeling results were compared with the two nonlinear methods in two forms: considering hydrological variables (HV) as input variables and DST modeling as a time series. In a previous study at the mentioned sites, Kim and Singh Theor Appl Climatol 118:465-479, (2014) applied the popular Multilayer Perceptron (MLP) neural network and Adaptive Neuro-Fuzzy Inference System (ANFIS) nonlinear methods and considered HV as input variables. The comparison results signify that the relative error projected in estimating DST by the proposed methodology was about 6%, while this value with MLP and ANFIS was over 15%. Moreover, MLP and ANFIS models were employed for DST time series modeling. Due to these models' relatively inferior performance to the proposed methodology, two hybrid models were implemented: the weights and membership function of MLP and ANFIS (respectively) were optimized with the particle swarm optimization (PSO) algorithm in conjunction with the wavelet transform and nonlinear methods (Wavelet-MLP & Wavelet-ANFIS). A comparison of the proposed methodology with individual and hybrid nonlinear models in predicting DST time series indicates the lowest Akaike Information Criterion (AIC) index value, which considers model simplicity and accuracy simultaneously at different depths and stations. The methodology presented in this study can thus serve as an excellent alternative to complex nonlinear methods that are normally employed to examine DST.
Review of Recent Methodological Developments in Group-Randomized Trials: Part 2-Analysis.
Turner, Elizabeth L; Prague, Melanie; Gallis, John A; Li, Fan; Murray, David M
2017-07-01
In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have updated that review with developments in analysis of the past 13 years, with a companion article to focus on developments in design. We discuss developments in the topics of the earlier review (e.g., methods for parallel-arm GRTs, individually randomized group-treatment trials, and missing data) and in new topics, including methods to account for multiple-level clustering and alternative estimation methods (e.g., augmented generalized estimating equations, targeted maximum likelihood, and quadratic inference functions). In addition, we describe developments in analysis of alternative group designs (including stepped-wedge GRTs, network-randomized trials, and pseudocluster randomized trials), which require clustering to be accounted for in their design and analysis.
Risk-informed selection of a highway trajectory in the neighborhood of an oil-refinery.
Papazoglou, I A; Nivolianitou, Z; Aneziris, O; Christou, M D; Bonanos, G
1999-06-11
A methodology for characterizing alternative trajectories of a new highway in the neighborhood of an oil-refinery with respect to the risk to public health is presented. The approach is based on a quantitative assessment of the risk that the storage facilities of flammable materials of the refinery pose to the users of the highway. Physical phenomena with a potential for detrimental consequences to public health such as BLEVE (Boiling Liquid Expanding Vapor Explosion), Unconfined Vapor Cloud Explosion, flash fire and pool fire are considered. Methodological and procedural steps for assessing the individual risk around the tank farm of the oil-refinery are presented. Based on the individual risk, group risk for each alternative highway trajectory is determined. Copyright 1999 Elsevier Science B.V.
Alternative power supply systems for remote industrial customers
NASA Astrophysics Data System (ADS)
Kharlamova, N. V.; Khalyasmaa, A. I.; Eroshenko, S. A.
2017-06-01
The paper addresses the problem of alternative power supply of remote industrial clusters with renewable electric energy generation. As a result of different technologies comparison, consideration is given to wind energy application. The authors present a methodology of mean expected wind generation output calculation, based on Weibull distribution, which provides an effective express-tool for preliminary assessment of required installed generation capacity. The case study is based on real data including database of meteorological information, relief characteristics, power system topology etc. Wind generation feasibility estimation for a specific territory is followed by power flow calculations using Monte Carlo methodology. Finally, the paper provides a set of recommendations to ensure safe and reliable power supply for the final customers and, subsequently, to provide sustainable development of the regions, located far from megalopolises and industrial centres.
Proceedings of the NASA Microbiology Workshop
NASA Technical Reports Server (NTRS)
Roman, M. C.; Jan, D. L.
2012-01-01
Long-term spaceflight is characterized by extraordinary challenges to maintain the life-supporting instrumentation free from microbial contamination and the crew healthy. The methodology currently employed for microbial monitoring in space stations or short spaceflights within the orbit of Earth have been instrumental in safeguarding the success of the missions, but suffers certain shortcomings that are critical for long spaceflights. This workshop addressed current practices and methodologies for microbial monitoring in space systems, and identified and discussed promising alternative methodologies and cutting-edge technologies for pursuit in the microbial monitoring that hold promise for supporting future NASA long-duration space missions.
Development of a support tool for complex decision-making in the provision of rural maternity care.
Hearns, Glen; Klein, Michael C; Trousdale, William; Ulrich, Catherine; Butcher, David; Miewald, Christiana; Lindstrom, Ronald; Eftekhary, Sahba; Rosinski, Jessica; Gómez-Ramírez, Oralia; Procyk, Andrea
2010-02-01
Decisions in the organization of safe and effective rural maternity care are complex, difficult, value laden and fraught with uncertainty, and must often be based on imperfect information. Decision analysis offers tools for addressing these complexities in order to help decision-makers determine the best use of resources and to appreciate the downstream effects of their decisions. To develop a maternity care decision-making tool for the British Columbia Northern Health Authority (NH) for use in low birth volume settings. Based on interviews with community members, providers, recipients and decision-makers, and employing a formal decision analysis approach, we sought to clarify the influences affecting rural maternity care and develop a process to generate a set of value-focused objectives for use in designing and evaluating rural maternity care alternatives. Four low-volume communities with variable resources (with and without on-site births, with or without caesarean section capability) were chosen. Physicians (20), nurses (18), midwives and maternity support service providers (4), local business leaders, economic development officials and elected officials (12), First Nations (women [pregnant and non-pregnant], chiefs and band members) (40), social workers (3), pregnant women (2) and NH decision-makers/administrators (17). We developed a Decision Support Manual to assist with assessing community needs and values, context for decision-making, capacity of the health authority or healthcare providers, identification of key objectives for decision-making, developing alternatives for care, and a process for making trade-offs and balancing multiple objectives. The manual was deemed an effective tool for the purpose by the client, NH. Beyond assisting the decision-making process itself, the methodology provides a transparent communication tool to assist in making difficult decisions. While the manual was specifically intended to deal with rural maternity issues, the NH decision-makers feel the method can be easily adapted to assist decision-making in other contexts in medicine where there are conflicting objectives, values and opinions. Decisions on the location of new facilities or infrastructure, or enhancing or altering services such as surgical or palliative care, would be examples of complex decisions that might benefit from this methodology.
Development of a Support Tool for Complex Decision-Making in the Provision of Rural Maternity Care
Hearns, Glen; Klein, Michael C.; Trousdale, William; Ulrich, Catherine; Butcher, David; Miewald, Christiana; Lindstrom, Ronald; Eftekhary, Sahba; Rosinski, Jessica; Gómez-Ramírez, Oralia; Procyk, Andrea
2010-01-01
Context: Decisions in the organization of safe and effective rural maternity care are complex, difficult, value laden and fraught with uncertainty, and must often be based on imperfect information. Decision analysis offers tools for addressing these complexities in order to help decision-makers determine the best use of resources and to appreciate the downstream effects of their decisions. Objective: To develop a maternity care decision-making tool for the British Columbia Northern Health Authority (NH) for use in low birth volume settings. Design: Based on interviews with community members, providers, recipients and decision-makers, and employing a formal decision analysis approach, we sought to clarify the influences affecting rural maternity care and develop a process to generate a set of value-focused objectives for use in designing and evaluating rural maternity care alternatives. Setting: Four low-volume communities with variable resources (with and without on-site births, with or without caesarean section capability) were chosen. Participants: Physicians (20), nurses (18), midwives and maternity support service providers (4), local business leaders, economic development officials and elected officials (12), First Nations (women [pregnant and non-pregnant], chiefs and band members) (40), social workers (3), pregnant women (2) and NH decision-makers/administrators (17). Results: We developed a Decision Support Manual to assist with assessing community needs and values, context for decision-making, capacity of the health authority or healthcare providers, identification of key objectives for decision-making, developing alternatives for care, and a process for making trade-offs and balancing multiple objectives. The manual was deemed an effective tool for the purpose by the client, NH. Conclusions: Beyond assisting the decision-making process itself, the methodology provides a transparent communication tool to assist in making difficult decisions. While the manual was specifically intended to deal with rural maternity issues, the NH decision-makers feel the method can be easily adapted to assist decision-making in other contexts in medicine where there are conflicting objectives, values and opinions. Decisions on the location of new facilities or infrastructure, or enhancing or altering services such as surgical or palliative care, would be examples of complex decisions that might benefit from this methodology. PMID:21286270
Logic-based models in systems biology: a predictive and parameter-free network analysis method†
Wynn, Michelle L.; Consul, Nikita; Merajver, Sofia D.
2012-01-01
Highly complex molecular networks, which play fundamental roles in almost all cellular processes, are known to be dysregulated in a number of diseases, most notably in cancer. As a consequence, there is a critical need to develop practical methodologies for constructing and analysing molecular networks at a systems level. Mathematical models built with continuous differential equations are an ideal methodology because they can provide a detailed picture of a network’s dynamics. To be predictive, however, differential equation models require that numerous parameters be known a priori and this information is almost never available. An alternative dynamical approach is the use of discrete logic-based models that can provide a good approximation of the qualitative behaviour of a biochemical system without the burden of a large parameter space. Despite their advantages, there remains significant resistance to the use of logic-based models in biology. Here, we address some common concerns and provide a brief tutorial on the use of logic-based models, which we motivate with biological examples. PMID:23072820
A Framework for Reproducible Latent Fingerprint Enhancements.
Carasso, Alfred S
2014-01-01
Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that appproach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems, (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information, in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.
A Framework for Reproducible Latent Fingerprint Enhancements
Carasso, Alfred S.
2014-01-01
Photoshop processing1 of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that appproach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems, (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information, in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology. PMID:26601028
Rescia, Alejandro J; Astrada, Elizabeth N; Bono, Julieta; Blasco, Carlos A; Meli, Paula; Adámoli, Jorge M
2006-08-01
A linear engineering project--i.e. a pipeline--has a potential long- and short-term impact on the environment and on the inhabitants therein. We must find better, less expensive, and less time-consuming ways to obtain information on the environment and on any modifications resulting from anthropic activity. We need scientifically sound, rapid and affordable assessment and monitoring methods. Construction companies, industries and the regulating government organisms lack the resources needed to conduct long-term basic studies of the environment. Thus there is a need to make the necessary adjustments and improvements in the environmental data considered useful for this development project. More effective and less costly methods are generally needed. We characterized the landscape of the study area, situated in the center and north-east of Argentina. Little is known of the ecology of this region and substantial research is required in order to develop sustainable uses and, at the same time, to develop methods for reducing impacts, both primary and secondary, resulting from anthropic activity in this area. Furthermore, we made an assessment of the environmental impact of the planned linear project, applying an ad hoc impact index, and we analyzed the different alternatives for a corridor, each one of these involving different sections of the territory. Among the alternative corridors considered, this study locates the most suitable ones in accordance with a selection criterion based on different environmental and conservation aspects. We selected the corridor that we considered to be the most compatible--i.e. with the least potential environmental impact--for the possible construction and operation of the linear project. This information, along with suitable measures for mitigating possible impacts, should be the basis of an environmental management plan for the design process and location of the project. We pointed out the objectivity and efficiency of this methodological approach, along with the possibility of integrating the information in order to allow for the application thereof in this type of study.
The statistical validity of nursing home survey findings.
Woolley, Douglas C
2011-11-01
The Medicare nursing home survey is a high-stakes process whose findings greatly affect nursing homes, their current and potential residents, and the communities they serve. Therefore, survey findings must achieve high validity. This study looked at the validity of one key assessment made during a nursing home survey: the observation of the rate of errors in administration of medications to residents (med-pass). Statistical analysis of the case under study and of alternative hypothetical cases. A skilled nursing home affiliated with a local medical school. The nursing home administrators and the medical director. Observational study. The probability that state nursing home surveyors make a Type I or Type II error in observing med-pass error rates, based on the current case and on a series of postulated med-pass error rates. In the common situation such as our case, where med-pass errors occur at slightly above a 5% rate after 50 observations, and therefore trigger a citation, the chance that the true rate remains above 5% after a large number of observations is just above 50%. If the true med-pass error rate were as high as 10%, and the survey team wished to achieve 75% accuracy in determining that a citation was appropriate, they would have to make more than 200 med-pass observations. In the more common situation where med pass errors are closer to 5%, the team would have to observe more than 2000 med-passes to achieve even a modest 75% accuracy in their determinations. In settings where error rates are low, large numbers of observations of an activity must be made to reach acceptable validity of estimates for the true rates of errors. In observing key nursing home functions with current methodology, the State Medicare nursing home survey process does not adhere to well-known principles of valid error determination. Alternate approaches in survey methodology are discussed. Copyright © 2011 American Medical Directors Association. Published by Elsevier Inc. All rights reserved.
Health economic evaluation: important principles and methodology.
Rudmik, Luke; Drummond, Michael
2013-06-01
To discuss health economic evaluation and improve the understanding of common methodology. This article discusses the methodology for the following types of economic evaluations: cost-minimization, cost-effectiveness, cost-utility, cost-benefit, and economic modeling. Topics include health-state utility measures, the quality-adjusted life year (QALY), uncertainty analysis, discounting, decision tree analysis, and Markov modeling. Economic evaluation is the comparative analysis of alternative courses of action in terms of both their costs and consequences. With increasing health care expenditure and limited resources, it is important for physicians to consider the economic impact of their interventions. Understanding common methodology involved in health economic evaluation will improve critical appraisal of the literature and optimize future economic evaluations. Copyright © 2012 The American Laryngological, Rhinological and Otological Society, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sebastiani, M.; Llambi, L.D.; Marquez, E.
1998-07-01
In Venezuela, the idea of tiering information between land-use ordering instruments and impact assessment is absent. In this article the authors explore a methodological alternative to bridge the information presented in land-use ordering instruments with the information requirements for impact assessment. The methodology is based on the steps carried out for an environmental impact assessment as well as on those considered to develop land-use ordering instruments. The methodology is applied to the territorial ordering plan and its proposal for the protection zone of the Cataniapo River basin. The purpose of the protection zone is to preserve the water quality andmore » quantity of the river basin for human consumption.« less
Eigenvalue Contributon Estimator for Sensitivity Calculations with TSUNAMI-3D
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rearden, Bradley T; Williams, Mark L
2007-01-01
Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysismore » sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.« less
Differential HFE Gene Expression Is Regulated by Alternative Splicing in Human Tissues
Proença, Daniela; Faustino, Paula
2011-01-01
Background The pathophysiology of HFE-derived Hereditary Hemochromatosis and the function of HFE protein in iron homeostasis remain uncertain. Also, the role of alternative splicing in HFE gene expression regulation and the possible function of the corresponding protein isoforms are still unknown. The aim of this study was to gain insights into the physiological significance of these alternative HFE variants. Methodology/Principal Findings Alternatively spliced HFE transcripts in diverse human tissues were identified by RT-PCR, cloning and sequencing. Total HFE transcripts, as well as two alternative splicing transcripts were quantified using a real-time PCR methodology. Intracellular localization, trafficking and protein association of GFP-tagged HFE protein variants were analysed in transiently transfected HepG2 cells by immunoprecipitation and immunofluorescence assays. Alternatively spliced HFE transcripts present both level- and tissue-specificity. Concerning the exon 2 skipping and intron 4 inclusion transcripts, the liver presents the lowest relative level, while duodenum presents one of the highest amounts. The protein resulting from exon 2 skipping transcript is unable to associate with β2M and TfR1 and reveals an ER retention. Conversely, the intron 4 inclusion transcript gives rise to a truncated, soluble protein (sHFE) that is mostly secreted by cells to the medium in association with β2M. Conclusions/Significance HFE gene post-transcriptional regulation is clearly affected by a tissue-dependent alternative splicing mechanism. Among the corresponding proteins, a sHFE isoform stands out, which upon being secreted into the bloodstream, may act in remote tissues. It could be either an agonist or antagonist of the full length HFE, through hepcidin expression regulation in the liver or by controlling dietary iron absorption in the duodenum. PMID:21407826
Schievano, Andrea; D'Imporzano, Giuliana; Salati, Silvia; Adani, Fabrizio
2011-09-01
The mass balance (input/output mass flows) of full-scale anaerobic digestion (AD) processes should be known for a series of purposes, e.g. to understand carbon and nutrients balances, to evaluate the contribution of AD processes to elemental cycles, especially when digestates are applied to agricultural land and to measure the biodegradation yields and the process efficiency. In this paper, three alternative methods were studied, to determine the mass balance in full-scale processes, discussing their reliability and applicability. Through a 1-year survey on three full-scale AD plants and through 38 laboratory-scale batch digesters, the congruency of the considered methods was demonstrated and a linear equation was provided that allows calculating the wet weight losses (WL) from the methane produced (MP) by the plant (WL=41.949*MP+20.853, R(2)=0.950, p<0.01). Additionally, this new tool was used to calculate carbon, nitrogen, phosphorous and potassium balances of the three observed AD plants. Copyright © 2011 Elsevier Ltd. All rights reserved.
Lunar-base construction equipment and methods evaluation
NASA Technical Reports Server (NTRS)
Boles, Walter W.; Ashley, David B.; Tucker, Richard L.
1993-01-01
A process for evaluating lunar-base construction equipment and methods concepts is presented. The process is driven by the need for more quantitative, systematic, and logical methods for assessing further research and development requirements in an area where uncertainties are high, dependence upon terrestrial heuristics is questionable, and quantitative methods are seldom applied. Decision theory concepts are used in determining the value of accurate information and the process is structured as a construction-equipment-and-methods selection methodology. Total construction-related, earth-launch mass is the measure of merit chosen for mathematical modeling purposes. The work is based upon the scope of the lunar base as described in the National Aeronautics and Space Administration's Office of Exploration's 'Exploration Studies Technical Report, FY 1989 Status'. Nine sets of conceptually designed construction equipment are selected as alternative concepts. It is concluded that the evaluation process is well suited for assisting in the establishment of research agendas in an approach that is first broad, with a low level of detail, followed by more-detailed investigations into areas that are identified as critical due to high degrees of uncertainty and sensitivity.
24 CFR 3288.110 - Alternative Process agreements.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Alternative Process agreements... HOUSING AND URBAN DEVELOPMENT MANUFACTURED HOME DISPUTE RESOLUTION PROGRAM Alternative Process in HUD-Administered States § 3288.110 Alternative Process agreements. (a) Required agreement. To use the Alternative...
Intelligent systems/software engineering methodology - A process to manage cost and risk
NASA Technical Reports Server (NTRS)
Friedlander, Carl; Lehrer, Nancy
1991-01-01
A systems development methodology is discussed that has been successfully applied to the construction of a number of intelligent systems. This methodology is a refinement of both evolutionary and spiral development methodologies. It is appropriate for development of intelligent systems. The application of advanced engineering methodology to the development of software products and intelligent systems is an important step toward supporting the transition of AI technology into aerospace applications. A description of the methodology and the process model from which it derives is given. Associated documents and tools are described which are used to manage the development process and record and report the emerging design.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Buddadee, Bancha; Wirojanagud, Wanpen; Watts, Daniel J.
In this paper, a multi-objective optimization model is proposed as a tool to assist in deciding for the proper utilization scheme of excess bagasse produced in sugarcane industry. Two major scenarios for excess bagasse utilization are considered in the optimization. The first scenario is the typical situation when excess bagasse is used for the onsite electricity production. In case of the second scenario, excess bagasse is processed for the offsite ethanol production. Then the ethanol is blended with an octane rating of 91 gasoline by a portion of 10% and 90% by volume respectively and the mixture is used asmore » alternative fuel for gasoline vehicles in Thailand. The model proposed in this paper called 'Environmental System Optimization' comprises the life cycle impact assessment of global warming potential (GWP) and the associated cost followed by the multi-objective optimization which facilitates in finding out the optimal proportion of the excess bagasse processed in each scenario. Basic mathematical expressions for indicating the GWP and cost of the entire process of excess bagasse utilization are taken into account in the model formulation and optimization. The outcome of this study is the methodology developed for decision-making concerning the excess bagasse utilization available in Thailand in view of the GWP and economic effects. A demonstration example is presented to illustrate the advantage of the methodology which may be used by the policy maker. The methodology developed is successfully performed to satisfy both environmental and economic objectives over the whole life cycle of the system. It is shown in the demonstration example that the first scenario results in positive GWP while the second scenario results in negative GWP. The combination of these two scenario results in positive or negative GWP depending on the preference of the weighting given to each objective. The results on economics of all scenarios show the satisfied outcomes.« less
Howerter, Amy; Hollenstein, Tom; Boon, Heather; Niemeyer, Kathryn; Brule, David
2012-01-01
This paper presents state space grids (SSGs) as a mathematically less intensive methodology for process-oriented research beyond traditional qualitative and quantitative approaches in whole systems of complementary and alternative medicine (WS-CAM). SSGs, originally applied in developmental psychology research, offer a logical, flexible, and accessible tool for capturing emergent changes in the temporal dynamics of patient behaviors, manifestations of resilience, and outcomes. The SSG method generates a two-dimensional visualization and quantification of the inter-relationships between variables on a moment-to-moment basis. SSGs can describe dyadic interactive behavior in real time and, followed longitudinally, allow evaluation of how change occurs over extended time periods. Practice theories of WS-CAM encompass the holistic health concept of whole-person outcomes, including nonlinear pathways to complex, multidimensional changes. Understanding how the patient as a living system arrives at these outcomes requires studying the process of healing, e.g., sudden abrupt worsening and/or improvements, 'healing crises', and 'unstuckness', from which the multiple inter-personal and intra-personal outcomes emerge. SSGs can document the indirect, emergent dynamic effects of interventions, transitional phases, and the mutual interaction of patient and environment that underlie the healing process. Two WS-CAM research exemplars are provided to demonstrate the feasibility of using SSGs in both dyadic and within-patient contexts, and to illustrate the possibilities for clinically relevant, process-focused hypotheses. This type of research has the potential to help clinicians select, modify and optimize treatment plans earlier in the course of care and produce more successful outcomes for more patients. Copyright © 2012 S. Karger AG, Basel.
Accurate 3d Scanning of Damaged Ancient Greek Inscriptions for Revealing Weathered Letters
NASA Astrophysics Data System (ADS)
Papadaki, A. I.; Agrafiotis, P.; Georgopoulos, A.; Prignitz, S.
2015-02-01
In this paper two non-invasive non-destructive alternative techniques to the traditional and invasive technique of squeezes are presented alongside with specialized developed processing methods, aiming to help the epigraphists to reveal and analyse weathered letters in ancient Greek inscriptions carved in masonry or marble. The resulting 3D model would serve as a detailed basis for the epigraphists to try to decipher the inscription. The data were collected by using a Structured Light scanner. The creation of the final accurate three dimensional model is a complicated procedure requiring large computation cost and human effort. It includes the collection of geometric data in limited space and time, the creation of the surface, the noise filtering and the merging of individual surfaces. The use of structured light scanners is time consuming and requires costly hardware and software. Therefore an alternative methodology for collecting 3D data of the inscriptions was also implemented for reasons of comparison. Hence, image sequences from varying distances were collected using a calibrated DSLR camera aiming to reconstruct the 3D scene through SfM techniques in order to evaluate the efficiency and the level of precision and detail of the obtained reconstructed inscriptions. Problems in the acquisition processes as well as difficulties in the alignment step and mesh optimization are also encountered. A meta-processing framework is proposed and analysed. Finally, the results of processing and analysis and the different 3D models are critically inspected and then evaluated by a specialist in terms of accuracy, quality and detail of the model and the capability of revealing damaged and "hidden" letters.
Balneaves, Lynda G; Truant, Tracy L O; Kelly, Mary; Verhoef, Marja J; Davison, B Joyce
2007-08-01
The purpose of this study was to explore the personal and social processes women with breast cancer engaged in when making decisions about complementary and alternative medicine (CAM). The overall aim was to develop a conceptual model of the treatment decision-making process specific to breast cancer care and CAM that will inform future information and decision support strategies. Grounded theory methodology explored the decisions of women with breast cancer using CAM. Semistructured interviews were conducted with 20 women diagnosed with early-stage breast cancer. Following open, axial, and selective coding, the constant comparative method was used to identify key themes in the data and develop a conceptual model of the CAM decision-making process. The final decision-making model, Bridging the Gap, was comprised of four core concepts including maximizing choices/minimizing risks, experiencing conflict, gathering and filtering information, and bridging the gap. Women with breast cancer used one of three decision-making styles to address the paradigmatic, informational, and role conflict they experienced as a result of the gap they perceived between conventional care and CAM: (1) taking it one step at a time, (2) playing it safe, and (3) bringing it all together. Women with breast cancer face conflict and anxiety when making decisions about CAM within a conventional cancer care context. Information and decision support strategies are needed to ensure women are making safe, informed treatment decisions about CAM. The model, Bridging the Gap, provides a conceptual framework for future decision support interventions.
NASA Technical Reports Server (NTRS)
Bard, J. F.
1986-01-01
The role that automation, robotics, and artificial intelligence will play in Space Station operations is now beginning to take shape. Although there is only limited data on the precise nature of the payoffs that these technologies are likely to afford there is a general consensus that, at a minimum, the following benefits will be realized: increased responsiveness to innovation, lower operating costs, and reduction of exposure to hazards. Nevertheless, the question arises as to how much automation can be justified with the technical and economic constraints of the program? The purpose of this paper is to present a methodology which can be used to evaluate and rank different approaches to automating the functions and tasks planned for the Space Station. Special attention is given to the impact of advanced automation on human productivity. The methodology employed is based on the Analytic Hierarchy Process. This permits the introduction of individual judgements to resolve the confict that normally arises when incomparable criteria underly the selection process. Because of the large number of factors involved in the model, the overall problem is decomposed into four subproblems individually focusing on human productivity, economics, design, and operations, respectively. The results from each are then combined to yield the final rankings. To demonstrate the methodology, an example is developed based on the selection of an on-orbit assembly system. Five alternatives for performing this task are identified, ranging from an astronaut working in space, to a dexterous manipulator with sensory feedback. Computational results are presented along with their implications. A final parametric analysis shows that the outcome is locally insensitive to all but complete reversals in preference.
NASA Astrophysics Data System (ADS)
Kempka, Thomas; De Lucia, Marco; Kühn, Michael
2015-04-01
The integrated assessment of long-term site behaviour taking into account a high spatial resolution at reservoir scale requires a sophisticated methodology to represent coupled thermal, hydraulic, mechanical and chemical processes of relevance. Our coupling methodology considers the time-dependent occurrence and significance of multi-phase flow processes, mechanical effects and geochemical reactions (Kempka et al., 2014). Hereby, a simplified hydro-chemical coupling procedure was developed (Klein et al., 2013) and validated against fully coupled hydro-chemical simulations (De Lucia et al., 2015). The numerical simulation results elaborated for the pilot site Ketzin demonstrate that mechanical reservoir, caprock and fault integrity are maintained during the time of operation and that after 10,000 years CO2 dissolution is the dominating trapping mechanism and mineralization occurs on the order of 10 % to 25 % with negligible changes to porosity and permeability. De Lucia, M., Kempka, T., Kühn, M. A coupling alternative to reactive transport simulations for long-term prediction of chemical reactions in heterogeneous CO2 storage systems (2014) Geosci Model Dev Discuss 7:6217-6261. doi:10.5194/gmdd-7-6217-2014. Kempka, T., De Lucia, M., Kühn, M. Geomechanical integrity verification and mineral trapping quantification for the Ketzin CO2 storage pilot site by coupled numerical simulations (2014) Energy Procedia 63:3330-3338, doi:10.1016/j.egypro.2014.11.361. Klein E, De Lucia M, Kempka T, Kühn M. Evaluation of longterm mineral trapping at the Ketzin pilot site for CO2 storage: an integrative approach using geo-chemical modelling and reservoir simulation. Int J Greenh Gas Con 2013; 19:720-730. doi:10.1016/j.ijggc.2013.05.014.
Modeling Collective Animal Behavior with a Cognitive Perspective: A Methodological Framework
Weitz, Sebastian; Blanco, Stéphane; Fournier, Richard; Gautrais, Jacques; Jost, Christian; Theraulaz, Guy
2012-01-01
The last decades have seen an increasing interest in modeling collective animal behavior. Some studies try to reproduce as accurately as possible the collective dynamics and patterns observed in several animal groups with biologically plausible, individual behavioral rules. The objective is then essentially to demonstrate that the observed collective features may be the result of self-organizing processes involving quite simple individual behaviors. Other studies concentrate on the objective of establishing or enriching links between collective behavior researches and cognitive or physiological ones, which then requires that each individual rule be carefully validated. Here we discuss the methodological consequences of this additional requirement. Using the example of corpse clustering in ants, we first illustrate that it may be impossible to discriminate among alternative individual rules by considering only observational data collected at the group level. Six individual behavioral models are described: They are clearly distinct in terms of individual behaviors, they all reproduce satisfactorily the collective dynamics and distribution patterns observed in experiments, and we show theoretically that it is strictly impossible to discriminate two of these models even in the limit of an infinite amount of data whatever the accuracy level. A set of methodological steps are then listed and discussed as practical ways to partially overcome this problem. They involve complementary experimental protocols specifically designed to address the behavioral rules successively, conserving group-level data for the overall model validation. In this context, we highlight the importance of maintaining a sharp distinction between model enunciation, with explicit references to validated biological concepts, and formal translation of these concepts in terms of quantitative state variables and fittable functional dependences. Illustrative examples are provided of the benefits expected during the often long and difficult process of refining a behavioral model, designing adapted experimental protocols and inversing model parameters. PMID:22761685
Siontorou, Christina G; Batzias, Fragiskos A
2014-03-01
Biosensor technology began in the 1960s to revolutionize instrumentation and measurement. Despite the glucose sensor market success that revolutionized medical diagnostics, and artificial pancreas promise currently the approval stage, the industry is reluctant to capitalize on other relevant university-produced knowledge and innovation. On the other hand, the scientific literature is extensive and persisting, while the number of university-hosted biosensor groups is growing. Considering the limited marketability of biosensors compared to the available research output, the biosensor field has been used by the present authors as a suitable paradigm for developing a methodological combined framework for "roadmapping" university research output in this discipline. This framework adopts the basic principles of the Analytic Hierarchy Process (AHP), replacing the lower level of technology alternatives with internal barriers (drawbacks, limitations, disadvantages), modeled through fault tree analysis (FTA) relying on fuzzy reasoning to count for uncertainty. The proposed methodology is validated retrospectively using ion selective field effect transistor (ISFET) - based biosensors as a case example, and then implemented prospectively membrane biosensors, putting an emphasis on the manufacturability issues. The analysis performed the trajectory of membrane platforms differently than the available market roadmaps that, considering the vast industrial experience in tailoring and handling crystallic forms, suggest the technology path of biomimetic and synthetic materials. The results presented herein indicate that future trajectories lie along with nanotechnology, and especially nanofabrication and nano-bioinformatics, and focused, more on the science-path, that is, on controlling the natural process of self-assembly and the thermodynamics of bioelement-lipid interaction. This retained the nature-derived sensitivity of the biosensor platform, pointing out the differences between the scope of academic research and the market viewpoint.
Modeling collective animal behavior with a cognitive perspective: a methodological framework.
Weitz, Sebastian; Blanco, Stéphane; Fournier, Richard; Gautrais, Jacques; Jost, Christian; Theraulaz, Guy
2012-01-01
The last decades have seen an increasing interest in modeling collective animal behavior. Some studies try to reproduce as accurately as possible the collective dynamics and patterns observed in several animal groups with biologically plausible, individual behavioral rules. The objective is then essentially to demonstrate that the observed collective features may be the result of self-organizing processes involving quite simple individual behaviors. Other studies concentrate on the objective of establishing or enriching links between collective behavior researches and cognitive or physiological ones, which then requires that each individual rule be carefully validated. Here we discuss the methodological consequences of this additional requirement. Using the example of corpse clustering in ants, we first illustrate that it may be impossible to discriminate among alternative individual rules by considering only observational data collected at the group level. Six individual behavioral models are described: They are clearly distinct in terms of individual behaviors, they all reproduce satisfactorily the collective dynamics and distribution patterns observed in experiments, and we show theoretically that it is strictly impossible to discriminate two of these models even in the limit of an infinite amount of data whatever the accuracy level. A set of methodological steps are then listed and discussed as practical ways to partially overcome this problem. They involve complementary experimental protocols specifically designed to address the behavioral rules successively, conserving group-level data for the overall model validation. In this context, we highlight the importance of maintaining a sharp distinction between model enunciation, with explicit references to validated biological concepts, and formal translation of these concepts in terms of quantitative state variables and fittable functional dependences. Illustrative examples are provided of the benefits expected during the often long and difficult process of refining a behavioral model, designing adapted experimental protocols and inversing model parameters.
NASA Technical Reports Server (NTRS)
Baecher, Juergen; Bandte, Oliver; DeLaurentis, Dan; Lewis, Kemper; Sicilia, Jose; Soboleski, Craig
1995-01-01
This report documents the efforts of a Georgia Tech High Speed Civil Transport (HSCT) aerospace student design team in completing a design methodology demonstration under NASA's Advanced Design Program (ADP). Aerodynamic and propulsion analyses are integrated into the synthesis code FLOPS in order to improve its prediction accuracy. Executing the integrated product and process development (IPPD) methodology proposed at the Aerospace Systems Design Laboratory (ASDL), an improved sizing process is described followed by a combined aero-propulsion optimization, where the objective function, average yield per revenue passenger mile ($/RPM), is constrained by flight stability, noise, approach speed, and field length restrictions. Primary goals include successful demonstration of the application of the response surface methodolgy (RSM) to parameter design, introduction to higher fidelity disciplinary analysis than normally feasible at the conceptual and early preliminary level, and investigations of relationships between aerodynamic and propulsion design parameters and their effect on the objective function, $/RPM. A unique approach to aircraft synthesis is developed in which statistical methods, specifically design of experiments and the RSM, are used to more efficiently search the design space for optimum configurations. In particular, two uses of these techniques are demonstrated. First, response model equations are formed which represent complex analysis in the form of a regression polynomial. Next, a second regression equation is constructed, not for modeling purposes, but instead for the purpose of optimization at the system level. Such an optimization problem with the given tools normally would be difficult due to the need for hard connections between the various complex codes involved. The statistical methodology presents an alternative and is demonstrated via an example of aerodynamic modeling and planform optimization for a HSCT.
Leavesley, G.; Hay, L.
1998-01-01
Coupled atmospheric and hydrological models provide an opportunity for the improved management of water resources in headwater basins. Issues currently limiting full implementation of coupled-model methodologies include (a) the degree of uncertainty in the accuracy of precipitation and other meteorological variables simulated by atmospheric models, and (b) the problem of discordant scales between atmospheric and bydrological models. Alternative methodologies being developed to address these issues are reviewed.
The kidney allocation score: methodological problems, moral concerns and unintended consequences.
Hippen, B
2009-07-01
The growing disparity between the demand for and supply of kidneys for transplantation has generated interest in alternative systems of allocating kidneys from deceased donors. This personal viewpoint focuses attention on the Kidney Allocation Score (KAS) proposal promulgated by the UNOS/OPTN Kidney Committee. I identify several methodological and moral flaws in the proposed system, concluding that any iteration of the KAS proposal should be met with more skepticism than sanguinity.
An Educational Methodology for Enhancing Familiarity with United States Air Force Combat Logistics.
1983-09-01
degree of specialization in the supply 5 system and recommended an alternative organizational struc- ture under combat conditions (29:5,9). Two needs...many sub-issues. CLASSROOM MATERIAL SOURCES: Alternatives to standard air base functions should definitely be brought out also, especially the Air Force...C . -’ , - . . . .... .-. -... . or aspects of the subject. The investigative format we use should be flexible enough that we can explore
Organizing for teamwork in healthcare: an alternative to team training?
Rydenfält, Christofer; Odenrick, Per; Larsson, Per Anders
2017-05-15
Purpose The purpose of this paper is to explore how organizational design could support teamwork and to identify organizational design principles that promote successful teamwork. Design/methodology/approach Since traditional team training sessions take resources away from production, the alternative approach pursued here explores the promotion of teamwork by means of organizational design. A wide and pragmatic definition of teamwork is applied: a team is considered to be a group of people that are set to work together on a task, and teamwork is then what they do in relation to their task. The input - process - output model of teamwork provides structure to the investigation. Findings Six teamwork enablers from the healthcare team literature - cohesion, collaboration, communication, conflict resolution, coordination, and leadership - are discussed, and the organizational design measures required to implement them are identified. Three organizational principles are argued to facilitate the teamwork enablers: team stability, occasions for communication, and a participative and adaptive approach to leadership. Research limitations/implications The findings could be used as a foundation for intervention studies to improve team performance or as a framework for evaluation of existing organizations. Practical implications By implementing these organizational principles, it is possible to achieve many of the organizational traits associated with good teamwork. Thus, thoughtful organization for teamwork can be used as an alternative or complement to the traditional team training approach. Originality/value With regards to the vast literature on team training, this paper offers an alternative perspective on how to improve team performance in healthcare.
Schnyer, Rosa N; Allen, John J B
2002-10-01
An important methodological challenge encountered in acupuncture clinical research involves the design of treatment protocols that help ensure standardization and replicability while allowing for the necessary flexibility to tailor treatments to each individual. Manualization of protocols used in clinical trials of acupuncture and other traditionally-based complementary and alternative medicine (CAM) systems facilitates the systematic delivery of replicable and standardized, yet individually-tailored treatments. To facilitate high-quality CAM acupuncture research by outlining a method for the systematic design and implementation of protocols used in CAM clinical trials based on the concept of treatment manualization. A series of treatment manuals was developed to systematically articulate the Chinese medical theoretical and clinical framework for a given Western-defined illness, to increase the quality and consistency of treatment, and to standardize the technical aspects of the protocol. In all, three manuals were developed for National Institutes of Health (NIH)-funded clinical trials of acupuncture for depression, spasticity in cerebral palsy, and repetitive stress injury. In Part I, the rationale underlying these manuals and the challenges encountered in creating them are discussed, and qualitative assessments of their utility are provided. In Part II, a methodology to develop treatment manuals for use in clinical trials is detailed, and examples are given. A treatment manual provides a precise way to train and supervise practitioners, enable evaluation of conformity and competence, facilitate the training process, and increase the ability to identify the active therapeutic ingredients in clinical trials of acupuncture.
NASA Astrophysics Data System (ADS)
Hassan, Moinuddin; Ilev, Ilko
2016-03-01
Ophthalmic Viscosurgical Devices (OVDs) in clinical setting are a major health risk factor for potential endotoxin contamination in the eye, due to their extensive applications in cataract surgery for space creation, stabilization and protection of intraocular tissue and intraocular lens (IOL) during implantation. Endotoxin contamination of OVDs is implicated in toxic anterior syndrome (TASS), a severe complication of cataract surgery that leads to intraocular damage and even blindness. Current standard methods for endotoxin contamination detection utilize rabbit assay or Limulus amoebocyte lysate (LAL) assays. These endotoxin detection strategies are extremely difficult for gel-like type devices such as OVDs. To overcome the endotoxin detection limitations in OVDs, we have developed an alternative optical detection methodology for label-free and real-time sensing of bacterial endotoxin in OVDs, based on fiber-optic Fourier transform infrared (FO-FTIR) transmission spectrometry in the mid-IR spectral range from 2.5 micron to 12 micron. Endotoxin contaminated OVD test samples were prepared by serial dilutions of endotoxins on OVDs. The major results of this study revealed two salient spectral peak shifts (in the regions 2925 to 2890 cm^-1 and 1125 to 1100 cm^-1), which are associated with endotoxin in OVDs. In addition, FO-FTIR experimental results processed using a multivariate analysis confirmed the observed specific peak shifts associated with endotoxin contamination in OVDs. Thus, employing the FO-FTIR sensing methodology integrated with a multivariate analysis could potentially be used as an alternative endotoxin detection technique in OVD.
Alternate approaches to repress endogenous microRNA activity in Arabidopsis thaliana
Wang, Ming-Bo
2011-01-01
MicroRnAs (miRnAs) are an endogenous class of regulatory small RnA (sRnA). in plants, miRnAs are processed from short non-protein-coding messenger RnAs (mRnAs) transcribed from small miRnA genes (MIR genes). Traditionally in the model plant Arabidopsis thaliana (Arabidopsis), the functional analysis of a gene product has relied on the identification of a corresponding T-DnA insertion knockout mutant from a large, randomly-mutagenized population. However, because of the small size of MIR genes and presence of multiple, highly conserved members in most plant miRnA families, it has been extremely laborious and time consuming to obtain a corresponding single or multiple, null mutant plant line. Our recent study published in Molecular Plant1 outlines an alternate method for the functional characterization of miRnA action in Arabidopsis, termed anti-miRnA technology. Using this approach we demonstrated that the expression of individual miRnAs or entire miRnA families, can be readily and efficiently knocked-down. Our approach is in addition to two previously reported methodologies that also allow for the targeted suppression of either individual miRnAs, or all members of a MIR gene family; these include miRnA target mimicry2,3 and transcriptional gene silencing (TGS) of MIR gene promoters.4 All three methodologies rely on endogenous gene regulatory machinery and in this article we provide an overview of these technologies and discuss their strengths and weaknesses in inhibiting the activity of their targeted miRnA(s). PMID:21358288
NASA Astrophysics Data System (ADS)
Kosmowski, Frédéric; Stevenson, James; Campbell, Jeff; Ambel, Alemayehu; Haile Tsegay, Asmelash
2017-10-01
Maintaining permanent coverage of the soil using crop residues is an important and commonly recommended practice in conservation agriculture. Measuring this practice is an essential step in improving knowledge about the adoption and impact of conservation agriculture. Different data collection methods can be implemented to capture the field level crop residue coverage for a given plot, each with its own implication on survey budget, implementation speed and respondent and interviewer burden. In this paper, six alternative methods of crop residue coverage measurement are tested among the same sample of rural households in Ethiopia. The relative accuracy of these methods are compared against a benchmark, the line-transect method. The alternative methods compared against the benchmark include: (i) interviewee (respondent) estimation; (ii) enumerator estimation visiting the field; (iii) interviewee with visual-aid without visiting the field; (iv) enumerator with visual-aid visiting the field; (v) field picture collected with a drone and analyzed with image-processing methods and (vi) satellite picture of the field analyzed with remote sensing methods. Results of the methodological experiment show that survey-based methods tend to underestimate field residue cover. When quantitative data on cover are needed, the best estimates are provided by visual-aid protocols. For categorical analysis (i.e., >30% cover or not), visual-aid protocols and remote sensing methods perform equally well. Among survey-based methods, the strongest correlates of measurement errors are total farm size, field size, distance, and slope. Results deliver a ranking of measurement options that can inform survey practitioners and researchers.
Kosmowski, Frédéric; Stevenson, James; Campbell, Jeff; Ambel, Alemayehu; Haile Tsegay, Asmelash
2017-10-01
Maintaining permanent coverage of the soil using crop residues is an important and commonly recommended practice in conservation agriculture. Measuring this practice is an essential step in improving knowledge about the adoption and impact of conservation agriculture. Different data collection methods can be implemented to capture the field level crop residue coverage for a given plot, each with its own implication on survey budget, implementation speed and respondent and interviewer burden. In this paper, six alternative methods of crop residue coverage measurement are tested among the same sample of rural households in Ethiopia. The relative accuracy of these methods are compared against a benchmark, the line-transect method. The alternative methods compared against the benchmark include: (i) interviewee (respondent) estimation; (ii) enumerator estimation visiting the field; (iii) interviewee with visual-aid without visiting the field; (iv) enumerator with visual-aid visiting the field; (v) field picture collected with a drone and analyzed with image-processing methods and (vi) satellite picture of the field analyzed with remote sensing methods. Results of the methodological experiment show that survey-based methods tend to underestimate field residue cover. When quantitative data on cover are needed, the best estimates are provided by visual-aid protocols. For categorical analysis (i.e., >30% cover or not), visual-aid protocols and remote sensing methods perform equally well. Among survey-based methods, the strongest correlates of measurement errors are total farm size, field size, distance, and slope. Results deliver a ranking of measurement options that can inform survey practitioners and researchers.
The experiential curriculum: an alternate model for anaesthesia education.
Tweed, W A; Donen, N
1994-12-01
The shift to direct entry into residency training from medical school for all graduates will offer new challenges for anaesthesia training programmes. In this paper we argue that it also offers us an opportunity to re-evaluate our current approach to anaesthesia education. Emphasis in the residency programmes should be to provide trainees with clinical experiences and stimulation that will develop the required traditional competencies. It should also cultivate competency in clinical decision-making, intuition and judgement. Our purpose is to generate discussion by proposing an alternate curriculum model, the experiential curriculum. The basic premise is that learning is a process and outcome is to a large extent related to what the learner does. The process begins with an experience that provides for observation and reflection. Integration of the thoughts provides the basis for executing either existing or new actions. In the experiential curriculum residency training and learning are enhanced by documenting and critically evaluating the experiences to which the resident is exposed. Included within such a structured programme are the methodologies of problem-based and evidence-based learning. Faculty development will be required to help the resident pursue these skills of self-evaluation and efficient learning. We believe that incorporation of an experiential curriculum into the residency training programme will achieve the goals listed above and allow maturation of the process of lifelong learning. It will also allow greater achievement of the application of new information to one's practice.
What Synthesis Methodology Should I Use? A Review and Analysis of Approaches to Research Synthesis.
Schick-Makaroff, Kara; MacDonald, Marjorie; Plummer, Marilyn; Burgess, Judy; Neander, Wendy
2016-01-01
When we began this process, we were doctoral students and a faculty member in a research methods course. As students, we were facing a review of the literature for our dissertations. We encountered several different ways of conducting a review but were unable to locate any resources that synthesized all of the various synthesis methodologies. Our purpose is to present a comprehensive overview and assessment of the main approaches to research synthesis. We use 'research synthesis' as a broad overarching term to describe various approaches to combining, integrating, and synthesizing research findings. We conducted an integrative review of the literature to explore the historical, contextual, and evolving nature of research synthesis. We searched five databases, reviewed websites of key organizations, hand-searched several journals, and examined relevant texts from the reference lists of the documents we had already obtained. We identified four broad categories of research synthesis methodology including conventional, quantitative, qualitative, and emerging syntheses. Each of the broad categories was compared to the others on the following: key characteristics, purpose, method, product, context, underlying assumptions, unit of analysis, strengths and limitations, and when to use each approach. The current state of research synthesis reflects significant advancements in emerging synthesis studies that integrate diverse data types and sources. New approaches to research synthesis provide a much broader range of review alternatives available to health and social science students and researchers.
ERIC Educational Resources Information Center
Lauckner, Heidi; Paterson, Margo; Krupa, Terry
2012-01-01
Often, research projects are presented as final products with the methodologies cleanly outlined and little attention paid to the decision-making processes that led to the chosen approach. Limited attention paid to these decision-making processes perpetuates a sense of mystery about qualitative approaches, particularly for new researchers who will…
77 FR 8277 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-14
... approved collection. (2) Title of the Form/Collection: Methodological research to support the redesign of... to take 15-30 minutes, while a cognitive interview for testing alternative methods for measuring...
Express bus-fringe parking planning methodology.
DOT National Transportation Integrated Search
1975-01-01
The conception, calibration, and evaluation of alternative disaggregate behavioral models of the express bus-fringe parking travel choice situation are described. Survey data collected for the Parham Express Service in Richmond, Virginia, are used to...
EMERGY METHODS: VALUABLE INTEGRATED ASSESSMENT TOOLS
NHEERL's Atlantic Ecology Division is investigating emergy methods as tools for integrated assessment in several projects evaluating environmental impacts, policies, and alternatives for remediation and intervention. Emergy accounting is a methodology that provides a quantitative...
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeRosa, C.T.; Choudhury, H.; Schoeny, R.S.
Risk assessment can be thought of as a conceptual approach to bridge the gap between the available data and the ultimate goal of characterizing the risk or hazard associated with a particular environmental problem. To lend consistency to and to promote quality in the process, the US Environmental Protection Agency (EPA) published Guidelines for Risk Assessment of Carcinogenicity, Developmental Toxicity, Germ Cell Mutagenicity and Exposure Assessment, and Risk Assessment of Chemical Mixtures. The guidelines provide a framework for organizing the information, evaluating data, and for carrying out the risk assessment in a scientifically plausible manner. In the absence of sufficientmore » scientific information or when abundant data are available, the guidelines provide alternative methodologies that can be employed in the risk assessment. 4 refs., 3 figs., 2 tabs.« less
[Basic principles and methodological considerations of health economic evaluations].
Loza, Cesar; Castillo-Portilla, Manuel; Rojas, José Luis; Huayanay, Leandro
2011-01-01
Health Economics is an essential instrument for health management, and economic evaluations can be considered as tools assisting the decision-making process for the allocation of resources in health. Currently, economic evaluations are increasingly being used worldwide, thus encouraging evidence-based decision-making and seeking efficient and rational alternatives within the framework of health services activities. In this review, we present an overview and define the basic types of economic evaluations, with emphasis on complete Economic Evaluations (EE). In addition, we review key concepts regarding the perspectives from which EE can be conducted, the types of costs that can be considered, the time horizon, discounting, assessment of uncertainty and decision rules. Finally, we describe concepts about the extrapolation and spread of economic evaluations in health.
On a Chirplet Transform Based Method for Co-channel Voice Separation
NASA Astrophysics Data System (ADS)
Dugnol, B.; Fernández, C.; Galiano, G.; Velasco, J.
We use signal and image theory based algorithms to produce estimations of the number of wolves emitting howls or barks in a given field recording as an individuals counting alternative to the traditional trace collecting methodologies. We proceed in two steps. Firstly, we clean and enhance the signal by using PDE based image processing algorithms applied to the signal spectrogram. Secondly, assuming that the wolves chorus may be modelled as an addition of nonlinear chirps, we use the quadratic energy distribution corresponding to the Chirplet Transform of the signal to produce estimates of the corresponding instantaneous frequencies, chirp-rates and amplitudes at each instant of the recording. We finally establish suitable criteria to decide how such estimates are connected in time.
Barathi, M; Kumar, A Santhana Krishna; Rajesh, N
2014-05-01
In the present work, we propose for the first time a novel ultrasound assisted methodology involving the impregnation of zirconium in a cellulose matrix. Fluoride from aqueous solution interacts with the cellulose hydroxyl groups and the cationic zirconium hydroxide. Ultrasonication ensures a green and quick alternative to the conventional time intensive method of preparation. The effectiveness of this process was confirmed by comprehensive characterization of zirconium impregnated cellulose (ZrIC) adsorbent using Fourier transform infrared spectroscopy (FT-IR), energy dispersive X-ray spectrometry (EDX) and X-ray diffraction (XRD) studies. The study of various adsorption isotherm models, kinetics and thermodynamics of the interaction validated the method. Copyright © 2013 Elsevier B.V. All rights reserved.
Issues surrounding the health economic evaluation of genomic technologies
Buchanan, James; Wordsworth, Sarah; Schuh, Anna
2014-01-01
Aim Genomic interventions could enable improved disease stratification and individually tailored therapies. However, they have had a limited impact on clinical practice to date due to a lack of evidence, particularly economic evidence. This is partly because health economists are yet to reach consensus on whether existing methods are sufficient to evaluate genomic technologies. As different approaches may produce conflicting adoption decisions, clarification is urgently required. This article summarizes the methodological issues associated with conducting economic evaluations of genomic interventions. Materials & methods A structured literature review was conducted to identify references that considered the methodological challenges faced when conducting economic evaluations of genomic interventions. Results Methodological challenges related to the analytical approach included the choice of comparator, perspective and timeframe. Challenges in costing centered around the need to collect a broad range of costs, frequently, in a data-limited environment. Measuring outcomes is problematic as standard measures have limited applicability, however, alternative metrics (e.g., personal utility) are underdeveloped and alternative approaches (e.g., cost–benefit analysis) underused. Effectiveness data quality is weak and challenging to incorporate into standard economic analyses, while little is known about patient and clinician behavior in this context. Comprehensive value of information analyses are likely to be helpful. Conclusion Economic evaluations of genomic technologies present a particular challenge for health economists. New methods may be required to resolve these issues, but the evidence to justify alternative approaches is yet to be produced. This should be the focus of future work in this field. PMID:24236483
Darvishi Cheshmeh Soltani, Reza; Safari, Mahdi
2016-09-01
The improvement of sonocatalytic treatment of real textile wastewater in the presence of MgO nanoparticles was the main goal of the present study. According to our preliminary results, the application of pulse mode of sonication, together with the addition of periodate ions, produced the greatest sonocatalytic activity and consequently, the highest chemical oxygen demand (COD) removal efficiency (73.95%) among all the assessed options. In the following, pulsed sonocatalysis of real textile wastewater in the presence of periodate ions was evaluated response surface methodologically on the basis of central composite design. Accordingly, a high correlation coefficient of 0.95 was attained for the applied statistical strategy to optimize the process. As results, a pulsed sonication time of 141min, MgO dosage of 2.4g/L, solution temperature of 314K and periodate concentration of 0.11M gave the maximum COD removal of about 85%. Under aforementioned operational conditions, the removal of total organic carbon (TOC) was obtained to be 63.34% with the reaction rate constant of 7.1×10(-3)min(-1) based on the pseudo-first order kinetic model (R(2)=0.99). Overall, periodate-assisted pulsed sonocatalysis over MgO nanoparticles can be applied as an efficient alternative process for treating and mineralizing real textile wastewater with good reusability potential. Copyright © 2016 Elsevier B.V. All rights reserved.
Evidence on public policy: methodological issues, political issues and examples.
Attanasio, Orazio P
2014-03-01
In this paper I discuss how evidence on public policy is generated and in particular the issue of evaluation of public policies. In economics, the issue of attribution and the identification of causal links has recently received considerable attention. Important methodological issues have been tackled and new techniques have been proposed and used. Randomized Control Trials have become some sort of gold standard. However, they are not exempt from problems and have important limitations: in some case they cannot be constructed and, more generally, problems of external validity and transferability of results can be important. The paper then moves on to discuss the political economy of policy evaluations for policy evaluations to have an impact for the conduct of actual policy, it is important that the demand for evaluation comes directly from the policy making process and is generated endogenously within it. In this sense it is important that the institutional design of policy making is such that policy making institutions are incentivized to use rigorous evaluation in the process of designing policies and allocating resources to alternative options. Economists are currently involved in the design and evaluation of many policies, including policies about health, nutrition and education. The role they can play in these fields is not completely obvious. The paper argues that their main contribution is in the modelling of how individual reacts to incentives (including those provided by public policies).