Sample records for bioventing optimization test

  1. OPTIMIZING BIOVENTING IN SHALLOW VADOSE ZONES AND COLD CLIMATES

    EPA Science Inventory

    This paper describes a bioventing study design and initial activities applied to a JP-4 jet fuel spill at Eielson Air Force Base, Alaska. The primary objectives of the project were to investigate the feasibility of using bioventing technology to remediate JP-4 jet fuel contaminat...

  2. In situ bioventing at a natural gas dehydrator site: Field demonstration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawrence, A.W.; Miller, D.L.; Miller, J.A.

    1995-12-31

    This paper describes a bioventing/biosparging field demonstration that was conducted over a 10-month period at a former glycol dehydrator site located near Traverse City, Michigan. The goal of the project was to determine the feasibility of this technology for dehydrator site remediation and to develop engineering design concepts for applying bioventing/biosparging at similar sites. The chemicals of interest are benzene, toluene, ethylbenzene, and xylenes (BTEX) and alkanes. Soil sampling indicated that the capillary fringe and saturated zones were heavily contaminated, but that the unsaturated zone was relatively free of the contaminants. A pump-and-treat system has operated since 1991 to treat the groundwater BTEX plume. Bioventing/biosparging was installed in September 1993 to treat the contaminant source area. Three different air sparging operating modes were tested to determine an optimal process configuration for site remediation. These operational modes were compared through in situ respirometry studies. Respirometry measurements were used to estimate biodegradation rates. Dissolved oxygen and carbon dioxide were monitored in the groundwater.
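Respirometry-derived biodegradation rates like those mentioned above are conventionally computed from the oxygen utilization rate observed during an in situ respiration test, using a stoichiometric hydrocarbon-to-oxygen mass ratio (hexane is the common reference). A minimal sketch of that conversion; all parameter values here are illustrative assumptions, not data from this demonstration:

```python
# Sketch: convert an in situ respiration test's O2 utilization rate into a
# hydrocarbon biodegradation rate. Parameter values below are illustrative
# assumptions, not measurements from the Traverse City study.

def biodegradation_rate(k_o2_frac_per_day, air_filled_porosity,
                        bulk_density_kg_m3, o2_density_mg_m3=1.43e6,
                        hc_per_o2=1.0 / 3.5):
    """Biodegradation rate in mg hydrocarbon / kg soil / day.

    k_o2_frac_per_day : O2 volume fraction consumed per day (5 %/day -> 0.05)
    o2_density_mg_m3  : density of O2 gas (~1.43e6 mg/m3 at 0 C, 1 atm)
    hc_per_o2         : hydrocarbon mass degraded per unit mass of O2
                        consumed (~1/3.5 from hexane stoichiometry:
                        C6H14 + 9.5 O2 -> 6 CO2 + 7 H2O)
    """
    air_vol_per_kg_soil = air_filled_porosity / bulk_density_kg_m3  # m3/kg
    o2_consumed = k_o2_frac_per_day * o2_density_mg_m3 * air_vol_per_kg_soil
    return o2_consumed * hc_per_o2

# Example: 5 %O2/day depletion, 30% air-filled porosity, 1600 kg/m3 soil:
rate = biodegradation_rate(0.05, 0.30, 1600.0)  # ~3.8 mg/kg/day
```

Rates on the order of a few mg TPH/kg soil/day, as this gives, are typical of the field bioventing studies collected in these records.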

  3. Bioventing remediation and ecotoxicity evaluation of phenanthrene-contaminated soil.

    PubMed

    García Frutos, F Javier; Escolano, Olga; García, Susana; Babín, Mar; Fernández, M Dolores

    2010-11-15

    The objectives of soil remediation processes are usually based on threshold levels of soil contaminants. However, during remediation processes, changes in bioavailability and metabolite production can occur, making it necessary to incorporate an ecotoxicity assessment to estimate the risk to ecological receptors. The evolution of contaminants and soil ecotoxicity of artificially phenanthrene-contaminated soil (1000 mg/kg soil) during soil treatment through bioventing was studied in this work. Bioventing was performed in glass columns containing 5.5 kg of phenanthrene-contaminated soil and uncontaminated natural soil over a period of 7 months. Optimum conditions of mineralisation (humidity=60% WHC; C/N/P=100:20:1) were determined in a previous work. The evolution of oxygen consumption, carbon dioxide production, phenanthrene concentration and soil toxicity were studied on sacrificed columns at periods of 0, 3 and 7 months. Toxicity to soil and aquatic organisms was determined using a multispecies system in the soil columns (MS-3). In the optimal bioventing treatability test, we obtained a reduction in phenanthrene concentration of more than 93% after 7 months of treatment. The residual toxicity at the end of the treatment was attributed not to the low remaining phenanthrene concentration but to the ammonia used to restore the optimal C/N ratio.

  4. Sustainable Horizontal Bioventing and Vertical Biosparging Implementation (Invited)

    NASA Astrophysics Data System (ADS)

    Leu, J.; Lin, J.; Ferris, S.

    2013-12-01

    A former natural gas processing site with total petroleum hydrocarbons (TPH) and benzene, toluene, ethylbenzene, and xylene (BTEX) impacts in both soil and groundwater was partially excavated to remove 2,400 cubic yards of impacted soil. However, due to active natural gas pipelines within the impacted footprint, excavation was discontinued, leaving in place an area of impacted soil containing maximum concentrations of 5,000 mg/kg gasoline-range organics (GRO), 8,600 mg/kg diesel-range organics (DRO), and 130 mg/kg motor oil-range organics (ORO), along with groundwater impacted at concentrations up to 2,300 μg/L GRO and 4,200 μg/L DRO. Taking advantage of the open excavation, horizontal-screened piping was placed in the backfill to deliver air for bioventing, which resulted in successful remediation of soil in a physically inaccessible area. The combined use of excavation of the source area, bioventing of surrounding inaccessible soil, and biosparging of the groundwater and smear zone brought the site near no-further-action status. The sustainable bioventing system consisted of one 3-HP blower and eight horizontal air injection wells. Five dual-depth nested vapor monitoring points (VMPs) were installed at 5 feet and 10 feet below ground surface as part of the monitoring system for human health and system performance. The bioventing system operated for one year followed by a three-month rebound test. During the one-year operation, air flow was periodically adjusted to maximize removal of volatile organic compounds (VOCs) from the vent wells with elevated photo-ionization detector readings. After the bioventing successfully remediated the inaccessible impacted soil, the biosparging system incorporated the pre-existing bioventing unit with an upgraded 5-HP blower and three vertical biosparging wells to biodegrade dissolved-phase impacts in the groundwater.
The subsequent monitoring system includes the VMPs, the air injection wells, and four groundwater monitoring wells, three of them pre-existing. The system is scheduled to operate for at least one year followed by a three-month rebound test. The flow rate was adjusted between 5 and 10 scfm during operations to focus the biosparging in the impacted area of the site. After the bioventing system was operated and optimized for a year, average VOC concentrations were reduced from approximately 120 to 5 ppmv in the vadose zone. Gasoline-range TPH and BTEX concentrations were reduced by up to 99%. Fugitive VOCs were not detected outside the property boundary or at possible fugitive gas monitoring points. During the rebound test, no significant rebound of VOC concentrations was observed. The average hydrocarbon biodegradation rate was estimated to be approximately 2.5 mg TPH/kg soil/day. During biosparging, the migration of injected air also stimulated biodegradation in the vadose zone. Within six months of operation, the groundwater GRO and DRO concentrations decreased approximately 70% and 50%, respectively, at the monitoring well within the excavation/backfill area. Bioventing followed by biosparging has proven successful in decreasing soil vapor chemicals of concern in the native soil of the inaccessible area and in groundwater of the excavation/backfill area.

  5. FIELD TEST OF NONFUEL HYDROCARBON BIOVENTING IN CLAYEY-SAND SOIL

    EPA Science Inventory

    A pilot-scale bioventing test was conducted at the Greenwood Chemical Superfund Site in Virginia. The characteristics of the site included clayey-sand soils and nonfuel organic contamination such as acetone, toluene, and naphthalene in the vadose zone. Based on the results of an...

  6. Natural Pressure-Driven Passive Bioventing

    DTIC Science & Technology

    2000-09-01

    In preparation for full-scale design of a conventional bioventing system at the PFFA, a bioventing pilot test was conducted in the demonstration area. (The remainder of this excerpt is drawing text from Figure 6, "Site Plan, PFFA," scale 1" = 300'.)

  7. MANUAL: BIOVENTING PRINCIPLES AND PRACTICE VOLUME II. BIOVENTING DESIGN

    EPA Science Inventory

    The results from bioventing research and development efforts and from the pilot-scale bioventing systems have been used to produce this two-volume manual. Although this design manual has been written based on extensive experience with petroleum hydrocarbons (and thus, many exampl...

  8. MANUAL: BIOVENTING PRINCIPLES AND PRACTICE VOLUME I. BIOVENTING PRINCIPLES

    EPA Science Inventory

    Bioventing is the process of aerating soils to stimulate in situ biological activity and promote bioremediation. Bioventing typically is applied in situ to the vadose zone and is applicable to any chemical that can be aerobically biodegraded but to date has primarily been impleme...

  9. Case Study: del Amo Bioventing

    EPA Science Inventory

    The attached presentation discusses the fundamentals of bioventing in the vadose zone. The basics of bioventing are presented. The experience to date with the del Amo Superfund Site is presented as a case study.

  10. BIOVENTING DEVELOPMENT PROGRAM (TREATMENT AND DESTRUCTION BRANCH, LRPCD, NRMRL)

    EPA Science Inventory

    In a continuing effort to develop environment-friendly and cost-effective remediation technologies, the Land Remediation and Pollution Control Division (LRPCD) conducts an aggressive research and development program in bioventing. LRPCD's bioventing program is multi-faceted, with...

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brownlow, D.T.; Escude, S.; Johanneson, O.H.

    The 1500 Area at Kelly Air Force Base (AFB) was the site of a subsurface release of approximately 1,000 gallons of JP-4 jet fuel. Preliminary studies found evidence of hydrocarbon contamination extending from 10 feet below ground surface (bgs) down to the shallow water table, at 20 to 25 feet bgs. In June of 1993, Kelly AFB authorized the installation and evaluation of a bioventing system at this site to aid in the cleanup of the hydrocarbon-contaminated soils. The purpose of the bioventing system is to aerate subsurface soils within and immediately surrounding the release area, in order to stimulate in-situ biological activity and enhance the natural bioremediation capacity of the soil. Supplying oxygen to the indigenous soil microorganisms promotes the aerobic metabolism of fuel hydrocarbons in the soil. In vadose zone soils exhibiting relatively good permeability, bioventing has proven to be a highly cost-effective remediation technology for treating fuel-contaminated soils. In November 1993, a Start-Up Test program consisting of an In-Situ Respiration Test (ISRT) and an Air Permeability Test was performed at the 1500 Area Spill Site.

  12. Investigations into the application of a combination of bioventing and biotrickling filter technologies for soil decontamination processes--a transition regime between bioventing and soil vapour extraction.

    PubMed

    Magalhães, S M C; Ferreira Jorge, R M; Castro, P M L

    2009-10-30

    Bioventing has emerged as one of the most cost-effective in situ technologies available to address petroleum light-hydrocarbon spills, one of the most common sources of soil pollution. However, the major drawback associated with this technology is the extended treatment time often required. The present study aimed to illustrate how an intended air-injection bioventing technology can be transformed into a soil vapour extraction effort when the air flow rates are pushed to a stripping mode, thus leading to the treatment of the off-gas resulting from volatilisation. As such, a combination of an air-injection bioventing system and a biotrickling filter was applied for the treatment of contaminated soil, the latter aiming at the treatment of the emissions resulting from the bioventing process. With a moisture content of 10%, soil contaminated with toluene at two different concentrations, namely 2 and 14 mg g⁻¹ soil, was treated successfully using an air-injection bioventing system at a constant air flow rate of ca. 0.13 dm³ min⁻¹, which led to the removal of ca. 99% toluene after a period of ca. 5 days of treatment. A biotrickling filter was simultaneously used to treat the outlet gas emissions, with average removal efficiencies of ca. 86%. The proposed combination of biotechnologies proved to be an efficient solution for the decontamination process when an excessive air flow rate was applied, reducing both the soil contamination and the outlet gas emissions, whilst being able to reduce the treatment time required by bioventing alone.
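Removal figures like the ca. 99% in ca. 5 days above are often condensed into a first-order rate constant so treatments can be compared; the decay model is an assumption for illustration, not something the abstract reports:

```python
import math

# First-order decay sketch: C(t) = C0 * exp(-k * t).
# Using the ~99% toluene removal over ~5 days cited in the abstract;
# the first-order model itself is an illustrative assumption.
removal, t_days = 0.99, 5.0
k = -math.log(1.0 - removal) / t_days   # rate constant, 1/day (~0.92)
half_life = math.log(2.0) / k           # ~0.75 days
```

On this reading, toluene mass halves roughly every 18 hours once air injection starts, which is consistent with the short treatment times the study emphasizes.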

  13. COMPARISON OF FIELD AEROBIC BIODEGRADATION RATES TO LABORATORY

    EPA Science Inventory

    It is common to use bioventing as a polishing step for soil vapor extraction. It was originally planned to use soil vapor extraction and bioventing at a former landfill site in Delaware but laboratory scale biodegradation studies indicated that most of the volatile organic compou...

  14. MICHIGAN SOIL VAPOR EXTRACTION REMEDIATION (MISER) MODEL: A COMPUTER PROGRAM TO MODEL SOIL VAPOR EXTRACTION AND BIOVENTING OF ORGANIC MATERIALS IN UNSATURATED GEOLOGICAL MATERIAL

    EPA Science Inventory

    This report describes the formulation, numerical development, and use of a multiphase, multicomponent, biodegradation model designed to simulate physical, chemical, and biological interactions occurring primarily in field scale soil vapor extraction (SVE) and bioventing (B...

  15. MICHIGAN SOIL VAPOR EXTRACTION REMEDIATION (MISER) MODEL: A COMPUTER PROGRAM TO MODEL SOIL VAPOR EXTRACTION AND BIOVENTING OF ORGANIC CHEMICALS IN UNSATURATED GEOLOGICAL MATERIAL

    EPA Science Inventory

    Soil vapor extraction (SVE) and bioventing (BV) are proven strategies for remediation of unsaturated zone soils. Mathematical models are powerful tools that can be used to integrate and quantify the interaction of physical, chemical, and biological processes occurring in field sc...

  16. MICHIGAN SOIL VAPOR EXTRACTION REMEDIATION (MISER) MODEL: A COMPUTER PROGRAM TO MODEL SOIL VAPOR EXTRACTION AND BIOVENTING OF ORGANIC MATERIALS IN UNSATURATED GEOLOGICAL MATERIAL (EPA/600/SR-97/099)

    EPA Science Inventory

    Soil vapor extraction (SVE) and bioventing (BV) are proven strategies for remediation of unsaturated zone soils. Mathematical models are powerful tools that can be used to integrate and quantify the interaction of physical, chemical, and biological processes occurring in field sc...

  17. In situ bioremediation of a former natural gas dehydrator site using bioventing/biosparging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shamory, B.D.; Lawrence, A.W.; Miller, D.L.

    1995-12-01

    The Gas Research Institute (GRI) is conducting a research program on site remediation and residuals management for natural gas exploration and production (E&P) activities. Biological processes are considered to be a key component of the GRI remedial strategy since most of the chemicals-of-interest in soils and groundwater at E&P sites have been reported to be biodegradable. A bioventing/biosparging field demonstration was conducted over a ten-month period at a former glycol dehydrator site located near Traverse City, Michigan. The chemicals-of-interest at this site were benzene, toluene, ethylbenzene, and xylenes, and alkanes (primarily C₄ through C₁₀). The goal of the project was to determine the feasibility of using this technology for dehydrator site remediation and to develop an engineering basis of design concepts for applying bioventing/biosparging at other similar sites. Three different air sparging operational modes (pulsed, continuous, and offgas recycle) were tested to determine the optimum process configuration for site remediation. Biodegradation was also evaluated. Operational mode performance was evaluated by conducting in situ respirometry studies. Depletion of oxygen and hydrocarbons and production of carbon dioxide were used to calculate biodegradation rates in the vadose and saturated zones. The mass of hydrocarbons biologically degraded was estimated based on these biokinetic rates. Biodegradation was also estimated based on contaminant removal shown by analytical sampling of soil and groundwater and on other losses attributed to the pump-and-treat and soil vapor extraction systems. In addition, an engineering evaluation of the operating modes is presented.
The results of this study suggest that bioventing/biosparging is a feasible technology for in situ remediation of soil and groundwater at gas industry glycol dehydrator sites and that the pulsed operating mode may have an advantage over the other modes.

  18. Final Treatability Study in Support of Remediation by Natural Attenuation Site FT-1 at Fairchild Air Force Base, Spokane, Washington

    DTIC Science & Technology

    1997-10-01

    and xylene (BTEX) in the shallow groundwater system at the site. Dissolved chlorinated aliphatic hydrocarbons (CAHs) also are present in the shallow... micrograms per liter (µg/L)], RNA with LTM should be used to complement the ROD-mandated bioventing and air sparging systems. ... When bioventing and... The ROD identifies benzene as the primary contaminant of concern (COC) for FT-1 and specifies the use of air sparging in the remediation system

  19. APPLICATION, PERFORMANCE, AND COSTS OF ...

    EPA Pesticide Factsheets

    A critical review of biological treatment processes for remediation of contaminated soils is presented. The focus of the review is on documented cost and performance of biological treatment technologies demonstrated at full- or field-scale. Some of the data were generated by the U.S. Environmental Protection Agency's (EPA's) Bioremediation in the Field Program, jointly supported by EPA's Office of Research and Development, EPA's Office of Solid Waste and Emergency Response, and the EPA Regions through the Superfund Innovative Technology Evaluation (SITE) Program. Military sites proved to be another fertile data source. Technologies reviewed in this report include both ex-situ processes (land treatment, biopile/biocell treatment, composting, and bioslurry reactor treatment) and in-situ alternatives (conventional bioventing, enhanced or cometabolic bioventing, anaerobic bioventing, bioslurping, phytoremediation, and natural attenuation). Targeted soil contaminants at the documented sites were primarily organic chemicals, including BTEX, petroleum hydrocarbons, polycyclic aromatic hydrocarbons (PAHs), chlorinated aliphatic hydrocarbons (CAHs), organic solvents, polychlorinated biphenyls (PCBs), pesticides, dioxin, and energetics. The advantages, limitations, and major cost drivers for each technology are discussed. Box and whisker plots are used to summarize before and after concentrations of important contaminant groups for those technologies consider...

  20. Monitoring biodegradation of diesel fuel in bioventing processes using in situ respiration rate.

    PubMed

    Lee, T H; Byun, I G; Kim, Y O; Hwang, I S; Park, T J

    2006-01-01

    An in situ measuring system of respiration rate was applied for monitoring biodegradation of diesel fuel in a bioventing process for bioremediation of diesel-contaminated soil. Two laboratory-scale soil columns were packed with 5 kg of soil that was artificially contaminated with diesel fuel to a final TPH (total petroleum hydrocarbon) concentration of 8,000 mg/kg soil. Nutrients were added to give a relative concentration of C:N:P = 100:10:1. One soil column was operated in continuous venting mode, and the other in intermittent (6 h venting/6 h rest) venting mode. An on-line O2 and CO2 gas measuring system was used to measure O2 utilisation and CO2 production during biodegradation of diesel for 5 months. The biodegradation rate of TPH was calculated from the respiration rate measured by the on-line gas measuring system. There were no apparent differences between the biodegradation rates calculated for the two columns with different venting modes. The variation of biodegradation rates corresponded well with the trend of the remaining TPH concentrations when compared with other biodegradation indicators, such as the C17/pristane and C18/phytane ratios, dehydrogenase activity, and the ratio of hydrocarbon-utilising bacteria to total heterotrophic bacteria. These results suggest that the on-line respiration rate measuring system can be applied to monitor biodegradation rates and to determine the potential applicability of a bioventing process for bioremediation of oil-contaminated soil.
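The respiration rate such an on-line system reports is essentially the slope of the O2 readings over time during a no-flow period, usually obtained by least squares. A minimal sketch with synthetic readings (the numbers are illustrative, not data from this study):

```python
# Sketch: estimate the O2 utilisation rate (% O2 per day) as the
# least-squares slope of soil-gas O2 readings taken during a rest
# (no-venting) period. The readings below are synthetic.

readings = [(0.0, 20.9), (0.5, 19.6), (1.0, 18.4),
            (1.5, 17.1), (2.0, 15.9)]          # (time in days, % O2)

n = len(readings)
mean_t = sum(t for t, _ in readings) / n
mean_o2 = sum(o for _, o in readings) / n

# Ordinary least-squares slope of %O2 versus time.
slope = sum((t - mean_t) * (o - mean_o2) for t, o in readings) / \
        sum((t - mean_t) ** 2 for t, _ in readings)

k_o2 = -slope   # O2 utilisation rate, % O2 per day (positive = consumption)
```

Converting this %/day figure into a mass-based TPH biodegradation rate then requires the soil's air-filled porosity, bulk density, and a hydrocarbon/oxygen stoichiometric ratio.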

  21. Pilot-scale studies of soil vapor extraction and bioventing for remediation of a gasoline spill at Cameron Station, Alexandria, Virginia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, W.; Joss, C.J.; Martino, L.E.

    Approximately 10,000 gal of spilled gasoline and unknown amounts of trichloroethylene and benzene were discovered at the US Army's Cameron Station facility. Because the base is to be closed and turned over to the city of Alexandria in 1995, the Army sought the most rapid and cost-effective means of spill remediation. At the request of the Baltimore District of the US Army Corps of Engineers, Argonne conducted a pilot-scale study to determine the feasibility of vapor extraction and bioventing for resolving remediation problems and to critique a private firm's vapor-extraction design. Argonne staff, working with academic and private-sector participants, designed and implemented a new systems approach to sampling, analysis, and risk assessment. The US Geological Survey's AIRFLOW model was adapted for the study to simulate the performance of possible remediation designs. A commercial vapor-extraction machine was used to remove nearly 500 gal of gasoline from Argonne-installed horizontal wells. By incorporating numerous design comments from the Argonne project team, field personnel improved the system's performance. Argonne staff also determined that bioventing stimulated indigenous bacteria to bioremediate the gasoline spill. The Corps of Engineers will use Argonne's pilot-study approach to evaluate remediation systems at field operation sites in several states.

  22. BIOREMEDIATION TRAINING

    EPA Science Inventory

    Bioremediation encompasses a collection of technologies which use microbes to degrade or transform contaminants. Three technologies have an established track record of acceptable performance: aerobic bioventing for fuels; enhanced reductive dechlorination for chlorinated solvent...

  23. Monitoring of Gasoline-ethanol Degradation In Undisturbed Soil

    NASA Astrophysics Data System (ADS)

    Österreicher-Cunha, P.; Nunes, C. M. F.; Vargas, E. A.; Guimarães, J. R. D.; Costa, A.

    Environmental contamination problems are greatly emphasised nowadays because of the direct threat they represent for human health. Traditional remediation methods frequently present low efficiency and high costs; therefore, biological treatment is being considered as an accessible and efficient alternative for soil and water remediation. Bioventing, commonly used to remediate petroleum hydrocarbon spills, stimulates the degradation capacity of indigenous microorganisms by providing better subsurface oxygenation. In Brazil, gasoline and ethanol are mixed (78:22 v/v); some authors indicate that despite gasoline's high degradability, its degradation in the subsurface is hindered by the presence of the much more rapidly degraded ethanol. Contaminant distribution and degradation in the subsurface can be monitored by several physical, chemical and microbiological methodologies. This study aims to evaluate and follow the degradation of a gasoline-ethanol mixture in a residual undisturbed tropical soil from Rio de Janeiro. Bioventing was used to enhance microbial degradation. Shifts in bacterial culturable populations due to contamination and treatment effects were followed by conventional microbiology methods. Ground Penetrating Radar (GPR) measurements, which consist of the emission of electro-magnetic waves into the soil, yield a visualisation of contaminant degradation because of changes in soil conductivity due to microbial action on the pollutants. Chemical analyses will measure contaminant residue in soil. Our results disclosed contamination impact as well as bioventing stimulation on soil culturable heterotrophic bacterial populations. This multidisciplinary approach allows for a wider evaluation of processes occurring in soil.

  24. BIOREMEDIATION OF PETROLEUM HYDROCARBONS: A FLEXIBLE VARIABLE SPEED TECHNOLOGY

    EPA Science Inventory

    The bioremediation of petroleum hydrocarbons has evolved into a number of different processes. These processes include in-situ aquifer bioremediation, bioventing, biosparging, passive bioremediation with oxygen release compounds, and intrinsic bioremediation. Although often viewe...

  25. IN SITU BIOREMEDIATION STRATEGIES FOR ORGANIC WOOD PRESERVATIVES

    EPA Science Inventory

    Laboratory biotreatability studies evaluated the use of bioventing and biosparging plus groundwater circulation (UVB technology) for their potential abililty to treat soil and groundwater containing creosote and pentachlorophenol. Soils from two former wood-treatment facilities w...

  26. BIOVENTING OF CHLORINATED SOLVENTS FOR GROUND-WATER CLEANUP THROUGH BIOREMEDIATION

    EPA Science Inventory

    Chlorinated solvents such as tetrachloroethylene, trichloroethylene, carbon tetrachloride, chloroform, 1,2-dichloroethane, and dichloromethane (methylene chloride) can exist in contaminated subsurface material as (1) the neat oil, (2) a component of a mixed oily waste, (3) a solu...

  27. VAPOR PHASE TREATMENT OF PCE IN A SOIL COLUMN BY LAB-SCALE ANAEROBIC BIOVENTING

    EPA Science Inventory

    Microbial destruction of highly chlorinated organic compounds must be initiated by anaerobic followed by aerobic dechlorination. In-situ dechlorination of vadose zone soil contaminated with these compounds requires, among other factors, the establishment of highly reductive anaer...

  28. APPLICATION STRATEGIES AND DESIGN CRITERIA FOR IN SITU BIOREMEDIATION OF SOIL AND GROUNDWATER IMPACTED BY PAHS

    EPA Science Inventory

    Biotreatability studies conducted in our laboratory used soils from two former wood-treatment facilities to evaluate the use of in situ bioventing and biosparging applications for their potential ability to remediate soil and groundwater containing creosote. The combination of ph...

  29. Toluene removal from sandy soils via in situ technologies with an emphasis on factors influencing soil vapor extraction.

    PubMed

    Amin, Mohammad Mehdi; Hatamipour, Mohammad Sadegh; Momenbeik, Fariborz; Nourmoradi, Heshmatollah; Farhadkhani, Marzieh; Mohammadi-Moghadam, Fazel

    2014-01-01

    The integration of bioventing (BV) and soil vapor extraction (SVE) appears to be an effective combination method for soil decontamination. This paper serves two main purposes: it evaluates the effects of soil water content (SWC) and air flow rate on SVE and it investigates the transition regime between BV and SVE for toluene removal from sandy soils. Ninety-six hours after air injection, more than 97% removal efficiency was achieved in all five SVE experiments, which covered SWC values of 5, 10, and 15% and air flow rates of 250 and 500 mL/min. The highest removal efficiency (>99.5%) of toluene was obtained by the combination of BV and SVE (AIBV: Air Injection Bioventing) after 96 h of air injection at a constant flow rate of 250 mL/min. It was found that AIBV has the highest efficiency for toluene removal from sandy soils and can remediate the vadose zone effectively to meet the soil guideline values for protection of groundwater.

  30. Toluene Removal from Sandy Soils via In Situ Technologies with an Emphasis on Factors Influencing Soil Vapor Extraction

    PubMed Central

    Amin, Mohammad Mehdi; Hatamipour, Mohammad Sadegh; Nourmoradi, Heshmatollah; Farhadkhani, Marzieh; Mohammadi-Moghadam, Fazel

    2014-01-01

    The integration of bioventing (BV) and soil vapor extraction (SVE) appears to be an effective combination method for soil decontamination. This paper serves two main purposes: it evaluates the effects of soil water content (SWC) and air flow rate on SVE and it investigates the transition regime between BV and SVE for toluene removal from sandy soils. Ninety-six hours after air injection, more than 97% removal efficiency was achieved in all five SVE experiments, which covered SWC values of 5, 10, and 15% and air flow rates of 250 and 500 mL/min. The highest removal efficiency (>99.5%) of toluene was obtained by the combination of BV and SVE (AIBV: Air Injection Bioventing) after 96 h of air injection at a constant flow rate of 250 mL/min. It was found that AIBV has the highest efficiency for toluene removal from sandy soils and can remediate the vadose zone effectively to meet the soil guideline values for protection of groundwater. PMID:24587723

  31. REMEDIATION OF A MAJOR JET FUEL SPILL BY BIOSLURPER AND NATURAL BIOVENTING TECHNOLOGY ON AN ISLAND AIR BASE

    EPA Science Inventory

    The Indian Ocean island of Diego Garcia has served as a base for B-52 bombers. In 1991 an underground transfer pipeline fracture was discovered after a spill exceeding 200,000 gallons occurred. The hydrogeology is fresh water at less than ten feet down overlying more dense salt...

  32. NATURAL ATTENUATION OF FUEL AND SOLVENT SPILLS ON AIR FORCE BASES: BIOSLURPING AND NATURAL BIOVENTING TO REMEDIATE A JET FUEL SPILL. EVALUATE PERFORMANCE OF NEW PUSH PROBES TO ASSAY FOR BIOREMEDIATION

    EPA Science Inventory

    Frequently both the subsurface vadose zone and underlying aquifer at Air Force Base spill locations are contaminated with fuel hydrocarbons such as benzene and degreasing solvents such as trichloroethene. In many instances these concentrations exceed regulatory limits mandated by...

  33. Field Demonstration of Rhizosphere-Enhanced Treatment of Organics-Contaminated Soils on Native American Lands with Application to Northern FUD Sites

    DTIC Science & Technology

    2004-11-01

    Phytoremediation ... 4.4.3 Bioventing and Biosparging... Rhizosphere-enhanced remediation is a developing technology. It is a subset of phytoremediation, a term often used in a broad sense and sometimes... inappropriately or too generally, because phytoremediation encompasses a wide range of processes. The operative process in phytoremediation depends largely on

  34. ESTCP Cost and Performance Report: Field Demonstration of Rhizosphere-Enhanced Treatment of Organics-Contaminated Soils on Native American Lands with Application to Northern FUD Sites

    DTIC Science & Technology

    2004-06-01

    Phytoremediation ... 4.4.3 Bioventing and Biosparging... remediation is a developing technology. It is a subset of phytoremediation, a term that is often used in a broad sense, and sometimes used... inappropriately or too generally, because phytoremediation encompasses a wide range of processes. The operative process in phytoremediation depends largely on the

  35. Phase 1 remediation of jet fuel contaminated soil and groundwater at JFK International Airport using dual phase extraction and bioventing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roth, R.; Bianco, P.; Rizzo, M.; Pressly, N.

    1995-12-31

    Soil and groundwater contaminated with jet fuel at Terminal One of JFK International Airport in New York have been remediated using dual phase extraction (DPE) and bioventing. Two areas were remediated using 51 DPE wells and 20 air sparging/air injection wells. The total area remediated by the DPE wells is estimated to be 4.8 acres. Groundwater was extracted to recover nonaqueous-phase and aqueous-phase jet fuel from the shallow aquifer and treated above ground by the following processes: oil/water separation, iron oxidation, flocculation, sedimentation, filtration, air stripping, and liquid-phase granular activated carbon (LPGAC) adsorption. The extracted vapors were treated by vapor-phase granular activated carbon (VPGAC) adsorption in one area, and by catalytic oxidation and VPGAC adsorption in another area. After 6 months of remediation, approximately 5,490 lbs. of volatile organic compounds (VOCs) were removed by soil vapor extraction (SVE), 109,650 lbs. of petroleum hydrocarbons were removed from the extracted groundwater, and 60,550 lbs. of petroleum hydrocarbons were biologically oxidized by subsurface microorganisms. Of these three mechanisms, the rate of petroleum hydrocarbon removal was highest for biological oxidation in one area and for groundwater extraction in another area.

  16. Engineering and Design: Soil Vapor Extraction and Bioventing

    DTIC Science & Technology

    2002-06-03

    and Basile 1992). The most notable success of steam injection for remediation has been the Southern California Edison wood treating site in Visalia... pesticides and dioxins. Removal efficiencies using ISTD are typically very high, and since this technology relies on conduction of heat through the soil...Aroclor - 1242 c Pesticides Chlordane c Dioxins/furans 2,3,7,8-Tetrachlorodibenzo-p-dioxin c Organic cyanides c Organic corrosives c Explosives 2,4,6

  17. Analysis of Soil Vapor Extraction Expenses to Estimate Bioventing Expenses

    DTIC Science & Technology

    1995-11-01

    Performance and Cost Summary. Brooks Air Force Base, Texas, July 1994. 2. Atlas, Ronald M., and Richard Bartha. Microbial Ecology: Fundamentals and...and straight-chain alkanes is highly dependent on molecular weight (carbon chain length) and the degree of branching. The book "Microbial Ecology"...must first be the presence of lower-molecular-weight aromatics (Heitkamp and Cerniglia 1988). The "Microbial Ecology" book also points out, on page

  18. Venting test analysis using Jacob's approximation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Edwards, K.B.

    1996-03-01

    There are many sites contaminated by volatile organic compounds (VOCs) in the US and worldwide. Several technologies are available for remediation of these sites, including excavation, pump and treat, biological treatment, air sparging, steam injection, bioventing, and soil vapor extraction (SVE). SVE is also known as soil venting or vacuum extraction. Field venting tests were conducted in alluvial sands residing between the water table and a clay layer. Flow rate, barometric pressure, and well-pressure data were recorded using pressure transmitters and a personal computer. Data were logged as frequently as every second during periods of rapid change in pressure. Tests were conducted at various extraction rates. The data from several tests were analyzed concurrently by normalizing the well pressures with respect to extraction rate. The normalized pressures vary logarithmically with time and fall on one line, allowing a single match of the Jacob approximation to all tests. Though the Jacob approximation was originally developed for hydraulic pump test analysis, it is now commonly used for venting test analysis. Only recently, however, has it been used to analyze several transient tests simultaneously. For the field venting tests conducted in the alluvial sands, the air permeability and effective porosity determined from the concurrent analysis are 8.2 × 10⁻⁷ cm² and 20%, respectively.
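    The concurrent straight-line analysis described in this record, fitting normalized pressures against the logarithm of time, can be sketched as follows. This is a minimal illustration with synthetic data; the function name and the numbers are hypothetical, not taken from the study.

```python
import numpy as np

def jacob_fit(t, s_norm):
    """Fit the Jacob straight-line approximation: normalized drawdown
    varies linearly with ln(t). Returns (slope, intercept)."""
    A = np.vstack([np.log(t), np.ones_like(t)]).T
    slope, intercept = np.linalg.lstsq(A, s_norm, rcond=None)[0]
    return slope, intercept

# Synthetic normalized-pressure data following s = 0.8*ln(t) + 0.3;
# pooling several rate-normalized tests this way yields one line to match.
t = np.array([10.0, 30.0, 100.0, 300.0, 1000.0])
s = 0.8 * np.log(t) + 0.3
slope, intercept = jacob_fit(t, s)
```

    In a real analysis the fitted slope feeds the air-permeability estimate and the intercept feeds the effective-porosity estimate.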

  19. In Situ Warming and Soil Venting to Enhance the Biodegradation of JP-4 in Cold Climates: A Critical Study and Analysis

    DTIC Science & Technology

    1995-12-01

    1178-1180 (1991). Atlas, Ronald M., and Richard Bartha. Microbial Ecology: Fundamentals and Applications. 3d ed. Redwood City, CA: The Benjamin/Cummings...technique called bioventing. In cold climates, in situ bioremediation is limited to the summer, when soil temperatures are sufficient to support microbial...actively warmed the soil -- warm water circulation and heat tape; the other passively warmed the plot with insulatory covers. Microbial respiration (O2

  20. Surfactant-aided recovery/in situ bioremediation for oil-contaminated sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ducreaux, J.; Baviere, M.; Seabra, P.

    1995-12-31

    Bioremediation has been the most commonly used method for in situ cleaning of soils contaminated with low-volatility petroleum products such as diesel oil. However, whatever the process (bioventing, bioleaching, etc.), it is a time-consuming technique whose efficiency may be limited both by accessibility and by too-high concentrations of contaminants. A process currently under development aims at quickly recovering part of the residual oil in the vadose and capillary zones by surfactant flushing, then activating in situ biodegradation of the remaining oil in the presence of the same or other surfactants. The process has been tested in laboratory columns and in an experimental pool located at the Institut Franco-Allemand de Recherche sur l'Environnement (IFARE) in Strasbourg, France. Laboratory column studies were carried out to fit the physico-chemical and hydraulic parameters of the process to the field conditions. The possibility of recovering more than 80% of the oil in the flushing step was shown. For the biodegradation step, forced aeration as a mode of oxygen supply, coupled with nutrient injection aided by surfactants, was tested.

  1. Estimation of rates of aerobic hydrocarbon biodegradation by simulation of gas transport in the unsaturated zone

    USGS Publications Warehouse

    Lahvis, Matthew A.; Baehr, Arthur L.

    1996-01-01

    The distribution of oxygen and carbon dioxide gases in the unsaturated zone provides a geochemical signature of aerobic hydrocarbon degradation at petroleum product spill sites. The fluxes of these gases are proportional to the rate of aerobic biodegradation and are quantified by calibrating a mathematical transport model to the oxygen and carbon dioxide gas concentration data. Reaction stoichiometry is assumed to convert the gas fluxes to a corresponding rate of hydrocarbon degradation. The method is applied at a gasoline spill site in Galloway Township, New Jersey, to determine the rate of aerobic degradation of hydrocarbons associated with passive and bioventing remediation field experiments. At the site, microbial degradation of hydrocarbons near the water table limits the migration of hydrocarbon solutes in groundwater and prevents hydrocarbon volatilization into the unsaturated zone. In the passive remediation experiment a site-wide degradation rate estimate of 34,400 g yr−1 (11.7 gal. yr−1) of hydrocarbon was obtained by model calibration to carbon dioxide gas concentration data collected in December 1989. In the bioventing experiment, degradation rate estimates of 46.0 and 47.9 g m−2 yr−1(1.45 × 10−3 and 1.51 × 10−3 gal. ft.−2yr−1) of hydrocarbon were obtained by model calibration to oxygen and carbon dioxide gas concentration data, respectively. Method application was successful in quantifying the significance of a naturally occurring process that can effectively contribute to plume stabilization.
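    The stoichiometric conversion from a gas flux to a hydrocarbon degradation rate can be illustrated with a minimal sketch. The CH2 pseudo-compound stoichiometry and the oxygen flux value below are assumptions chosen for illustration, not the paper's calibration.

```python
# Convert an oxygen flux into an equivalent hydrocarbon degradation rate
# using the simplified stoichiometry CH2 + 1.5 O2 -> CO2 + H2O.
M_CH2 = 14.0  # g/mol per CH2 unit of hydrocarbon
M_O2 = 32.0   # g/mol

def hydrocarbon_rate_from_o2(o2_flux_g_m2_yr, o2_per_ch2=1.5):
    """Return g of hydrocarbon (as CH2) degraded per m2 per yr."""
    mol_o2 = o2_flux_g_m2_yr / M_O2
    mol_ch2 = mol_o2 / o2_per_ch2
    return mol_ch2 * M_CH2

rate = hydrocarbon_rate_from_o2(160.0)  # hypothetical O2 flux, g m-2 yr-1
```

    With this assumed stoichiometry, an oxygen flux of 160 g m⁻² yr⁻¹ corresponds to roughly 47 g m⁻² yr⁻¹ of hydrocarbon, the same order as the rates reported in the record.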

  2. Immunological techniques as tools to characterize the subsurface microbial community at a trichloroethylene contaminated site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fliermans, C.B.; Dougherty, J.M.; Franck, M.M.

    Effective in situ bioremediation strategies require an understanding of the effects pollutants and remediation techniques have on subsurface microbial communities. Therefore, detailed characterization of a site's microbial communities is important. Subsurface sediment borings and water samples were collected from a trichloroethylene (TCE) contaminated site, before and after horizontal well in situ air stripping and bioventing, as well as during methane injection for stimulation of methane-utilizing microorganisms. Subsamples were processed for heterotrophic plate counts, acridine orange direct counts (AODC), community diversity, direct fluorescent antibody (DFA) enumeration for several nitrogen-transforming bacteria, and Biolog® evaluation of enzyme activity in collected water samples. Plate counts were higher in near-surface depths than in the vadose zone sediment samples. During the in situ air stripping and bioventing, counts increased at or near the saturated zone and remained elevated throughout the aquifer, but did not change significantly after the air stripping. Sporadic increases in plate counts at different depths, as well as increased diversity, appeared to be linked to differing lithologies. AODCs were orders of magnitude higher than plate counts and remained relatively constant with depth, except for slight increases near the surface depths and the capillary fringe. Nitrogen-transforming bacteria, as measured by serospecific DFA, were greatly affected both by the in situ air stripping and by the methane injection. Biolog® activity appeared to increase with subsurface stimulation both by air and by methane. The complexity of subsurface systems makes the use of selective monitoring tools imperative.

  3. Enhanced Remediation of Toluene in the Vadose Zone via a Nitrate-Rich Nutrient Solution: Field Study

    NASA Astrophysics Data System (ADS)

    Tindall, J. A.; Friedel, M. J.

    2003-12-01

    The objective of this study was to test the effectiveness of nitrate-rich nutrient solutions and hydrogen peroxide (H2O2) to enhance in-situ microbial remediation of toluene. Three sand-filled plots (2 m2 surface area and 1.5 m deep) were tested in three phases (each phase lasting approximately 2 weeks). During each phase, toluene (21.6 mol as an emulsion in 50 L of water) was applied uniformly via sprinkler irrigation. Passive remediation was allowed to occur during the first (control) phase. A nutrient solution (modified Hoagland), concentrated in 40 L of water, was tested during the second phase. The final phase involved addition of 230 moles of H2O2 in 50 L of water to increase the available oxygen needed for aerobic biodegradation. During the first phase, toluene concentrations in soil gas were reduced from 120 ppm to 25 ppm in 14 days. After the addition of nutrients during the second phase, concentrations were reduced from 90 ppm to about 8 ppm within 14 days, and for the third phase (H2O2), toluene concentrations were about 1 ppm after only five days. Initial results suggest that this method could be an effective means of remediating a contaminated site, directly after a BTEX spill, without the intrusiveness and high cost of other abatement technologies such as bioventing and soil vapor extraction. However, further tests need to be completed to determine the effect of each of the BTEX components.

  4. Part 2: A field study of enhanced remediation of Toluene in the vadose zone using a nutrient solution

    USGS Publications Warehouse

    Tindall, J.A.; Weeks, E.P.; Friedel, M.

    2005-01-01

    The objective of this study was to test the effectiveness of a nitrate-rich nutrient solution and hydrogen peroxide (H2O2) to enhance in-situ microbial remediation of toluene in the unsaturated zone. Three sand-filled plots were tested in three phases (each phase lasting approximately 2 weeks). During the control phase, toluene was applied uniformly via sprinkler irrigation. Passive remediation was allowed to occur during this phase. A modified Hoagland nutrient solution, concentrated in 150 L of water, was tested during the second phase. The final phase involved addition of 230 moles of H2O2 in 150 L of water to increase the available oxygen needed for aerobic biodegradation. During the first phase, measured toluene concentrations in soil gas were reduced from 120 ppm to 25 ppm in 14 days. After the addition of nutrients during the second phase, concentrations were reduced from 90 ppm to about 8 ppm within 14 days, and for the third phase (H2O2), toluene concentrations were about 1 ppm after only 5 days. Initial results suggest that this method could be an effective means of remediating a contaminated site, directly after a BTEX spill, without the intrusiveness and high cost of other abatement technologies such as bioventing or soil-vapor extraction. However, further tests need to be completed to determine the effect of each of the BTEX components. © Springer 2005.
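    The concentration reductions reported in this record imply first-order decay constants that can be back-calculated directly. A minimal sketch follows; the first-order assumption is ours for illustration, not the authors' model.

```python
import math

def first_order_k(c0, c, days):
    """First-order decay constant (1/day) implied by c0 -> c over `days`."""
    return math.log(c0 / c) / days

k_passive  = first_order_k(120.0, 25.0, 14)  # control (passive) phase
k_nutrient = first_order_k(90.0, 8.0, 14)    # nutrient phase
```

    The nutrient phase shows a markedly larger implied rate constant than the passive phase, consistent with the study's conclusion that the amendments accelerated degradation.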

  5. Bioremediation of Petroleum and Radiological Contaminated Soils at the Savannah River Site: Laboratory to Field Scale Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BRIGMON, ROBINL.

    In the process of Savannah River Site (SRS) operations, limited amounts of waste are generated containing petroleum and radiologically contaminated soils. Currently, this combination of radiological and petroleum contaminated waste does not have an immediate disposal route and is being stored in low activity vaults. SRS developed and implemented a successful plan for cleanup of the petroleum portion of the soils in situ using simple, inexpensive bioreactor technology. Treatment in a bioreactor removes the petroleum contamination from the soil without spreading radiological contamination to the environment. This bioreactor uses the bioventing process and bioaugmentation, the addition of select hydrocarbon-degrading bacteria. Oxygen is usually the initial rate-limiting factor in the biodegradation of petroleum hydrocarbons. Using the bioventing process allowed control of the supply of nutrients and moisture based on petroleum contamination concentrations and soil type. The results of this work have proven to be a safe and cost-effective means of cleaning up low-level radiological and petroleum-contaminated soil. Many of the other elements of the bioreactor design were developed or enhanced during the demonstration of a "biopile" to treat the soils beneath a Polish oil refinery's waste disposal lagoons. Aerobic microorganisms were isolated from the aged refinery's acidic sludge contaminated with polycyclic aromatic hydrocarbons (PAHs). Twelve hydrocarbon-degrading bacteria were isolated from the sludge. The predominant PAH degraders were tentatively identified as Achromobacter, Pseudomonas, Burkholderia, and Sphingomonas spp. Several Ralstonia spp. were also isolated that produce biosurfactants. Biosurfactants can enhance bioremediation by increasing the bioavailability of hydrophobic contaminants, including hydrocarbons. The results indicated that the diversity of acid-tolerant PAH-degrading microorganisms in acidic oil wastes may be much greater than previously demonstrated, and they have numerous applications to environmental restoration. Twelve of the isolates were subsequently added to the bioreactor to enhance bioremediation. In this study we showed that a bioreactor could be bioaugmented with select bacteria to enhance bioremediation of petroleum-contaminated soils under radiological conditions.

  6. Enhancement of the microbial community biomass and diversity during air sparging bioremediation of a soil highly contaminated with kerosene and BTEX.

    PubMed

    Kabelitz, Nadja; Machackova, Jirina; Imfeld, Gwenaël; Brennerova, Maria; Pieper, Dietmar H; Heipieper, Hermann J; Junca, Howard

    2009-03-01

    In order to obtain insights into the complexity shifts taking place in natural microbial communities under strong selective pressure, soils from a former air force base in the Czech Republic, highly contaminated with jet fuel and at different stages of a bioremediation air sparging treatment, were analyzed. By tracking phospholipid fatty acids and 16S rRNA genes, a detailed monitoring of the changes in quantities and composition of the microbial communities developed at different stages of the bioventing treatment was performed. Depending on the length of the air sparging treatment, which led to a significant reduction in the contamination level, we observed a clear shift from a soil microbial community dominated by Pseudomonads under the harsh conditions of high aromatic contamination to a status of low aromatic concentrations, increased biomass content, and a complex composition with diverse bacterial taxonomical branches.

  7. Soil vapor extraction and bioventing: Applications, limitations, and future research directions

    NASA Astrophysics Data System (ADS)

    Rathfelder, K.; Lang, J. R.; Abriola, L. M.

    1995-07-01

    Soil vapor extraction (SVE) has evolved over the past decade as an attractive in situ remediation method for unsaturated soils contaminated with volatile organic compounds (VOCs). SVE involves the generation of air flow through the pores of the contaminated soil to induce transfer of VOCs to the air stream. Air flow is established by pumping from vadose zone wells through which contaminant vapors are collected and transported above ground, where they are treated, if required, and discharged to the atmosphere. The popularity of SVE technologies stems from their proven effectiveness for removing large quantities of VOCs from the soil, their cost competitiveness, and their relatively simple non-intrusive implementation. Widespread field application of SVE has occurred following the success of early laboratory and field scale feasibility studies [Texas Research Institute, 1980, 1984; Thornton and Wootan, 1982; Marley and Hoag, 1984; Crow et al., 1985, 1987]. As many as 18% of Superfund sites employ SVE remediation technologies [Travis and Macinnis, 1992] and numerous articles and reports have documented the application of SVE [e.g. Hutzler et al., 1989; Downey and Elliott, 1990; U.S. EPA, 1991; Sanderson et al., 1993; Gerbasi and Menoli, 1994; McCann et al., 1994].

  8. Using Optimization to Improve Test Planning

    DTIC Science & Technology

    2017-09-01

    friendly and to display the output differently, the test and evaluation test schedule optimization model would be a good tool for the test and... evaluation schedulers. 14. SUBJECT TERMS schedule optimization, test planning 15. NUMBER OF PAGES 223 16. PRICE CODE 17. SECURITY CLASSIFICATION OF...make the input more user-friendly and to display the output differently, the test and evaluation test schedule optimization model would be a good tool

  9. Optimal Bayesian Adaptive Design for Test-Item Calibration.

    PubMed

    van der Linden, Wim J; Ren, Hao

    2015-06-01

    An optimal adaptive design for test-item calibration based on Bayesian optimality criteria is presented. The design adapts the choice of field-test items to the examinees taking an operational adaptive test using both the information in the posterior distributions of their ability parameters and the current posterior distributions of the field-test parameters. Different criteria of optimality based on the two types of posterior distributions are possible. The design can be implemented using an MCMC scheme with alternating stages of sampling from the posterior distributions of the test takers' ability parameters and the parameters of the field-test items while reusing samples from earlier posterior distributions of the other parameters. Results from a simulation study demonstrated the feasibility of the proposed MCMC implementation for operational item calibration. A comparison of performances for different optimality criteria showed faster calibration of substantial numbers of items for the criterion of D-optimality relative to A-optimality, a special case of c-optimality, and random assignment of items to the test takers.
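    The difference between the D- and A-optimality criteria compared in this record can be shown on two hypothetical 2x2 item-information matrices; the matrices are invented for illustration only.

```python
import numpy as np

def d_criterion(info):
    # D-optimality: maximize the determinant of the information matrix
    return np.linalg.det(info)

def a_criterion(info):
    # A-optimality: minimize the trace of the inverse information matrix
    return np.trace(np.linalg.inv(info))

I1 = np.array([[4.0, 0.0], [0.0, 1.0]])  # unbalanced information
I2 = np.array([[2.0, 0.0], [0.0, 2.0]])  # balanced information
# Both matrices have determinant 4, so D-optimality ranks them equal,
# while A-optimality prefers the balanced design I2.
```

    This is why the two criteria can select different field-test items for the same examinee: they weight the joint precision of the parameter estimates differently.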

  10. Traveling-Wave Tube Cold-Test Circuit Optimization Using CST MICROWAVE STUDIO

    NASA Technical Reports Server (NTRS)

    Chevalier, Christine T.; Kory, Carol L.; Wilson, Jeffrey D.; Wintucky, Edwin G.; Dayton, James A., Jr.

    2003-01-01

    The internal optimizer of CST MICROWAVE STUDIO (MWS) was used along with an application-specific Visual Basic for Applications (VBA) script to develop a method to optimize traveling-wave tube (TWT) cold-test circuit performance. The optimization procedure allows simultaneous optimization of circuit specifications including on-axis interaction impedance, bandwidth or geometric limitations. The application of Microwave Studio to TWT cold-test circuit optimization is described.

  11. Flight Test Validation of Optimal Input Design and Comparison to Conventional Inputs

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    1997-01-01

    A technique for designing optimal inputs for aerodynamic parameter estimation was flight tested on the F-18 High Angle of Attack Research Vehicle (HARV). Model parameter accuracies calculated from flight test data were compared on an equal basis for optimal input designs and conventional inputs at the same flight condition. In spite of errors in the a priori input design models and distortions of the input form by the feedback control system, the optimal inputs increased estimated parameter accuracies compared to conventional 3-2-1-1 and doublet inputs. In addition, the tests using optimal input designs demonstrated enhanced design flexibility, allowing the optimal input design technique to use a larger input amplitude to achieve further increases in estimated parameter accuracy without departing from the desired flight test condition. This work validated the analysis used to develop the optimal input designs, and demonstrated the feasibility and practical utility of the optimal input design technique.

  12. Optimal periodic proof test based on cost-effective and reliability criteria

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.

    1976-01-01

    An exploratory study for the optimization of periodic proof tests for fatigue-critical structures is presented. The optimal proof load level and the optimal number of periodic proof tests are determined by minimizing the total expected (statistical average) cost, while the constraint on the allowable level of structural reliability is satisfied. The total expected cost consists of the expected cost of proof tests, the expected cost of structures destroyed by proof tests, and the expected cost of structural failure in service. It is demonstrated by numerical examples that significant cost saving and reliability improvement for fatigue-critical structures can be achieved by the application of the optimal periodic proof test. The present study is relevant to the establishment of optimal maintenance procedures for fatigue-critical structures.
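    The cost trade-off described above can be sketched with a toy model: expected test cost plus expected cost of structures destroyed by testing plus expected cost of in-service failure. Every number and the geometric screening model below are invented for illustration and are not the paper's formulation.

```python
def total_expected_cost(n, c_test=1.0, c_destroy=50.0, c_fail=1000.0,
                        p_destroy=0.01, base_pfail=0.05, screen=0.5):
    """Hypothetical expected cost of n periodic proof tests: each test
    costs c_test, destroys the structure with probability p_destroy, and
    screens out a fraction `screen` of weak structures, reducing the
    in-service failure probability geometrically."""
    exp_destroyed = 1.0 - (1.0 - p_destroy) ** n
    p_fail = base_pfail * (1.0 - screen) ** n
    return n * c_test + exp_destroyed * c_destroy + p_fail * c_fail

# Minimize the total expected cost over the number of proof tests.
best_n = min(range(0, 11), key=total_expected_cost)
```

    In this toy setting a small number of proof tests beats both extremes: no testing leaves the failure cost high, while excessive testing accumulates test and destruction costs.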

  13. Optimal Assembly of Psychological and Educational Tests.

    ERIC Educational Resources Information Center

    van der Linden, Wim J.

    1998-01-01

    Reviews optimal test-assembly literature and introduces the contributions to this special issue. Discusses four approaches to computerized test assembly: (1) heuristic-based test assembly; (2) 0-1 linear programming; (3) network-flow programming; and (4) an optimal design approach. Contains a bibliography of 90 sources on test assembly.…

  14. Integrating Test-Form Formatting into Automated Test Assembly

    ERIC Educational Resources Information Center

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…

  15. Optimized Non-Obstructive Particle Damping (NOPD) Treatment for Composite Honeycomb Structures

    NASA Technical Reports Server (NTRS)

    Panossian, H.

    2008-01-01

    Non-Obstructive Particle Damping (NOPD) technology is a passive vibration damping approach whereby metallic or non-metallic particles in spherical or irregular shapes, of heavy or light consistency, and even liquid particles are placed inside cavities or attached to structures by an appropriate means at strategic locations, to absorb vibration energy. The objective of the work described herein is the development of a design optimization procedure and discussion of test results for such a NOPD treatment on honeycomb (HC) composite structures, based on finite element modeling (FEM) analyses, optimization and tests. Modeling and predictions were performed and tests were carried out to correlate the test data with the FEM. The optimization procedure consisted of defining a global objective function, using finite difference methods, to determine the optimal values of the design variables through quadratic linear programming. The optimization process was carried out by targeting the highest dynamic displacements of several vibration modes of the structure and finding an optimal treatment configuration that will minimize them. An optimal design was thus derived and laboratory tests were conducted to evaluate its performance under different vibration environments. Three honeycomb composite beams, with Nomex core and aluminum face sheets, empty (untreated), uniformly treated with NOPD, and optimally treated with NOPD, according to the analytically predicted optimal design configuration, were tested in the laboratory. It is shown that the beam with optimal treatment has the lowest response amplitude. Described below are results of modal vibration tests and FEM analyses from predictions of the modal characteristics of honeycomb beams under zero, 50% uniform treatment and an optimal NOPD treatment design configuration and verification with test data.

  16. Execution of Multidisciplinary Design Optimization Approaches on Common Test Problems

    NASA Technical Reports Server (NTRS)

    Balling, R. J.; Wilkinson, C. A.

    1997-01-01

    A class of synthetic problems for testing multidisciplinary design optimization (MDO) approaches is presented. These test problems are easy to reproduce because all functions are given as closed-form mathematical expressions. They are constructed in such a way that the optimal value of all variables and the objective is unity. The test problems involve three disciplines and allow the user to specify the number of design variables, state variables, coupling functions, design constraints, controlling design constraints, and the strength of coupling. Several MDO approaches were executed on two sample synthetic test problems. These approaches included single-level optimization approaches, collaborative optimization approaches, and concurrent subspace optimization approaches. Execution results are presented, and the robustness and efficiency of these approaches are evaluated for these sample problems.

  17. Data mining-based coefficient of influence factors optimization of test paper reliability

    NASA Astrophysics Data System (ADS)

    Xu, Peiyao; Jiang, Huiping; Wei, Jieyao

    2018-05-01

    Testing is a significant part of the teaching process. It demonstrates the final outcome of school teaching through teachers' teaching level and students' scores. The analysis of test papers is a complex operation characterized by non-linear relations among the length of the paper, the time duration, and the degree of difficulty. It is therefore difficult, with general methods, to optimize the coefficients of the influence factors under different conditions so as to obtain test papers with clearly higher reliability [1]. With data mining techniques such as Support Vector Regression (SVR) and Genetic Algorithms (GA), we can model the test paper analysis and optimize the coefficients of the influence factors for higher reliability. The test results show that the combination of SVR and GA yields an effective improvement in reliability. Optimizing the coefficients of the influence factors is practical in actual application, and the whole optimizing operation can offer a model basis for test paper analysis.

  18. An improved marriage in honey bees optimization algorithm for single objective unconstrained optimization.

    PubMed

    Celik, Yuksel; Ulker, Erkan

    2013-01-01

    Marriage in honey bees optimization (MBO) is a metaheuristic optimization algorithm developed by inspiration of the mating and fertilization process of honey bees and is a kind of swarm intelligence optimizations. In this study we propose improved marriage in honey bees optimization (IMBO) by adding Levy flight algorithm for queen mating flight and neighboring for worker drone improving. The IMBO algorithm's performance and its success are tested on the well-known six unconstrained test functions and compared with other metaheuristic optimization algorithms.
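    The Levy-flight perturbation that IMBO adds to the queen's mating flight can be sketched with Mantegna's generator for Levy-stable steps. This generic one-dimensional generator is our illustration of the operator, not the authors' exact implementation.

```python
import math
import random

def levy_step(beta=1.5):
    """One-dimensional Levy-flight step via Mantegna's algorithm:
    occasionally produces very large jumps, which helps an optimizer
    escape local optima during the mating flight."""
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

random.seed(0)
steps = [levy_step() for _ in range(1000)]
```

    The heavy-tailed step distribution is the point: most moves stay local, but rare long jumps re-seed the search elsewhere.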

  19. An Improved Marriage in Honey Bees Optimization Algorithm for Single Objective Unconstrained Optimization

    PubMed Central

    Celik, Yuksel; Ulker, Erkan

    2013-01-01

    Marriage in honey bees optimization (MBO) is a metaheuristic optimization algorithm developed by inspiration of the mating and fertilization process of honey bees and is a kind of swarm intelligence optimizations. In this study we propose improved marriage in honey bees optimization (IMBO) by adding Levy flight algorithm for queen mating flight and neighboring for worker drone improving. The IMBO algorithm's performance and its success are tested on the well-known six unconstrained test functions and compared with other metaheuristic optimization algorithms. PMID:23935416

  20. Quality of life, optimism/pessimism, and knowledge and attitudes toward HIV Screening among pregnant women in Ghana.

    PubMed

    Moyer, Cheryl A; Ekpo, Geraldine; Calhoun, Cecilia L; Greene, Jonathan; Naik, Sujata; Sippola, Emily; Stern, David T; Adanu, Richard M; Koranteng, Isaac O; Kwawukume, Enyonam Yao; Anderson, Frank J

    2008-01-01

    We sought to explore optimism/pessimism, knowledge of HIV, and attitudes toward HIV screening and treatment among Ghanaian pregnant women. Pregnant women in Accra, Ghana, completed a self-administered questionnaire including the Life Orientation Test-Revised (LOT-R, an optimism/pessimism measure), an HIV knowledge and screening attitudes questionnaire, the Short Form 12 (SF-12, a measure of health-related quality of life [HRQOL]), and a demographic questionnaire. Data were analyzed using t-tests, ANOVA, correlations, and the χ² test. There were 101 participants; 28% were nulliparous. Mean age was 29.7 years, and mean week of gestation was 31.8. All women had heard of AIDS, 27.7% had been tested for HIV before this pregnancy, 46.5% had been tested during this pregnancy, and 59.4% of the sample had ever been tested for HIV. Of those not tested during this pregnancy, 64.2% were willing to be tested. Of all respondents, 89% said they would get tested if antiretroviral drugs (ARVs) were readily available and might prevent maternal-to-child transmission. Neither optimism/pessimism nor HRQOL was associated with attitudes toward HIV screening. Optimism was negatively correlated with HIV knowledge (p = .001) and was positively correlated with having never been tested before this pregnancy (p = .007). The relationship between optimism/pessimism and HIV knowledge and screening behavior is worthy of further study using larger samples and objective measures of testing beyond self-report.

  1. Experimental test of an online ion-optics optimizer

    NASA Astrophysics Data System (ADS)

    Amthor, A. M.; Schillaci, Z. M.; Morrissey, D. J.; Portillo, M.; Schwarz, S.; Steiner, M.; Sumithrarachchi, Ch.

    2018-07-01

    A technique has been developed and tested to automatically adjust multiple electrostatic or magnetic multipoles on an ion optical beam line - according to a defined optimization algorithm - until an optimal tune is found. This approach simplifies the process of determining high-performance optical tunes, satisfying a given set of optical properties, for an ion optical system. The optimization approach is based on the particle swarm method and is entirely model independent, thus the success of the optimization does not depend on the accuracy of an extant ion optical model of the system to be optimized. Initial test runs of a first order optimization of a low-energy (<60 keV) all-electrostatic beamline at the NSCL show reliable convergence of nine quadrupole degrees of freedom to well-performing tunes within a reasonable number of trial solutions, roughly 500, with full beam optimization run times of roughly two hours. Improved tunes were found both for quasi-local optimizations and for quasi-global optimizations, indicating a good ability of the optimizer to find a solution with or without a well defined set of initial multipole settings.
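    The model-independent, measurement-driven tuning described here rests on the particle swarm method: each trial setting only needs a measured objective value, never an optical model. A minimal sketch follows, with a sphere function standing in for the beam-quality measurement; all parameter values are hypothetical.

```python
import random

def pso(objective, dim, n_particles=20, iters=100, lo=-5.0, hi=5.0, seed=1):
    """Minimal particle swarm minimizer. Only the scalar objective value
    of each trial point is used, so it is entirely model independent."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and attraction weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Nine degrees of freedom, echoing the nine quadrupoles in the record.
best, best_val = pso(lambda x: sum(xi * xi for xi in x), dim=9)
```

    In the online setting each `objective` call is a beam measurement, which is why the record's roughly 500 trial solutions translate into hours of run time.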

  2. Vertical radar profiles for the calibration of unsaturated flow models under dynamic water table conditions

    NASA Astrophysics Data System (ADS)

    Cassiani, G.; Gallotti, L.; Ventura, V.; Andreotti, G.

    2003-04-01

    The identification of flow and transport characteristics in the vadose zone is a fundamental step towards understanding the dynamics of contaminated sites and the resulting risk of groundwater pollution. Borehole radar has gained popularity for the monitoring of moisture content changes, thanks to its apparent simplicity and its high resolution characteristics. However, cross-hole radar requires closely spaced (a few meters), plastic-cased boreholes, which are rarely available as a standard feature in sites of practical interest. Unlike cross-hole applications, Vertical Radar Profiles (VRP) require only one borehole, with practical and financial benefits. High-resolution, time-lapse VRPs have been acquired at a crude oil contaminated site in Trecate, Northern Italy, on a few existing boreholes originally developed for remediation via bioventing. The dynamic water table conditions, with yearly oscillations of roughly 5 m from 6 to 11 m bgl, offer a good opportunity to observe via VRP a field scale drainage-imbibition process. Arrival time inversion has been carried out using a regularized tomographic algorithm, in order to overcome the noise introduced by first arrival picking. Interpretation of the vertical profiles in terms of moisture content has been based on standard models (Topp et al., 1980; Roth et al., 1990). The sedimentary sequence manifests itself as a cyclic pattern in moisture content over most of the profiles. We performed preliminary Richards' equation simulations with time-varying water table boundary conditions, in order to estimate the unsaturated flow parameters, and the results have been compared with laboratory evidence from cores.

  3. GMOtrack: generator of cost-effective GMO testing strategies.

    PubMed

    Novak, Petra Krau; Gruden, Kristina; Morisset, Dany; Lavrac, Nada; Stebih, Dejan; Rotter, Ana; Zel, Jana

    2009-01-01

    Commercialization of numerous genetically modified organisms (GMOs) has already been approved worldwide, and several additional GMOs are in the approval process. Many countries have adopted legislation to deal with GMO-related issues such as food safety, environmental concerns, and consumers' right of choice, making GMO traceability a necessity. The growing extent of GMO testing makes it important to study optimal GMO detection and identification strategies. This paper formally defines the problem of routine laboratory-level GMO tracking as a cost optimization problem, thus proposing a shift from "the same strategy for all samples" to "sample-centered GMO testing strategies." An algorithm (GMOtrack) for finding optimal two-phase (screening-identification) testing strategies is proposed. The advantages of cost optimization with increasing GMO presence on the market are demonstrated, showing that optimization approaches to analytic GMO traceability can result in major cost reductions. The optimal testing strategies are laboratory-dependent, as the costs depend on prior probabilities of local GMO presence, which are exemplified on food and feed samples. The proposed GMOtrack approach, publicly available under the terms of the General Public License, can be extended to other domains where complex testing is involved, such as safety and quality assurance in the food supply chain.
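
    The two-phase (screening-identification) cost optimization can be illustrated by brute-force enumeration over screening subsets. The screening elements, costs, and prior probabilities below are invented stand-ins, not GMOtrack's actual data or algorithm; the expected-cost model is a simplified approximation.

```python
from itertools import combinations

# Hypothetical screening elements: (cost, GMO lines the screen can flag).
# Priors are assumed local probabilities of each GMO appearing in samples.
SCREENS = {"p35S": (5.0, {"gmoA", "gmoB"}), "tNOS": (5.0, {"gmoB", "gmoC"})}
ID_COST = 20.0                       # cost of one event-specific identification test
PRIOR = {"gmoA": 0.3, "gmoB": 0.1, "gmoC": 0.05}

def expected_cost(chosen):
    """Phase-1 screening cost plus expected phase-2 identification cost."""
    screen_cost = sum(SCREENS[s][0] for s in chosen)
    covered = set().union(*(SCREENS[s][1] for s in chosen)) if chosen else set()
    # GMOs not covered by any screen must always be identified directly;
    # covered GMOs need identification only when a screen fires (approximated
    # here by the prior probability that the GMO is present).
    id_cost = sum(ID_COST * (PRIOR[g] if g in covered else 1.0) for g in PRIOR)
    return screen_cost + id_cost

subsets = [c for r in range(len(SCREENS) + 1)
           for c in combinations(SCREENS, r)]
best = min(subsets, key=expected_cost)
```

    Because the priors differ between laboratories, the cheapest subset differs too, which is the sense in which optimal strategies are laboratory-dependent.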

  4. Optimal Test Design with Rule-Based Item Generation

    ERIC Educational Resources Information Center

    Geerlings, Hanneke; van der Linden, Wim J.; Glas, Cees A. W.

    2013-01-01

    Optimal test-design methods are applied to rule-based item generation. Three different cases of automated test design are presented: (a) test assembly from a pool of pregenerated, calibrated items; (b) test generation on the fly from a pool of calibrated item families; and (c) test generation on the fly directly from calibrated features defining…

  5. Social support, acculturation, and optimism: understanding positive health practices in Asian American college students.

    PubMed

    Ayres, Cynthia G; Mahat, Ganga

    2012-07-01

    This study developed and tested a theory to better understand positive health practices (PHP) among Asian Americans aged 18 to 21 years. It tested theoretical relationships postulated between PHP and (a) social support (SS), (b) optimism, and (c) acculturation, and between SS and optimism and acculturation. Optimism and acculturation were also tested as possible mediators in the relationship between SS and PHP. A correlational study design was used. A convenience sample of 163 Asian college students in an urban setting completed four questionnaires assessing SS, PHP, optimism, and acculturation and one demographic questionnaire. There were statistically significant positive relationships between SS and optimism with PHP, between acculturation and PHP, and between optimism and SS. Optimism mediated the relationship between SS and PHP, whereas acculturation did not. Findings extend knowledge regarding these relationships to a defined population of Asian Americans aged 18 to 21 years. Findings contribute to a more comprehensive knowledge base regarding health practices among Asian Americans. The theoretical and empirical findings of this study provide the direction for future research as well. Further studies need to be conducted to identify and test other mediators in order to better understand the relationship between these two variables.

  6. Design of Quiet Rotorcraft Approach Trajectories: Verification Phase

    NASA Technical Reports Server (NTRS)

    Padula, Sharon L.

    2010-01-01

    Flight testing that is planned for October 2010 will provide an opportunity to evaluate rotorcraft trajectory optimization techniques. The flight test will involve a fully instrumented MD-902 helicopter, which will be flown over an array of microphones. In this work, the helicopter approach trajectory is optimized via a multiobjective genetic algorithm to improve community noise, passenger comfort, and pilot acceptance. Previously developed optimization strategies are modified to accommodate new helicopter data and to increase pilot acceptance. This paper describes the MD-902 trajectory optimization plus general optimization strategies and modifications that are needed to reduce the uncertainty in noise predictions. The constraints that are imposed by the flight test conditions and characteristics of the MD-902 helicopter limit the testing possibilities. However, the insights that will be gained through this research will prove highly valuable.

  7. Artificial Bee Colony Optimization for Short-Term Hydrothermal Scheduling

    NASA Astrophysics Data System (ADS)

    Basu, M.

    2014-12-01

    Artificial bee colony optimization is applied to determine the optimal hourly schedule of power generation in a hydrothermal system. Artificial bee colony optimization is a swarm-based algorithm inspired by the food foraging behavior of honey bees. The algorithm is tested on a multi-reservoir cascaded hydroelectric system having prohibited operating zones and thermal units with valve point loading. The ramp-rate limits of thermal generators are taken into consideration. The transmission losses are also accounted for through the use of loss coefficients. The algorithm is tested on two hydrothermal multi-reservoir cascaded hydroelectric test systems. The results of the proposed approach are compared with those of differential evolution, evolutionary programming and particle swarm optimization. From numerical results, it is found that the proposed artificial bee colony optimization-based approach is able to provide a better solution.
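
    A minimal sketch of the artificial bee colony loop (employed, onlooker, and scout phases). The cost function here is a stand-in sphere, not the hydrothermal scheduling model, and the deterministic onlooker preference simplifies the usual fitness-proportional roulette selection.

```python
import random

def abc_minimize(f, dim, bounds, n_food=10, iters=150, limit=20, seed=0):
    """Minimal artificial bee colony: employed, onlooker, and scout phases."""
    rng = random.Random(seed)
    lo, hi = bounds
    foods = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_food)]
    vals = [f(x) for x in foods]
    trials = [0] * n_food
    j = min(range(n_food), key=lambda i: vals[i])
    gbest, gbest_val = foods[j][:], vals[j]         # best source ever seen

    def try_improve(i):
        nonlocal gbest, gbest_val
        k = rng.randrange(n_food)                   # random partner source
        d = rng.randrange(dim)                      # random coordinate to move
        cand = foods[i][:]
        cand[d] += rng.uniform(-1, 1) * (foods[i][d] - foods[k][d])
        cand[d] = min(max(cand[d], lo), hi)
        v = f(cand)
        if v < vals[i]:                             # greedy replacement
            foods[i], vals[i], trials[i] = cand, v, 0
            if v < gbest_val:
                gbest, gbest_val = cand[:], v
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                     # employed bee phase
            try_improve(i)
        ranked = sorted(range(n_food), key=lambda i: vals[i])
        for i in ranked[:n_food // 2]:              # onlookers favor good sources
            try_improve(i)
        for i in range(n_food):                     # scouts reset exhausted sources
            if trials[i] > limit:
                foods[i] = [rng.uniform(lo, hi) for _ in range(dim)]
                vals[i], trials[i] = f(foods[i]), 0
    return gbest, gbest_val

# Hypothetical stand-in for the hourly generation-schedule cost: a 4-variable sphere.
best, best_val = abc_minimize(lambda x: sum(v * v for v in x), dim=4,
                              bounds=(-10.0, 10.0))
```

    A real scheduling run would add constraint handling for prohibited operating zones, ramp-rate limits, and reservoir balances on top of this loop.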

  8. A Multi-Verse Optimizer with Levy Flights for Numerical Optimization and Its Application in Test Scheduling for Network-on-Chip.

    PubMed

    Hu, Cong; Li, Zhi; Zhou, Tian; Zhu, Aijun; Xu, Chuanpei

    2016-01-01

    We propose a new meta-heuristic algorithm named the Levy flights multi-verse optimizer (LFMVO), which incorporates Levy flights into the multi-verse optimizer (MVO) algorithm to solve numerical and engineering optimization problems. The original MVO easily falls into stagnation when wormholes stochastically re-span a number of universes (solutions) around the best universe achieved over the course of iterations. Since Levy flights are superior in exploring unknown, large-scale search spaces, they are integrated into the previous best universe to force MVO out of stagnation. We test this method on three sets of 23 well-known benchmark test functions and an NP-complete problem of test scheduling for Network-on-Chip (NoC). Experimental results prove that the proposed LFMVO is more competitive than its peers in both the quality of the resulting solutions and convergence speed.
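
    Lévy-flight steps of the kind LFMVO injects are commonly drawn with Mantegna's algorithm; the heavy-tailed step distribution occasionally produces large jumps, which is what lets a stagnated search escape the neighborhood of the current best. The perturbation of a hypothetical best-universe vector below is an illustration, not the paper's exact update rule.

```python
import math
import random

def levy_step(beta=1.5, rng=random):
    """Draw one Lévy-flight step via Mantegna's algorithm (stability index beta)."""
    # Mantegna's sigma for the numerator Gaussian.
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u = rng.gauss(0, sigma)     # heavy-tailed numerator
    v = rng.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

# Perturb a (hypothetical) best-universe vector with scaled Lévy steps.
rng = random.Random(42)
best = [0.3, -1.2, 0.8]
candidate = [x + 0.01 * levy_step(rng=rng) for x in best]
```

    Most steps stay small, so exploitation around the best universe is preserved; the rare long jump supplies the global exploration the authors credit for avoiding stagnation.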

  9. A Multi-Verse Optimizer with Levy Flights for Numerical Optimization and Its Application in Test Scheduling for Network-on-Chip

    PubMed Central

    Hu, Cong; Li, Zhi; Zhou, Tian; Zhu, Aijun; Xu, Chuanpei

    2016-01-01

    We propose a new meta-heuristic algorithm named the Levy flights multi-verse optimizer (LFMVO), which incorporates Levy flights into the multi-verse optimizer (MVO) algorithm to solve numerical and engineering optimization problems. The original MVO easily falls into stagnation when wormholes stochastically re-span a number of universes (solutions) around the best universe achieved over the course of iterations. Since Levy flights are superior in exploring unknown, large-scale search spaces, they are integrated into the previous best universe to force MVO out of stagnation. We test this method on three sets of 23 well-known benchmark test functions and an NP-complete problem of test scheduling for Network-on-Chip (NoC). Experimental results prove that the proposed LFMVO is more competitive than its peers in both the quality of the resulting solutions and convergence speed. PMID:27926946

  10. OPTIMIZATION OF MUD HAMMER DRILLING PERFORMANCE--A PROGRAM TO BENCHMARK THE VIABILITY OF ADVANCED MUD HAMMER DRILLING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnis Judzis

    2004-07-01

    This document details the progress to date on the "OPTIMIZATION OF MUD HAMMER DRILLING PERFORMANCE--A PROGRAM TO BENCHMARK THE VIABILITY OF ADVANCED MUD HAMMER DRILLING" contract for the quarter starting April 2004 through June 2004. The DOE and TerraTek continue to wait for Novatek on the optimization portion of the testing program (they are completely rebuilding their fluid hammer). The latest indication is that the Novatek tool would be ready for retesting only 4Q 2004 or later. Smith International's hammer was tested in April of 2004 (2Q 2004 report). Accomplishments included the following: (1) TerraTek re-tested the "optimized" fluid hammer provided by Smith International during April 2004. Many improvements in mud hammer rates of penetration were noted over Phase 1 benchmark testing from November 2002. (2) Shell Exploration and Production in The Hague was briefed on various drilling performance projects including Task 8 "Cutter Impact Testing". Shell's interest and willingness to assist in the test matrix as an Industry Advisor is appreciated. (3) TerraTek participated in a DOE/NETL review meeting at Morgantown on April 15, 2004. The discussions were very helpful, and a program related to the Mud Hammer optimization project was noted: Terralog modeling work on percussion tools. (4) Terralog's Dr. Gang Han witnessed some of the full-scale optimization testing of the Smith International hammer in order to familiarize him with downhole tools. TerraTek recommends that modeling first start with single cutters/inserts and progress in complexity. (5) The final equipment problem on the impact testing task was resolved through the acquisition of a high-data-rate laser-based displacement instrument. (6) TerraTek provided Novatek much engineering support for the future re-testing of their optimized tool. Work was conducted on slip ring [electrical] specifications and tool collar sealing in the testing vessel with a reconfigured flow system on Novatek's collar.

  11. Integrated testing strategies can be optimal for chemical risk classification.

    PubMed

    Raseta, Marko; Pitchford, Jon; Cussens, James; Doe, John

    2017-08-01

    There is an urgent need to refine strategies for testing the safety of chemical compounds. This need arises both from the financial and ethical costs of animal tests, but also from the opportunities presented by new in vitro and in silico alternatives. Here we explore the mathematical theory underpinning the formulation of optimal testing strategies in toxicology. We show how the costs and imprecisions of the various tests, and the variability in exposures and responses of individuals, can be assembled rationally to form a Markov Decision Problem. We compute the corresponding optimal policies using well-developed theory based on dynamic programming, thereby identifying and overcoming some methodological and logical inconsistencies which may exist in current toxicological testing. By illustrating our methods for two simple but readily generalisable examples we show how so-called integrated testing strategies, where information of different precisions from different sources is combined and where different initial test outcomes lead to different sets of future tests, can arise naturally as optimal policies. Copyright © 2017 Elsevier Inc. All rights reserved.
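
    The Markov-decision formulation can be made concrete with a toy two-test problem solved by a Bellman backup; all probabilities and costs below are illustrative assumptions, not values from the paper. With these numbers, the integrated strategy (a cheap screen followed by a conditional precise test) comes out cheaper than going straight to the precise test.

```python
# States: "start" -> run the cheap screen or the precise test; after a cheap
# "pos"/"neg" result -> classify, or buy the precise test. A wrong terminal
# classification carries a penalty. All numbers are illustrative assumptions.

P_TOXIC = 0.2                        # prior probability the chemical is toxic
CHEAP_COST, PRECISE_COST = 1.0, 10.0
CHEAP_SENS, CHEAP_SPEC = 0.8, 0.9    # cheap-test sensitivity / specificity
MISCLASS_PENALTY = 100.0

def posterior(prior, positive):
    """Bayes update of P(toxic) after a cheap-test result."""
    p_pos = CHEAP_SENS * prior + (1 - CHEAP_SPEC) * (1 - prior)
    return (CHEAP_SENS * prior / p_pos) if positive else \
           ((1 - CHEAP_SENS) * prior / (1 - p_pos))

def classify_cost(p_toxic):
    """Expected penalty of the best terminal classification."""
    return MISCLASS_PENALTY * min(p_toxic, 1 - p_toxic)

def value(p_toxic):
    """Bellman backup: stop and classify, or buy the (assumed perfect) precise test."""
    stop = classify_cost(p_toxic)
    return (PRECISE_COST, "precise-test") if PRECISE_COST < stop else (stop, "classify")

# Optimal policy at the root: compare "cheap screen first" with "precise test now".
p_pos = CHEAP_SENS * P_TOXIC + (1 - CHEAP_SPEC) * (1 - P_TOXIC)
cost_after = (p_pos * value(posterior(P_TOXIC, True))[0]
              + (1 - p_pos) * value(posterior(P_TOXIC, False))[0])
cheap_first = CHEAP_COST + cost_after
direct = value(P_TOXIC)[0]
best_root = min(cheap_first, direct)
```

    The optimal policy buys the precise test only after a positive screen (where the posterior is high) and classifies directly after a negative one, which is exactly the branching behavior the abstract describes.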

  12. Motivations for genetic testing for lung cancer risk among young smokers.

    PubMed

    O'Neill, Suzanne C; Lipkus, Isaac M; Sanderson, Saskia C; Shepperd, James; Docherty, Sharron; McBride, Colleen M

    2013-11-01

    To examine why young people might want to undergo genetic susceptibility testing for lung cancer despite knowing that tested gene variants are associated with small increases in disease risk. The authors used a mixed-method approach to evaluate motives for and against genetic testing and the association between these motivations and testing intentions in 128 college students who smoke. Exploratory factor analysis yielded four reliable factors: Test Scepticism, Test Optimism, Knowledge Enhancement and Smoking Optimism. Test Optimism and Knowledge Enhancement correlated positively with intentions to test in bivariate and multivariate analyses (ps<0.001). Test Scepticism correlated negatively with testing intentions in multivariate analyses (p<0.05). Open-ended questions assessing testing motivations generally replicated themes of the quantitative survey. In addition to learning about health risks, young people may be motivated to seek genetic testing for reasons such as gaining knowledge about new genetic technologies more broadly.

  13. Fuel management optimization using genetic algorithms and expert knowledge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeChaine, M.D.; Feltus, M.A.

    1996-09-01

    The CIGARO fuel management optimization code based on genetic algorithms is described and tested. The test problem optimized the core lifetime for a pressurized water reactor with a penalty function constraint on the peak normalized power. A bit-string genotype encoded the loading patterns, and genotype bias was reduced with additional bits. Expert knowledge about fuel management was incorporated into the genetic algorithm. Regional crossover exchanged physically adjacent fuel assemblies and improved the optimization slightly. Biasing the initial population toward a known priority table significantly improved the optimization.
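
    A bit-string genetic algorithm of the kind described here can be sketched as follows. The toy fitness (a "lifetime" score with a penalty once a "peak-power" bit budget is exceeded) and all parameters are invented stand-ins for the reactor model, and the expert-knowledge refinements in the record (regional crossover, priority-table biasing) are omitted.

```python
import random

def ga(fitness, n_bits, pop_size=30, gens=60, p_mut=0.02, seed=1):
    """Maximize `fitness` over bit strings with a simple genetic algorithm."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                      # elitism: keep the two best
        while len(next_pop) < pop_size:
            # Tournament selection of two parents.
            a = max(rng.sample(pop, 3), key=fitness)
            b = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, n_bits)         # one-point crossover
            child = a[:cut] + b[cut:]
            # Bit-flip mutation.
            child = [bit ^ (rng.random() < p_mut) for bit in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Hypothetical stand-in objective: "lifetime" rewards set bits, with a penalty
# term once more than 12 bits are set (mimicking a peak-power constraint).
def lifetime_with_penalty(bits):
    ones = sum(bits)
    return ones - 10 * max(0, ones - 12)

best = ga(lifetime_with_penalty, n_bits=20)
```

    The penalty-function constraint handling mirrors the record's treatment of peak normalized power: infeasible loading patterns are not excluded outright but scored down steeply.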

  14. Multiobjective optimization approach: thermal food processing.

    PubMed

    Abakarov, A; Sushkov, Y; Almonacid, S; Simpson, R

    2009-01-01

    The objective of this study was to utilize a multiobjective optimization technique for the thermal sterilization of packaged foods. The multiobjective optimization approach used in this study is based on the optimization of well-known aggregating functions by an adaptive random search algorithm. The applicability of the proposed approach was illustrated by solving widely used multiobjective test problems taken from the literature. The numerical results obtained for the multiobjective test problems and for the thermal processing problem show that the proposed approach can be effectively used for solving multiobjective optimization problems arising in the food engineering field.
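
    Aggregating several objectives into a single function and attacking it with random search can be sketched as below. Plain random search stands in for the paper's adaptive random search, and the two competing objectives (imagined here as, say, process time versus nutrient loss) and their weights are invented.

```python
import random

def random_search(objectives, weights, bounds, iters=2000, seed=0):
    """Minimize a weighted-sum aggregation of several 2-variable objectives
    by plain random search (a simplified stand-in for adaptive random search)."""
    rng = random.Random(seed)
    lo, hi = bounds

    def aggregate(x):
        # Weighted-sum aggregating function: one scalar from many objectives.
        return sum(w * f(x) for w, f in zip(weights, objectives))

    best = [rng.uniform(lo, hi) for _ in range(2)]
    best_val = aggregate(best)
    for _ in range(iters):
        cand = [rng.uniform(lo, hi) for _ in range(2)]
        val = aggregate(cand)
        if val < best_val:
            best, best_val = cand, val
    return best, best_val

# Two illustrative competing objectives with different minimizers.
f1 = lambda x: (x[0] - 1) ** 2 + x[1] ** 2
f2 = lambda x: x[0] ** 2 + (x[1] - 1) ** 2
best, best_val = random_search([f1, f2], weights=[0.5, 0.5], bounds=(-2.0, 2.0))
```

    Sweeping the weights and re-solving traces out different compromise solutions, which is how aggregating functions expose the trade-off surface between the objectives.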

  15. Optimization applications in aircraft engine design and test

    NASA Technical Reports Server (NTRS)

    Pratt, T. K.

    1984-01-01

    Starting with the NASA-sponsored STAEBL program, optimization methods based primarily upon the versatile program COPES/CONMIN were introduced over the past few years to a broad spectrum of engineering problems in structural optimization, engine design, engine test, and more recently, manufacturing processes. By automating design and testing processes, many repetitive and costly trade-off studies have been replaced by optimization procedures. Rather than taking engineers and designers out of the loop, optimization has, in fact, put them more in control by providing sophisticated search techniques. The ultimate decision whether to accept or reject an optimal feasible design still rests with the analyst. Feedback obtained from this decision process has been invaluable since it can be incorporated into the optimization procedure to make it more intelligent. On several occasions, optimization procedures have produced novel designs, such as the nonsymmetric placement of rotor case stiffener rings, not anticipated by engineering designers. In another case, a particularly difficult resonance constraint could not be satisfied using hand iterations for a compressor blade; when the STAEBL program was applied to the problem, a feasible solution was obtained in just two iterations.

  16. VDLLA: A virtual daddy-long legs optimization

    NASA Astrophysics Data System (ADS)

    Yaakub, Abdul Razak; Ghathwan, Khalil I.

    2016-08-01

    Swarm intelligence is a class of strong optimization algorithms based on the biological behavior of insects or animals. The success of any optimization algorithm depends on the balance between exploration and exploitation. In this paper, we present a new swarm intelligence algorithm, based on the virtual behavior of the daddy-long-legs spider (VDLLA). In VDLLA, each agent (spider) has nine positions representing the legs of the spider, and each position represents one solution. The proposed VDLLA is tested on four standard functions using average fitness, median fitness, and standard deviation. The results of the proposed VDLLA have been compared against Particle Swarm Optimization (PSO), Differential Evolution (DE) and the Bat-Inspired Algorithm (BA). Additionally, a t-test has been conducted to show the significant difference between the proposed algorithm and the others. VDLLA showed very promising results on benchmark test functions for unconstrained optimization problems and also significantly improved on the original swarm algorithms.

  17. Advanced in-duct sorbent injection for SO{sub 2} control. Topical report No. 2, Subtask 2.2: Design optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rosenhoover, W.A.; Stouffer, M.R.; Withum, J.A.

    1994-12-01

    The objective of this research project is to develop second-generation duct injection technology as a cost-effective SO{sub 2} control option for the 1990 Clean Air Act Amendments. Research is focused on the Advanced Coolside process, which has shown the potential for achieving the performance targets of 90% SO{sub 2} removal and 60% sorbent utilization. In Subtask 2.2, Design Optimization, process improvement was sought by optimizing sorbent recycle and by optimizing process equipment for reduced cost. The pilot plant recycle testing showed that 90% SO{sub 2} removal could be achieved at sorbent utilizations up to 75%. This testing also showed that the Advanced Coolside process has the potential to achieve very high removal efficiency (90 to greater than 99%). Two alternative contactor designs were developed, tested and optimized through pilot plant testing; the improved designs will reduce process costs significantly, while maintaining operability and performance essential to the process. Also, sorbent recycle handling equipment was optimized to reduce cost.

  18. Gaussian process regression for geometry optimization

    NASA Astrophysics Data System (ADS)

    Denzel, Alexander; Kästner, Johannes

    2018-03-01

    We implemented a geometry optimizer based on Gaussian process regression (GPR) to find minimum structures on potential energy surfaces. We tested both a twice-differentiable form of the Matérn kernel and the squared exponential kernel. The Matérn kernel performs much better. We give a detailed description of the optimization procedures. These include overshooting the step resulting from GPR in order to obtain a higher degree of interpolation vs. extrapolation. In a benchmark against the Limited-memory Broyden-Fletcher-Goldfarb-Shanno optimizer of the DL-FIND library on 26 test systems, we found the new optimizer to generally reduce the number of required optimization steps.
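
    The heart of a GPR-based optimizer is the posterior-mean surrogate fitted to previously sampled energies; the optimizer then searches the surrogate instead of the expensive surface. The sketch below uses the squared-exponential kernel on an assumed 1-D toy surface (the optimizer described works on full molecular geometries, and the kernel length scale and nugget here are illustrative choices).

```python
import math

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            factor = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= factor * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def sq_exp(a, b, length=1.0):
    """Squared-exponential (RBF) covariance between two 1-D points."""
    return math.exp(-0.5 * ((a - b) / length) ** 2)

def gpr_mean(x_train, y_train, x_query, noise=1e-8):
    """GPR posterior mean: the surrogate surface a GPR optimizer searches."""
    n = len(x_train)
    K = [[sq_exp(x_train[i], x_train[j]) + (noise if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    alpha = solve(K, list(y_train))
    return [sum(sq_exp(q, x_train[j]) * alpha[j] for j in range(n))
            for q in x_query]

# Toy 1-D "potential energy surface" (an assumed stand-in for a real PES).
xs = [-2.0 + 0.5 * i for i in range(9)]
ys = [x * x for x in xs]
grid = [-2.0 + 0.02 * i for i in range(201)]
surrogate = gpr_mean(xs, ys, grid)
x_min = grid[min(range(201), key=lambda i: surrogate[i])]   # next candidate geometry
```

    Each new energy evaluation at the surrogate's minimum refines the fit, which is why such optimizers can need fewer steps than quasi-Newton methods on expensive surfaces.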

  19. [Simulation on remediation of benzene contaminated groundwater by air sparging].

    PubMed

    Fan, Yan-Ling; Jiang, Lin; Zhang, Dan; Zhong, Mao-Sheng; Jia, Xiao-Yang

    2012-11-01

    Air sparging (AS) is one of the in situ remedial technologies used in groundwater remediation for pollution with volatile organic compounds (VOCs). At present, the field design of air sparging systems is mainly based on experience due to the lack of field data. In order to obtain rational design parameters, the TMVOC module in the Petrasim software package, combined with field test results from a coking plant in Beijing, is used to optimize the design parameters and simulate the remediation process. The pilot test showed that the optimal injection rate was 23.2 m³·h⁻¹, while the optimal radius of influence (ROI) was 5 m. The simulation results revealed that the pressure response simulated by the model matched well with the field test results, which indicated a good representation of the simulation. The optimization results indicated that the optimal injection location was at the bottom of the aquifer. Furthermore, simulated at the optimized injection location, the optimal injection rate was 20 m³·h⁻¹, which was in accordance with the field test result. Besides, 3 m was the optimal ROI, less than the field test results, and the main reason was that the field test reflected the flow behavior at the upper space of the groundwater and the unsaturated area, in which the width of flow increased rapidly and became bigger than the actual one. With the above optimized operation parameters, in addition to the hydro-geological parameters measured on site, the model simulation result revealed that 90 days were needed to remediate the benzene from 371,000 µg·L⁻¹ to 1 µg·L⁻¹ for the site, and that the operation model in which the injection wells were progressively turned off once the groundwater around them was "clean" was better than the one in which all the wells were kept operating throughout the remediation process.

  20. Standardization and validation of a cytometric bead assay to assess antibodies to multiple Plasmodium falciparum recombinant antigens.

    PubMed

    Ondigo, Bartholomew N; Park, Gregory S; Gose, Severin O; Ho, Benjamin M; Ochola, Lyticia A; Ayodo, George O; Ofulla, Ayub V; John, Chandy C

    2012-12-21

    Multiplex cytometric bead assays (CBA) have a number of advantages over ELISA for antibody testing, but little information is available on standardization and validation of antibody CBA to multiple Plasmodium falciparum antigens. The present study set out to determine optimal parameters for multiplex testing of antibodies to P. falciparum antigens, and to compare results of multiplex CBA to ELISA. Antibodies to ten recombinant P. falciparum antigens were measured by CBA and ELISA in samples from 30 individuals from a malaria endemic area of Kenya and compared to known positive and negative control plasma samples. Optimal antigen amounts, monoplex vs. multiplex testing, plasma dilution, optimal buffer, and the number of beads required were assessed for CBA testing, and results from CBA vs. ELISA testing were compared. Optimal amounts for CBA antibody testing differed according to antigen. Results for monoplex CBA testing correlated strongly with multiplex testing for all antigens (r = 0.88-0.99, P values from <0.0001 to 0.004), and antibodies to variants of the same antigen were accurately distinguished within a multiplex reaction. Plasma dilutions of 1:100 or 1:200 were optimal for all antigens for CBA testing. Plasma diluted in a buffer containing 0.05% sodium azide, 0.5% polyvinylalcohol, and 0.8% polyvinylpyrrolidone had the lowest background activity. CBA median fluorescence intensity (MFI) values with 1,000 antigen-conjugated beads/well did not differ significantly from MFI with 5,000 beads/well. CBA and ELISA results correlated well for all antigens except apical membrane antigen-1 (AMA-1). CBA testing produced a greater range of values in samples from malaria endemic areas and less background reactivity for blank samples than ELISA. With optimization, CBA may be the preferred method of testing for antibodies to P. falciparum antigens, as CBA can test for antibodies to multiple recombinant antigens from a single plasma sample and produces a greater range of values in positive samples and lower background readings for blank samples than ELISA.

  1. An approach for Ewing test selection to support the clinical assessment of cardiac autonomic neuropathy.

    PubMed

    Stranieri, Andrew; Abawajy, Jemal; Kelarev, Andrei; Huda, Shamsul; Chowdhury, Morshed; Jelinek, Herbert F

    2013-07-01

    This article addresses the problem of determining optimal sequences of tests for the clinical assessment of cardiac autonomic neuropathy (CAN). We investigate the accuracy of using only one of the recommended Ewing tests to classify CAN and the additional accuracy obtained by adding the remaining tests of the Ewing battery. This is important, as not all five Ewing tests can always be applied in each situation in practice. We used a new and unique database of the diabetes screening research initiative project, which is more than ten times larger than the data set used by Ewing in his original investigation of CAN. We utilized decision trees and the optimal decision path finder (ODPF) procedure for identifying optimal sequences of tests. We present experimental results on the accuracy of using each one of the recommended Ewing tests to classify CAN and the additional accuracy that can be achieved by adding the remaining tests of the Ewing battery. We found the best sequences of tests for a cost-function equal to the number of tests. The accuracies achieved by the initial segments of the optimal sequences for 2, 3 and 4 categories of CAN are 80.80, 91.33, 93.97 and 94.14; 79.86, 89.29, 91.16 and 91.76; and 78.90, 86.21, 88.15 and 88.93, respectively. They show significant improvement compared to the sequence considered previously in the literature and the mathematical expectations of the accuracies of a random sequence of tests. The complete outcomes obtained for all subsets of the Ewing features are required for determining optimal sequences of tests for any cost-function with the use of the ODPF procedure. We have also found the two most significant additional features that can increase the accuracy when some of the Ewing attributes cannot be obtained. The outcomes obtained can be used to determine the optimal sequences of tests for each individual cost-function by following the ODPF procedure. The results show that the best single Ewing test for diagnosing CAN is the deep breathing heart rate variation test. Optimal sequences found for the cost-function equal to the number of tests guarantee that the best accuracy is achieved after any number of tests and provide an improvement in comparison with the previous ordering of tests or a random sequence. Copyright © 2013 Elsevier B.V. All rights reserved.

  2. Optimization of structures on the basis of fracture mechanics and reliability criteria

    NASA Technical Reports Server (NTRS)

    Heer, E.; Yang, J. N.

    1973-01-01

    A systematic summary of the factors involved in optimizing a given structural configuration forms part of a report resulting from a study of the analysis of the objective function. The predicted reliability of the finished structure's performance depends sharply on the results of coupon tests. The optimization analysis developed by the study also involves the expected cost of proof testing.

  3. Solving mixed integer nonlinear programming problems using spiral dynamics optimization algorithm

    NASA Astrophysics Data System (ADS)

    Kania, Adhe; Sidarto, Kuntjoro Adji

    2016-02-01

    Many engineering and practical problems can be modeled by mixed integer nonlinear programming. This paper proposes to solve such problems with a modified version of the spiral dynamics inspired optimization method of Tamura and Yasuda. Four test cases have been examined, including problems in engineering and sport. The method succeeds in obtaining the optimal result in all test cases.

  4. Academic Optimism and Collective Responsibility: An Organizational Model of the Dynamics of Student Achievement

    ERIC Educational Resources Information Center

    Wu, Jason H.

    2013-01-01

    This study was designed to examine the construct of academic optimism and its relationship with collective responsibility in a sample of Taiwan elementary schools. The construct of academic optimism was tested using confirmatory factor analysis, and the whole structural model was tested with a structural equation modeling analysis. The data were…

  5. A Modified Mean Gray Wolf Optimization Approach for Benchmark and Biomedical Problems.

    PubMed

    Singh, Narinder; Singh, S B

    2017-01-01

    A modified variant of the gray wolf optimization algorithm, namely the mean gray wolf optimization algorithm, has been developed by modifying the position-update (encircling behavior) equations of the gray wolf optimization algorithm. The proposed variant has been tested on 23 standard well-known benchmark test functions (unimodal, multimodal, and fixed-dimension multimodal), and its performance has been compared with particle swarm optimization and gray wolf optimization. The proposed algorithm has also been applied to the classification of 5 data sets to check the feasibility of the modified variant. The results obtained are compared with many other meta-heuristic approaches, i.e., gray wolf optimization, particle swarm optimization, population-based incremental learning, ant colony optimization, etc. The results show that the modified variant is able to find the best solutions in terms of a high level of accuracy in classification and improved local optima avoidance.
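
    The position-update (encircling) step that the paper modifies is the core of gray wolf optimization: each wolf moves toward positions dictated by the three best wolves (alpha, beta, delta). The sketch below implements the classic averaged update on a stand-in sphere objective; the "mean" variant in the paper alters exactly this averaging step.

```python
import random

def gwo_minimize(f, dim, bounds, n_wolves=12, iters=100, seed=3):
    """Gray wolf optimizer sketch: the pack follows its three best wolves."""
    rng = random.Random(seed)
    lo, hi = bounds
    wolves = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_wolves)]
    best, best_val = None, float("inf")
    for t in range(iters):
        wolves.sort(key=f)
        if f(wolves[0]) < best_val:
            best, best_val = wolves[0][:], f(wolves[0])
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        a = 2.0 * (1 - t / iters)            # exploration coefficient decays to 0
        for i in range(n_wolves):
            new = []
            for d in range(dim):
                parts = []
                for leader in (alpha, beta, delta):
                    A = a * (2 * rng.random() - 1)
                    C = 2 * rng.random()
                    dist = abs(C * leader[d] - wolves[i][d])
                    parts.append(leader[d] - A * dist)
                # Classic GWO averages the three leader-guided moves; the
                # paper's "mean" variant modifies this encircling update.
                new.append(min(max(sum(parts) / 3, lo), hi))
            wolves[i] = new
    wolves.sort(key=f)
    if f(wolves[0]) < best_val:
        best, best_val = wolves[0][:], f(wolves[0])
    return best, best_val

# Stand-in objective: a 3-variable sphere (not one of the paper's benchmarks).
best, best_val = gwo_minimize(lambda x: sum(v * v for v in x), dim=3,
                              bounds=(-10.0, 10.0))
```

    Early iterations (large `a`) favor exploration through large, noisy moves; as `a` decays, the pack contracts around the leaders, trading exploration for exploitation.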

  6. Urine sampling and collection system optimization and testing

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Geating, J. A.; Koesterer, M. G.

    1975-01-01

    A Urine Sampling and Collection System (USCS) engineering model was developed to provide for the automatic collection, volume sensing and sampling of urine from each micturition. The purpose of the engineering model was to demonstrate verification of the system concept. The objective of the optimization and testing program was to update the engineering model, to provide additional performance features and to conduct system testing to determine operational problems. Optimization tasks were defined as modifications to minimize system fluid residual and addition of thermoelectric cooling.

  7. Improved Ant Algorithms for Software Testing Cases Generation

    PubMed Central

    Yang, Shunkun; Xu, Jiaqi

    2014-01-01

    Ant colony optimization (ACO) for software test case generation is a very popular domain in software testing engineering. However, traditional ACO has flaws: early-search pheromone is relatively scarce, search efficiency is low, the search model is too simple, and the positive feedback mechanism easily produces stagnation and precocity. This paper introduces improved ACO variants for software test case generation: an improved local pheromone update strategy, an improved pheromone volatilization coefficient (IPVACO), and an improved global-path pheromone update strategy (IGPACO). Finally, we put forward a comprehensively improved ant colony optimization (ACIACO) based on all three methods. The proposed technique is compared with a random algorithm (RND) and a genetic algorithm (GA) in terms of both efficiency and coverage. The results indicate that the improved method can effectively improve search efficiency, restrain precocity, promote case coverage, and reduce the number of iterations. PMID:24883391
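    As a hedged sketch of the two pheromone rules such variants adjust, the toy below runs an ACS-style colony on a small shortest-path task: a local update evaporates a little pheromone as each ant moves (keeping the search diverse), and a global update evaporates everywhere and then deposits only on the best path found. These are generic textbook forms, not the paper's exact equations; the graph, constants, and names are illustrative.

```python
import random

def aco_shortest_path(graph, start, goal, n_ants=10, iters=50, seed=0):
    rng = random.Random(seed)
    tau = {(n, m): 1.0 for n in graph for m in graph[n]}    # pheromone per edge
    best_path, best_cost = None, float("inf")
    for _ in range(iters):
        for _ in range(n_ants):
            node, path, cost = start, [], 0.0
            while node != goal:
                edges = [(node, m) for m in graph[node]]
                weights = [tau[e] / graph[e[0]][e[1]] for e in edges]  # pheromone * heuristic
                e = rng.choices(edges, weights)[0]
                tau[e] = 0.9 * tau[e] + 0.1 * 1.0           # local update: nudge toward tau0
                path.append(e); cost += graph[e[0]][e[1]]; node = e[1]
            if cost < best_cost:
                best_path, best_cost = path, cost
        for e in tau:                                       # global update: evaporate...
            tau[e] *= 0.5
        for e in best_path:                                 # ...then deposit on best path
            tau[e] += 1.0 / best_cost
    return best_path, best_cost

graph = {"A": {"B": 1.0, "C": 4.0}, "B": {"D": 1.0}, "C": {"D": 1.0}, "D": {}}
path, cost = aco_shortest_path(graph, "A", "D")
```

    For test case generation, the edge weights would come from a coverage-oriented fitness rather than path length, but the pheromone bookkeeping has the same shape.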

  8. Design optimization studies using COSMIC NASTRAN

    NASA Technical Reports Server (NTRS)

    Pitrof, Stephen M.; Bharatram, G.; Venkayya, Vipperla B.

    1993-01-01

    The purpose of this study is to create, test and document a procedure to integrate mathematical optimization algorithms with COSMIC NASTRAN. This procedure is very important to structural design engineers who wish to capitalize on optimization methods to ensure that their design is optimized for its intended application. The OPTNAST computer program was created to link NASTRAN and design optimization codes into one package. This implementation was tested using two truss structure models and optimizing their designs for minimum weight, subject to multiple loading conditions and displacement and stress constraints. However, the process is generalized so that an engineer could design other types of elements by adding to or modifying some parts of the code.

  9. Soft-Decision Decoding of Binary Linear Block Codes Based on an Iterative Search Algorithm

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao; Moorthy, H. T.

    1997-01-01

    This correspondence presents a suboptimum soft-decision decoding scheme for binary linear block codes based on an iterative search algorithm. The scheme uses an algebraic decoder to iteratively generate a sequence of candidate codewords one at a time using a set of test error patterns that are constructed based on the reliability information of the received symbols. When a candidate codeword is generated, it is tested based on an optimality condition. If it satisfies the optimality condition, then it is the most likely (ML) codeword and the decoding stops. If it fails the optimality test, a search for the ML codeword is conducted in a region which contains the ML codeword. The search region is determined by the current candidate codeword and the reliability of the received symbols. The search is conducted through a purged trellis diagram for the given code using the Viterbi algorithm. If the search fails to find the ML codeword, a new candidate is generated using a new test error pattern, and the optimality test and search are renewed. The process of testing and search continues until either the ML codeword is found or all the test error patterns are exhausted and the decoding process is terminated. Numerical results show that the proposed decoding scheme achieves either practically optimal performance or a performance only a fraction of a decibel away from optimal maximum-likelihood decoding, with a significant reduction in decoding complexity compared with Viterbi decoding based on the full trellis diagram of the codes.

  10. Predicting Short-Term Remembering as Boundedly Optimal Strategy Choice.

    PubMed

    Howes, Andrew; Duggan, Geoffrey B; Kalidindi, Kiran; Tseng, Yuan-Chi; Lewis, Richard L

    2016-07-01

    It is known that, on average, people adapt their choice of memory strategy to the subjective utility of interaction. What is not known is whether an individual's choices are boundedly optimal. Two experiments are reported that test the hypothesis that an individual's decisions about the distribution of remembering between internal and external resources are boundedly optimal where optimality is defined relative to experience, cognitive constraints, and reward. The theory makes predictions that are tested against data, not fitted to it. The experiments use a no-choice/choice utility learning paradigm where the no-choice phase is used to elicit a profile of each participant's performance across the strategy space and the choice phase is used to test predicted choices within this space. They show that the majority of individuals select strategies that are boundedly optimal. Further, individual differences in what people choose to do are successfully predicted by the analysis. Two issues are discussed: (a) the performance of the minority of participants who did not find boundedly optimal adaptations, and (b) the possibility that individuals anticipate what, with practice, will become a bounded optimal strategy, rather than what is boundedly optimal during training. Copyright © 2015 Cognitive Science Society, Inc.

  11. Internationally comparable screening tests for listening in noise in several European languages: the German digit triplet test as an optimization prototype.

    PubMed

    Zokoll, Melanie A; Wagener, Kirsten C; Brand, Thomas; Buschermöhle, Michael; Kollmeier, Birger

    2012-09-01

    A review is given of internationally comparable speech-in-noise tests for hearing screening purposes that were part of the European HearCom project. This report describes the development, optimization, and evaluation of such tests for headphone and telephone presentation, using the example of the German digit triplet test. In order to achieve the highest possible comparability, language- and speaker-dependent factors in speech intelligibility should be compensated for. The tests comprise spoken numbers in background noise and estimate the speech reception threshold (SRT), i.e. the signal-to-noise ratio (SNR) yielding 50% speech intelligibility. The respective reference speech intelligibility functions for headphone and telephone presentation of the German version for 15 and 10 normal-hearing listeners are described by a SRT of -9.3 ± 0.2 and -6.5 ± 0.4 dB SNR, and slopes of 19.6 and 17.9%/dB, respectively. Reference speech intelligibility functions of all digit triplet tests optimized within the HearCom project allow for investigation of the comparability due to language specificities. The optimization criteria established here should be used for similar screening tests in other languages.
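    The SRT/slope pairs reported above can be read as parameters of a psychometric function. As a sketch (assuming a logistic form, a common choice but not one the abstract states), with the German headphone values SRT = -9.3 dB SNR and slope 19.6%/dB:

```python
import math

def intelligibility(snr_db, srt_db=-9.3, slope=0.196):
    """Logistic speech-intelligibility function: 50% at the SRT, given slope there."""
    k = 4.0 * slope                  # logistic steepness whose midpoint slope is `slope`
    return 1.0 / (1.0 + math.exp(-k * (snr_db - srt_db)))
```

    By construction `intelligibility(-9.3)` returns 0.5, and the derivative at the SRT equals the reported 19.6%/dB slope; the telephone version would use -6.5 dB and 17.9%/dB.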

  12. Online optimal obstacle avoidance for rotary-wing autonomous unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Kang, Keeryun

    This thesis presents an integrated framework for online obstacle avoidance of rotary-wing unmanned aerial vehicles (UAVs), which can provide UAVs an obstacle field navigation capability in a partially or completely unknown obstacle-rich environment. The framework is composed of a LIDAR interface, a local obstacle grid generation, a receding horizon (RH) trajectory optimizer, a global shortest path search algorithm, and a climb rate limit detection logic. The key feature of the framework is the use of an optimization-based trajectory generation in which the obstacle avoidance problem is formulated as a nonlinear trajectory optimization problem with state and input constraints over the finite range of the sensor. This local trajectory optimization is combined with a global path search algorithm which provides a useful initial guess to the nonlinear optimization solver. Optimization is the natural process of finding the best trajectory that is dynamically feasible, safe within the vehicle's flight envelope, and collision-free at the same time. The optimal trajectory is continuously updated in real time by the numerical optimization solver, Nonlinear Trajectory Generation (NTG), which is a direct solver based on the spline approximation of trajectory for dynamically flat systems. In fact, the overall approach of this thesis to finding the optimal trajectory is similar to the model predictive control (MPC) or the receding horizon control (RHC), except that this thesis followed a two-layer design; thus, the optimal solution works as a guidance command to be followed by the controller of the vehicle. The framework is implemented in a real-time simulation environment, the Georgia Tech UAV Simulation Tool (GUST), and integrated in the onboard software of the rotary-wing UAV test-bed at Georgia Tech. Initially, the 2D vertical avoidance capability of real obstacles was tested in flight. 
The flight test evaluations were extended to the benchmark tests for 3D avoidance capability over the virtual obstacles, and finally it was demonstrated on real obstacles located at the McKenna MOUT site in Fort Benning, Georgia. Simulations and flight test evaluations demonstrate the feasibility of the developed framework for UAV applications involving low-altitude flight in an urban area.

  13. Error analysis and system optimization of non-null aspheric testing system

    NASA Astrophysics Data System (ADS)

    Luo, Yongjie; Yang, Yongying; Liu, Dong; Tian, Chao; Zhuo, Yongmo

    2010-10-01

    A non-null aspheric testing system, which employs a partial null lens (PNL) and a reverse iterative optimization reconstruction (ROR) technique, is proposed in this paper. Based on system modeling in ray-tracing software, the parameters of each optical element are optimized, which makes the system model more precise. The systematic error of the non-null aspheric testing system is analyzed and falls into two categories: error due to the surface parameters of the PNL in the system model, and the remainder, attributed to the non-null interferometer by an error-storage subtraction approach. Experimental results show that, after the systematic error is removed from the test result, the aspheric surface is precisely reconstructed by the ROR technique, and accounting for the systematic error greatly increases the test accuracy of the non-null aspheric testing system.

  14. Intelligent Network Flow Optimization (INFLO) prototype acceptance test summary.

    DOT National Transportation Integrated Search

    2015-05-01

    This report summarizes the results of System Acceptance Testing for the implementation of the Intelligent Network Flow Optimization (INFLO) Prototype bundle within the Dynamic Mobility Applications (DMA) portion of the Connected Vehicle Program. This...

  15. Test scheduling optimization for 3D network-on-chip based on cloud evolutionary algorithm of Pareto multi-objective

    NASA Astrophysics Data System (ADS)

    Xu, Chuanpei; Niu, Junhao; Ling, Jing; Wang, Suyan

    2018-03-01

    In this paper, we present a parallel test strategy for bandwidth division multiplexing under the test access mechanism bandwidth constraint. The Pareto solution set is combined with a cloud evolutionary algorithm to optimize the test time and power consumption of a three-dimensional network-on-chip (3D NoC). In the proposed method, all individuals in the population are sorted in non-dominated order and allocated to the corresponding level. Individuals with extreme and similar characteristics are then removed. To increase the diversity of the population and prevent the algorithm from becoming stuck around local optima, a competition strategy is designed for the individuals. Finally, we adopt an elite reservation strategy and update the individuals according to the cloud model. Experimental results show that the proposed algorithm converges to the optimal Pareto solution set rapidly and accurately. This not only obtains the shortest test time, but also optimizes the power consumption of the 3D NoC.
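    The non-dominated ordering step described above is standard Pareto ranking: an individual dominates another if it is no worse in every objective and strictly better in at least one, and successive fronts are peeled off into levels. A minimal sketch for minimization (the function names and sample objective pairs, here standing in for test time and power, are illustrative):

```python
def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_levels(points):
    """Peel off successive Pareto fronts; level 0 is the non-dominated set."""
    remaining, levels = list(points), []
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q != p)]
        levels.append(front)
        remaining = [p for p in remaining if p not in front]
    return levels

objs = [(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)]   # (test time, power)
levels = non_dominated_levels(objs)
```

    Here (3, 3) and (4, 4) are dominated by (2, 2), so they fall into the later levels that the algorithm then prunes and evolves.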

  16. The extension of the thermal-vacuum test optimization program to multiple flights

    NASA Technical Reports Server (NTRS)

    Williams, R. E.; Byrd, J.

    1981-01-01

    The thermal vacuum test optimization model developed to provide an approach to the optimization of a test program based on prediction of flight performance with a single flight option in mind is extended to consider reflight as in space shuttle missions. The concept of 'utility', developed under the name of 'availability', is used to follow performance through the various options encountered when the capabilities of reflight and retrievability of space shuttle are available. Also, a 'lost value' model is modified to produce a measure of the probability of a mission's success, achieving a desired utility using a minimal cost test strategy. The resulting matrix of probabilities and their associated costs provides a means for project management to evaluate various test and reflight strategies.

  17. A hybrid Jaya algorithm for reliability-redundancy allocation problems

    NASA Astrophysics Data System (ADS)

    Ghavidel, Sahand; Azizivahed, Ali; Li, Li

    2018-04-01

    This article proposes an efficient improved hybrid Jaya algorithm based on time-varying acceleration coefficients (TVACs) and the learning phase introduced in teaching-learning-based optimization (TLBO), named the LJaya-TVAC algorithm, for solving various types of nonlinear mixed-integer reliability-redundancy allocation problems (RRAPs) and standard real-parameter test functions. RRAPs include series, series-parallel, complex (bridge) and overspeed protection systems. The search power of the proposed LJaya-TVAC algorithm for finding the optimal solutions is first tested on the standard real-parameter unimodal and multi-modal functions with dimensions of 30-100, and then tested on various types of nonlinear mixed-integer RRAPs. The results are compared with the original Jaya algorithm and the best results reported in the recent literature. The optimal results obtained with the proposed LJaya-TVAC algorithm provide evidence for its better and acceptable optimization performance compared to the original Jaya algorithm and other reported optimal results.
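    The original Jaya update that the LJaya-TVAC variant builds on moves each candidate toward the population's current best solution and away from its worst, with no algorithm-specific tuning parameters. A minimal sketch on the sphere function (the TVACs and the TLBO learning phase the article adds are not shown; names and settings are illustrative):

```python
import random

def jaya_minimize(f, dim, pop=20, iters=300, lo=-10.0, hi=10.0, seed=0):
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    for _ in range(iters):
        best = min(X, key=f)
        worst = max(X, key=f)
        for i in range(pop):
            # Jaya rule: x + r1*(best - |x|) - r2*(worst - |x|), clamped to bounds
            cand = [max(lo, min(hi,
                        x + rng.random() * (best[d] - abs(x))
                          - rng.random() * (worst[d] - abs(x))))
                    for d, x in enumerate(X[i])]
            if f(cand) <= f(X[i]):          # keep the move only if it does not worsen
                X[i] = cand
    return min(X, key=f)

sphere = lambda x: sum(v * v for v in x)
best = jaya_minimize(sphere, dim=5)
```

    The greedy acceptance makes each individual's objective value non-increasing, which is why the variants in the article focus on improving the step itself rather than the acceptance rule.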

  18. Aggregation Pheromone System: A Real-parameter Optimization Algorithm using Aggregation Pheromones as the Base Metaphor

    NASA Astrophysics Data System (ADS)

    Tsutsui, Shigeyosi

    This paper proposes an aggregation pheromone system (APS) for solving real-parameter optimization problems using the collective behavior of individuals which communicate using aggregation pheromones. APS was tested on several test functions used in evolutionary computation. The results showed APS could solve real-parameter optimization problems fairly well. The sensitivity analysis of control parameters of APS is also studied.

  19. Optimal Stratification of Item Pools in a-Stratified Computerized Adaptive Testing.

    ERIC Educational Resources Information Center

    Chang, Hua-Hua; van der Linden, Wim J.

    2003-01-01

    Developed a method based on 0-1 linear programming to stratify an item pool optimally for use in alpha-stratified adaptive testing. Applied the method to a previous item pool from the computerized adaptive test of the Graduate Record Examinations. Results show the new method performs well in practical situations. (SLD)

  20. Preliminary supersonic flight test evaluation of performance seeking control

    NASA Technical Reports Server (NTRS)

    Orme, John S.; Gilyard, Glenn B.

    1993-01-01

    Digital flight and engine control, powerful onboard computers, and sophisticated controls techniques may improve aircraft performance by maximizing fuel efficiency, maximizing thrust, and extending engine life. An adaptive performance seeking control system for optimizing the quasi-steady-state performance of an F-15 aircraft was developed and flight tested. This system has three optimization modes: minimum fuel, maximum thrust, and minimum fan turbine inlet temperature. Tests of the minimum fuel and fan turbine inlet temperature modes were performed at a constant thrust. Supersonic single-engine flight tests of the three modes were conducted using varied afterburning power settings. At supersonic conditions, the performance seeking control law optimizes the integrated airframe, inlet, and engine. At subsonic conditions, only the engine is optimized. Supersonic flight tests showed improvements in thrust of 9 percent, fuel savings of 8 percent, and reductions of up to 85 deg R in turbine temperatures for all three modes. The supersonic performance seeking control structure is described and preliminary results of supersonic performance seeking control tests are given. These findings have implications for improving performance of civilian and military aircraft.

  1. Standardization and validation of a cytometric bead assay to assess antibodies to multiple Plasmodium falciparum recombinant antigens

    PubMed Central

    2012-01-01

    Background Multiplex cytometric bead assays (CBA) have a number of advantages over ELISA for antibody testing, but little information is available on the standardization and validation of antibody CBAs to multiple Plasmodium falciparum antigens. The present study set out to determine optimal parameters for multiplex testing of antibodies to P. falciparum antigens, and to compare results of multiplex CBA to ELISA. Methods Antibodies to ten recombinant P. falciparum antigens were measured by CBA and ELISA in samples from 30 individuals from a malaria endemic area of Kenya and compared to known positive and negative control plasma samples. Optimal antigen amounts, monoplex vs. multiplex testing, plasma dilution, optimal buffer, and the number of beads required were assessed for CBA testing, and results from CBA vs. ELISA testing were compared. Results Optimal antigen amounts for CBA antibody testing differed according to antigen. Results for monoplex CBA testing correlated strongly with multiplex testing for all antigens (r = 0.88-0.99, P values from <0.0001 to 0.004), and antibodies to variants of the same antigen were accurately distinguished within a multiplex reaction. Plasma dilutions of 1:100 or 1:200 were optimal for all antigens for CBA testing. Plasma diluted in a buffer containing 0.05% sodium azide, 0.5% polyvinylalcohol, and 0.8% polyvinylpyrrolidone had the lowest background activity. CBA median fluorescence intensity (MFI) values with 1,000 antigen-conjugated beads/well did not differ significantly from MFI with 5,000 beads/well. CBA and ELISA results correlated well for all antigens except apical membrane antigen-1 (AMA-1). CBA testing produced a greater range of values in samples from malaria endemic areas and less background reactivity for blank samples than ELISA. Conclusion With optimization, CBA may be the preferred method of testing for antibodies to P. falciparum antigens, as CBA can test for antibodies to multiple recombinant antigens from a single plasma sample and produces a greater range of values in positive samples and lower background readings for blank samples than ELISA. PMID:23259607

  2. Preliminary research on eddy current bobbin quantitative test for heat exchange tube in nuclear power plant

    NASA Astrophysics Data System (ADS)

    Qi, Pan; Shao, Wenbin; Liao, Shusheng

    2016-02-01

    To support quantitative defect detection for heat-transfer tubes in nuclear power plants (NPPs), two lines of work were carried out, with cracks as the main research object. (1) Optimization of calibration-tube manufacture. First, ASME, RSEM, and in-house crack calibration tubes were each used to quantify defect depths on other designed crack test tubes, and the calibration tube giving the more accurate quantitative results was identified. On that basis, the influence of factors such as crack orientation, length, and volume on depth quantification was weighted, which will help optimize the manufacturing technology of calibration tubes. (2) Optimization of crack-depth quantification. A neural network model with multiple calibration curves, adopted to quantify the depth of natural cracks generated in in-service tubes, shows a preliminary ability to improve quantitative accuracy.

  3. Experimental Optimization Methods for Multi-Element Airfoils

    NASA Technical Reports Server (NTRS)

    Landman, Drew; Britcher, Colin P.

    1996-01-01

    A modern three element airfoil model with a remotely activated flap was used to investigate optimum flap testing position using an automated optimization algorithm in wind tunnel tests. Detailed results for lift coefficient versus flap vertical and horizontal position are presented for two angles of attack: 8 and 14 degrees. An on-line first order optimizer is demonstrated which automatically seeks the optimum lift as a function of flap position. Future work with off-line optimization techniques is introduced and aerodynamic hysteresis effects due to flap movement with flow on are discussed.

  4. Optimization of Asphalt Mixture Design for the Louisiana ALF Test Sections

    DOT National Transportation Integrated Search

    2018-05-01

    This research presents an extensive study on the design and characterization of asphalt mixtures used in road pavements. Both mixture volumetrics and physical properties obtained from several laboratory tests were considered in optimizing the mixture...

  5. Optimizing countershading camouflage.

    PubMed

    Cuthill, Innes C; Sanghera, N Simon; Penacchio, Olivier; Lovell, Paul George; Ruxton, Graeme D; Harris, Julie M

    2016-11-15

    Countershading, the widespread tendency of animals to be darker on the side that receives strongest illumination, has classically been explained as an adaptation for camouflage: obliterating cues to 3D shape and enhancing background matching. However, there have only been two quantitative tests of whether the patterns observed in different species match the optimal shading to obliterate 3D cues, and no tests of whether optimal countershading actually improves concealment or survival. We use a mathematical model of the light field to predict the optimal countershading for concealment that is specific to the light environment and then test this prediction with correspondingly patterned model "caterpillars" exposed to avian predation in the field. We show that the optimal countershading is strongly illumination-dependent. A relatively sharp transition in surface patterning from dark to light is only optimal under direct solar illumination; if there is diffuse illumination from cloudy skies or shade, the pattern provides no advantage over homogeneous background-matching coloration. Conversely, a smoother gradation between dark and light is optimal under cloudy skies or shade. The demonstration of these illumination-dependent effects of different countershading patterns on predation risk strongly supports the comparative evidence showing that the type of countershading varies with light environment.

  6. High-Frequency Axial Fatigue Test Procedures for Spectrum Loading

    DTIC Science & Technology

    2016-07-20

    histories can be performed at frequencies much higher than standard servo-hydraulic test frames by using a test frame that is optimized to run at higher frequencies. AIR 4.3 has conducted a research program to develop a test capability for...Applied Research (BAR) program (219BAR-10-008) was initiated in 2010. The program investigated the influence of a generic rotorcraft main rotor blade root

  7. Optimizing point-of-care testing in clinical systems management.

    PubMed

    Kost, G J

    1998-01-01

    The goal of improving medical and economic outcomes calls for leadership based on fundamental principles. The manager of clinical systems works collaboratively within the acute care center to optimize point-of-care testing through systematic approaches such as integrative strategies, algorithms, and performance maps. These approaches are effective and efficacious for critically ill patients. Optimizing point-of-care testing throughout the entire health-care system is inherently more difficult. There is potential to achieve high-quality testing, integrated disease management, and equitable health-care delivery. Despite rapid change and economic uncertainty, a macro-strategic, information-integrated, feedback-systems, outcomes-oriented approach is timely, challenging, effective, and uplifting to the creative human spirit.

  8. Study on optimization method of test conditions for fatigue crack detection using lock-in vibrothermography

    NASA Astrophysics Data System (ADS)

    Min, Qing-xu; Zhu, Jun-zhen; Feng, Fu-zhou; Xu, Chao; Sun, Ji-wei

    2017-06-01

    In this paper, the lock-in vibrothermography (LVT) is utilized for defect detection. Specifically, for a metal plate with an artificial fatigue crack, the temperature rise of the defective area is used for analyzing the influence of different test conditions, i.e. engagement force, excitation intensity, and modulated frequency. The multivariate nonlinear and logistic regression models are employed to estimate the POD (probability of detection) and POA (probability of alarm) of fatigue crack, respectively. The resulting optimal selection of test conditions is presented. The study aims to provide an optimized selection method of the test conditions in the vibrothermography system with the enhanced detection ability.

  9. A Cascade Optimization Strategy for Solution of Difficult Multidisciplinary Design Problems

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.; Berke, Laszlo

    1996-01-01

    A research project to comparatively evaluate 10 nonlinear optimization algorithms was recently completed. A conclusion was that no single optimizer could successfully solve all 40 problems in the test bed, even though most optimizers successfully solved at least one-third of the problems. We realized that improved search directions and step lengths, available in the 10 optimizers compared, were not likely to alleviate the convergence difficulties. For the solution of those difficult problems we have devised an alternative approach called cascade optimization strategy. The cascade strategy uses several optimizers, one followed by another in a specified sequence, to solve a problem. A pseudorandom scheme perturbs design variables between the optimizers. The cascade strategy has been tested successfully in the design of supersonic and subsonic aircraft configurations and air-breathing engines for high-speed civil transport applications. These problems could not be successfully solved by an individual optimizer. The cascade optimization strategy, however, generated feasible optimum solutions for both aircraft and engine problems. This paper presents the cascade strategy and solutions to a number of these problems.
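    The cascade idea itself is simple to sketch: run one optimizer, pseudorandomly perturb the design variables, then hand the result to the next optimizer in the sequence. The two stages below (random search, then coordinate descent) are stand-ins chosen only to keep the sketch self-contained; they are not the optimizers from the study, and the Rosenbrock test function is likewise illustrative.

```python
import random

def random_search(f, x, rng, iters=200, step=1.0):
    """Stage 1 stand-in: greedy Gaussian random search."""
    best = list(x)
    for _ in range(iters):
        cand = [v + rng.gauss(0, step) for v in best]
        if f(cand) < f(best):
            best = cand
    return best

def coordinate_descent(f, x, iters=50, step=0.1):
    """Stage 2 stand-in: fixed-step coordinate descent."""
    best = list(x)
    for _ in range(iters):
        for d in range(len(best)):
            for delta in (step, -step):
                cand = list(best)
                cand[d] += delta
                if f(cand) < f(best):
                    best = cand
    return best

def cascade(f, x0, seed=0):
    rng = random.Random(seed)
    x = random_search(f, x0, rng)                             # optimizer 1
    x = [v * (1 + 0.05 * (rng.random() - 0.5)) for v in x]    # pseudorandom perturbation
    return coordinate_descent(f, x)                           # optimizer 2

rosen = lambda x: sum(100 * (x[i + 1] - x[i] ** 2) ** 2 + (1 - x[i]) ** 2
                      for i in range(len(x) - 1))
x_opt = cascade(rosen, [3.0, 3.0])
```

    The perturbation between stages is the key design choice: it shakes the design out of the point where the first optimizer stalled before the next optimizer takes over.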

  10. Optimal Number of Gaps in C-Test Passages

    ERIC Educational Resources Information Center

    Baghaei, Purya

    2011-01-01

    This study addresses the issue of the optimal number of gaps in C-Test passages. An English C-Test battery containing four passages each having 40 blanks was given to 104 undergraduate students of English. The data were entered into SPSS spreadsheet. Out of the complete data with 160 blanks seven additional datasets were constructed. In the first…

  11. An Automated DAKOTA and VULCAN-CFD Framework with Application to Supersonic Facility Nozzle Flowpath Optimization

    NASA Technical Reports Server (NTRS)

    Axdahl, Erik L.

    2015-01-01

    Removing human interaction from design processes by using automation may lead to gains in both productivity and design precision. This memorandum describes efforts to incorporate high fidelity numerical analysis tools into an automated framework and applying that framework to applications of practical interest. The purpose of this effort was to integrate VULCAN-CFD into an automated, DAKOTA-enabled framework with a proof-of-concept application being the optimization of supersonic test facility nozzles. It was shown that the optimization framework could be deployed on a high performance computing cluster with the flow of information handled effectively to guide the optimization process. Furthermore, the application of the framework to supersonic test facility nozzle flowpath design and optimization was demonstrated using multiple optimization algorithms.

  12. Optimization and Improvement of Test Processes on a Production Line

    NASA Astrophysics Data System (ADS)

    Sujová, Erika; Čierna, Helena

    2018-06-01

    The paper deals with increasing process efficiency on a production line for engine cylinder heads at a company operating in the automotive industry. The goal is to improve and optimize the test processes on the production line. It analyzes options for improving the capacity, availability, and productivity of the output-test processes by using modern technology available on the market. We focused on analyzing operation times before and after optimization of the test processes at specific production sections. By analyzing the measured results, we determined the differences in time before and after the process improvement. We determined the OEE (overall equipment effectiveness) coefficient and, by comparing outputs, confirmed a real improvement in the output-test process for cylinder heads.

  13. Development and optimization of an energy-regenerative suspension system under stochastic road excitation

    NASA Astrophysics Data System (ADS)

    Huang, Bo; Hsieh, Chen-Yu; Golnaraghi, Farid; Moallem, Mehrdad

    2015-11-01

    In this paper a vehicle suspension system with energy harvesting capability is developed, and an analytical methodology for the optimal design of the system is proposed. The optimization technique provides design guidelines for determining the stiffness and damping coefficients aimed at the optimal performance in terms of ride comfort and energy regeneration. The corresponding performance metrics are selected as root-mean-square (RMS) of sprung mass acceleration and expectation of generated power. The actual road roughness is considered as the stochastic excitation defined by ISO 8608:1995 standard road profiles and used in deriving the optimization method. An electronic circuit is proposed to provide variable damping in the real-time based on the optimization rule. A test-bed is utilized and the experiments under different driving conditions are conducted to verify the effectiveness of the proposed method. The test results suggest that the analytical approach is credible in determining the optimality of system performance.
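    The two performance metrics named above are direct to compute from sampled signals: the RMS of sprung-mass acceleration for ride comfort and the mean (expectation) of generated power for regeneration. The sinusoidal signals below are illustrative only, not measured data or the paper's road profiles.

```python
import math

def rms(xs):
    """Root-mean-square of a sampled signal."""
    return math.sqrt(sum(x * x for x in xs) / len(xs))

# Illustrative sampled signals at 100 Hz: a 1.5 Hz acceleration oscillation
# and a power signal fluctuating around a 5 W mean.
acc = [0.3 * math.sin(2 * math.pi * 1.5 * k / 100) for k in range(1000)]
power = [5.0 + 2.0 * math.sin(2 * math.pi * 3.0 * k / 100) for k in range(1000)]

ride_metric = rms(acc)                    # comfort metric (lower is better)
energy_metric = sum(power) / len(power)   # expectation of generated power
```

    The optimization in the paper trades these two quantities against each other when choosing stiffness and damping coefficients.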

  14. Reliability based design optimization: Formulations and methodologies

    NASA Astrophysics Data System (ADS)

    Agarwal, Harish

    Modern products ranging from simple components to complex systems should be designed to be optimal and reliable. The challenge of modern engineering is to ensure that manufacturing costs are reduced and design cycle times are minimized while achieving requirements for performance and reliability. If the market for the product is competitive, improved quality and reliability can generate very strong competitive advantages. Simulation based design plays an important role in designing almost any kind of automotive, aerospace, and consumer products under these competitive conditions. Single discipline simulations used for analysis are being coupled together to create complex coupled simulation tools. This investigation focuses on the development of efficient and robust methodologies for reliability based design optimization in a simulation based design environment. Original contributions of this research are the development of a novel efficient and robust unilevel methodology for reliability based design optimization, the development of an innovative decoupled reliability based design optimization methodology, the application of homotopy techniques in unilevel reliability based design optimization methodology, and the development of a new framework for reliability based design optimization under epistemic uncertainty. The unilevel methodology for reliability based design optimization is shown to be mathematically equivalent to the traditional nested formulation. Numerical test problems show that the unilevel methodology can reduce computational cost by at least 50% as compared to the nested approach. The decoupled reliability based design optimization methodology is an approximate technique to obtain consistent reliable designs at lesser computational expense. Test problems show that the methodology is computationally efficient compared to the nested approach. 
A framework for performing reliability based design optimization under epistemic uncertainty is also developed. A trust region managed sequential approximate optimization methodology is employed for this purpose. Results from numerical test studies indicate that the methodology can be used for performing design optimization under severe uncertainty.

  15. A thermal vacuum test optimization procedure

    NASA Technical Reports Server (NTRS)

    Kruger, R.; Norris, H. P.

    1979-01-01

An analytical model was developed that can be used to establish certain parameters of a thermal vacuum environmental test program based on an optimization of program costs. The model takes the form of a computer program that interacts with the user to obtain certain input parameters. The program provides the user with a list of pertinent information regarding an optimized test program and graphs of some of the parameters. The model is a first attempt in this area and includes numerous simplifications. It appears useful as a general guide and provides a way of extrapolating past performance to future missions.

  16. Comparison of multiobjective evolutionary algorithms: empirical results.

    PubMed

    Zitzler, E; Deb, K; Thiele, L

    2000-01-01

    In this paper, we provide a systematic comparison of various evolutionary approaches to multiobjective optimization using six carefully chosen test functions. Each test function involves a particular feature that is known to cause difficulty in the evolutionary optimization process, mainly in converging to the Pareto-optimal front (e.g., multimodality and deception). By investigating these different problem features separately, it is possible to predict the kind of problems to which a certain technique is or is not well suited. However, in contrast to what was suspected beforehand, the experimental results indicate a hierarchy of the algorithms under consideration. Furthermore, the emerging effects are evidence that the suggested test functions provide sufficient complexity to compare multiobjective optimizers. Finally, elitism is shown to be an important factor for improving evolutionary multiobjective search.
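The six functions from this study later became commonly known as ZDT1-ZDT6. As a concrete illustration, a minimal sketch of the first of them (the convex-front case) on the conventional domain [0, 1]^n:

```python
import math

def zdt1(x):
    """ZDT1 two-objective test function.

    f1 depends only on x[0]; g penalizes the remaining "distance"
    variables. The Pareto-optimal front is reached when x[1:] are
    all zero (g = 1), where f2 = 1 - sqrt(f1).
    """
    f1 = x[0]
    g = 1.0 + 9.0 * sum(x[1:]) / (len(x) - 1)
    f2 = g * (1.0 - math.sqrt(f1 / g))
    return f1, f2
```

Any positive value in the tail variables inflates g and pushes the point away from the front, which is what makes convergence of a multiobjective optimizer toward the front directly measurable on this function.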

  17. Thermal-Aware Test Access Mechanism and Wrapper Design Optimization for System-on-Chips

    NASA Astrophysics Data System (ADS)

    Yu, Thomas Edison; Yoneda, Tomokazu; Chakrabarty, Krishnendu; Fujiwara, Hideo

Rapid advances in semiconductor manufacturing technology have led to higher chip power densities, which place greater emphasis on packaging and temperature control during testing. For system-on-chips, peak power-based scheduling algorithms have been used to optimize tests under specified power constraints. However, imposing power constraints does not always solve the problem of overheating, due to the non-uniform distribution of power across the chip. This paper presents a TAM/wrapper co-design methodology for system-on-chips that ensures thermal safety while still optimizing the test schedule. The method combines a simplified thermal-cost model with a traditional bin-packing algorithm to minimize test time while satisfying temperature constraints. Furthermore, for temperature checking, thermal simulation is done using cycle-accurate power profiles for more realistic results. Experiments show that even a minimal sacrifice in test time can yield a considerable decrease in test temperature, as well as the possibility of further lowering temperatures beyond those achieved using traditional power-based test scheduling.

  18. 20 Meter Solar Sail Analysis and Correlation

    NASA Technical Reports Server (NTRS)

    Taleghani, B.; Lively, P.; Banik, J.; Murphy, D.; Trautt, T.

    2005-01-01

This presentation discusses studies conducted to determine the element type and size that best represent a 20-meter solar sail under ground-test load conditions, the test/analysis correlation using a static shape optimization method for the Q4 sail, and system dynamics. TRIA3 elements represent wrinkle patterns better than QUAD3 elements, baseline ten-inch elements are small enough to accurately represent sail shape, and the baseline TRIA3 mesh requires a reasonable computation time of 8 min. 21 sec. In the test/analysis correlation using the static shape optimization method for the Q4 sail, ten parameters were chosen and varied during optimization, and 300 sail models were created with random parameters. A response surface for each target was created based on the varied parameters, and the parameters were then optimized against the response surfaces. Deflection shape comparisons for 0 and 22.5 degrees yielded errors of 4.3% and 2.1%, respectively. For the system dynamics study, testing was done on the booms without the sails attached. The nominal boom properties produced a good correlation to test data; the frequencies were within 10%. Boom-dominated analysis frequencies and modes compared well with the test results.

  19. Vector-model-supported approach in prostate plan optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Eva Sau Fan; Department of Health Technology and Informatics, The Hong Kong Polytechnic University; Wu, Vincent Wing Cheung

Lengthy time consumed in traditional manual plan optimization can limit the use of step-and-shoot intensity-modulated radiotherapy/volumetric-modulated radiotherapy (S&S IMRT/VMAT). A vector model for retrieving similar radiotherapy cases was developed with respect to the structural and physiologic features extracted from the Digital Imaging and Communications in Medicine (DICOM) files. Planning parameters were retrieved from the selected similar reference case and applied to the test case to bypass the gradual adjustment of planning parameters. Therefore, the planning time spent on the traditional trial-and-error manual optimization approach in the beginning of optimization could be reduced. Each S&S IMRT/VMAT prostate reference database comprised 100 previously treated cases. Prostate cases were replanned with both traditional optimization and vector-model-supported optimization based on the oncologists' clinical dose prescriptions. A total of 360 plans, which consisted of 30 cases of S&S IMRT, 30 cases of 1-arc VMAT, and 30 cases of 2-arc VMAT plans including first optimization and final optimization with/without vector-model-supported optimization, were compared using the 2-sided t-test and paired Wilcoxon signed rank test, with a significance level of 0.05 and a false discovery rate of less than 0.05. For S&S IMRT, 1-arc VMAT, and 2-arc VMAT prostate plans, there was a significant reduction in the planning time and iterations with vector-model-supported optimization, by almost 50%. When the first optimization plans were compared, 2-arc VMAT prostate plans had better plan quality than 1-arc VMAT plans. The volume receiving 35 Gy in the femoral head for 2-arc VMAT plans was reduced with the vector-model-supported optimization compared with the traditional manual optimization approach. Otherwise, the quality of plans from both approaches was comparable. 
Vector-model-supported optimization was shown to offer a much-shortened planning time and iteration number without compromising plan quality.

  20. Optimization of OT-MACH Filter Generation for Target Recognition

    NASA Technical Reports Server (NTRS)

    Johnson, Oliver C.; Edens, Weston; Lu, Thomas T.; Chao, Tien-Hsin

    2009-01-01

An automatic Optimum Trade-off Maximum Average Correlation Height (OT-MACH) filter generator for use in a gray-scale optical correlator (GOC) has been developed for improved target detection at JPL. While the OT-MACH filter has been shown to be an optimal filter for target detection, actually solving for the optimum is too computationally intensive for multiple targets. Instead, an adaptive-step gradient descent method was tested to iteratively optimize the three OT-MACH parameters: alpha, beta, and gamma. The feedback for the gradient descent method was a composite of two performance measures, correlation peak height and peak-to-sidelobe ratio. The automated method generated and tested multiple filters in order to approach the optimal filter more quickly and reliably than the current manual method. Initial usage and testing have shown preliminary success at finding an approximation of the optimal filter in terms of alpha, beta, and gamma values. This corresponded to a substantial improvement in detection performance, where the true positive rate increased for the same average number of false positives per image.
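The adaptive-step scheme described in this abstract can be sketched as follows. This is an illustrative sketch only, not JPL's implementation: the composite score below is a smooth surrogate with a known optimum standing in for the real filter-performance feedback (peak height and peak-to-sidelobe ratio), and the step-adaptation rule (grow on improvement, shrink on failure) is one common choice.

```python
import numpy as np

def composite_score(params):
    """Hypothetical stand-in for the filter-performance feedback.

    In the paper's setup this would combine correlation peak height
    and peak-to-sidelobe ratio of a filter built from the given
    (alpha, beta, gamma); here a smooth surrogate with a known
    optimum keeps the sketch self-contained and runnable.
    """
    target = np.array([0.3, 0.5, 0.2])
    return -np.sum((np.asarray(params) - target) ** 2)

def adaptive_gradient_ascent(score_fn, x0, step=0.1, eps=1e-3,
                             grow=1.2, shrink=0.5, iters=200):
    """Maximize score_fn with finite-difference gradients and an adaptive step."""
    x = np.asarray(x0, dtype=float)
    best = score_fn(x)
    for _ in range(iters):
        # Central-difference estimate of the gradient of the score.
        g = np.array([(score_fn(x + eps * e) - score_fn(x - eps * e)) / (2 * eps)
                      for e in np.eye(len(x))])
        candidate = x + step * g
        s = score_fn(candidate)
        if s > best:                      # improvement: accept, grow the step
            x, best, step = candidate, s, step * grow
        else:                             # no improvement: reject, shrink the step
            step *= shrink
        if step < 1e-8:
            break
    return x, best
```

Starting from a rough manual guess for (alpha, beta, gamma), the loop walks uphill on the composite score, lengthening its stride while progress continues and backing off when a step overshoots.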

  1. Shape optimization of shear fracture specimen considering plastic anisotropy

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Yoon, J. W.; Lee, S.; Lou, Y.

    2017-10-01

It is important to fabricate fracture specimens with minimal variation of triaxiality in order to characterize failure behaviors experimentally. Fracture in ductile materials is usually calibrated by uniaxial tensile, shear, and plane strain tests. However, it is often observed that the triaxiality of a shear specimen changes severely during a shear fracture test; this nonlinearity of triaxiality is most critical for the shear test. In this study, a simple in-plane shear specimen is optimized by minimizing the variation of stress triaxiality in the shear zone. In the optimization, the Hill48 and Yld2000-2d criteria are employed to model the anisotropic plastic deformation of an aluminum alloy, 6k21. The evolution of the stress triaxiality of the optimized shear specimen is compared with that of the initial design. The comparison reveals that the stress triaxiality changes much less for the optimized shear specimen than for the original design.

  2. A Standard Platform for Testing and Comparison of MDAO Architectures

    NASA Technical Reports Server (NTRS)

    Gray, Justin S.; Moore, Kenneth T.; Hearn, Tristan A.; Naylor, Bret A.

    2012-01-01

The Multidisciplinary Design Analysis and Optimization (MDAO) community has developed a multitude of algorithms and techniques, called architectures, for performing optimizations on complex engineering systems which involve coupling between multiple discipline analyses. These architectures seek to efficiently handle optimizations with computationally expensive analyses including multiple disciplines. We propose a new testing procedure that can provide a quantitative and qualitative means of comparison among architectures. The proposed test procedure is implemented within the open source framework, OpenMDAO, and comparative results are presented for five well-known architectures: MDF, IDF, CO, BLISS, and BLISS-2000. We also demonstrate how using open source software development methods can allow the MDAO community to submit new problems and architectures to keep the test suite relevant.

  3. Experimental design methodologies in the optimization of chiral CE or CEC separations: an overview.

    PubMed

    Dejaegher, Bieke; Mangelings, Debby; Vander Heyden, Yvan

    2013-01-01

    In this chapter, an overview of experimental designs to develop chiral capillary electrophoresis (CE) and capillary electrochromatographic (CEC) methods is presented. Method development is generally divided into technique selection, method optimization, and method validation. In the method optimization part, often two phases can be distinguished, i.e., a screening and an optimization phase. In method validation, the method is evaluated on its fit for purpose. A validation item, also applying experimental designs, is robustness testing. In the screening phase and in robustness testing, screening designs are applied. During the optimization phase, response surface designs are used. The different design types and their application steps are discussed in this chapter and illustrated by examples of chiral CE and CEC methods.

  4. Optimization of high-throughput nanomaterial developmental toxicity testing in zebrafish embryos

    EPA Science Inventory

    Nanomaterial (NM) developmental toxicities are largely unknown. With an extensive variety of NMs available, high-throughput screening methods may be of value for initial characterization of potential hazard. We optimized a zebrafish embryo test as an in vivo high-throughput assay...

  5. Optimal configuration of power grid sources based on optimal particle swarm algorithm

    NASA Astrophysics Data System (ADS)

    Wen, Yuanhua

    2018-04-01

To optimize the configuration of power grid sources, an improved particle swarm optimization algorithm is proposed. First, the concepts of multi-objective optimization and the Pareto solution set are introduced. Then, the performance of the classical genetic algorithm, the classical particle swarm optimization algorithm, and the improved particle swarm optimization algorithm is analyzed, and the three algorithms are simulated. Comparison of the test results of each algorithm demonstrates the improved algorithm's superiority in convergence and optimization performance, which lays the foundation for the subsequent micro-grid power optimization configuration solution.

  6. Comparison of genetic algorithm methods for fuel management optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeChaine, M.D.; Feltus, M.A.

    1995-12-31

The CIGARO system was developed for genetic algorithm fuel management optimization. Tests were performed to find the best fuel-location swap mutation operator probability and to compare the genetic algorithm to a truly random search method. Tests showed the fuel swap probability should be between 0% and 10%; a probability of 50% definitely hampered the optimization. The genetic algorithm performed significantly better than the random search method, which did not even satisfy the peak normalized power constraint.
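The operator whose probability the tests above tuned can be sketched as follows. This is a generic illustration of a fuel-location swap mutation, not CIGARO's actual code; the loading-pattern encoding (a flat list of assembly IDs) is an assumption made for the sketch.

```python
import random

def swap_mutation(loading, p):
    """Fuel-location swap mutation (illustrative sketch).

    `loading` is a core loading pattern encoded as a list of fuel
    assembly IDs. With probability `p`, two positions are chosen at
    random and their assemblies exchanged; otherwise the child is an
    unchanged copy. The parent is never modified in place.
    """
    child = loading[:]
    if random.random() < p:
        i, j = random.sample(range(len(child)), 2)
        child[i], child[j] = child[j], child[i]
    return child
```

Because a swap only relocates assemblies rather than introducing new ones, every mutated child remains a valid loading pattern, which is why this operator suits fuel management better than a value-changing mutation.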

  7. Sandia Internship Fall 2014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhn, Joshua

    2015-01-01

    While working at Sandia National Laboratories as a graduate intern from September 2014 to January 2015, most of my time was spent on two projects. The first project involved designing a test fixture for circuit boards used in a recording device. The test fixture was needed to decrease test set up time. The second project was to use optimization techniques to determine the optimal G-Switch for given acceleration profiles.

  8. MHK Hydrofoils Design, Wind Tunnel Optimization and CFD Analysis Report for the Aquantis 2.5MW Ocean Current Generation Device

    DOE Data Explorer

Shiu, Henry; Swales, Henry; Van Dam, Case

    2015-06-03

    Dataset contains MHK Hydrofoils Design and Optimization and CFD Analysis Report for the Aquantis 2.5 MW Ocean Current Generation Device, as well as MHK Hydrofoils Wind Tunnel Test Plan and Checkout Test Report.

  9. Priority design parameters of industrialized optical fiber sensors in civil engineering

    NASA Astrophysics Data System (ADS)

    Wang, Huaping; Jiang, Lizhong; Xiang, Ping

    2018-03-01

Considering the mechanical effects and the different paths for transferring deformation, optical fiber sensors commonly used in civil engineering have been systematically classified. Based on strain transfer theory, the relationship between the strain transfer coefficient and the allowable testing error is established. The proposed relationship is regarded as the optimal control equation for obtaining the optimal value of sensors that satisfy the requirement of measurement precision. Furthermore, specific optimization design methods and priority design parameters of the classified sensors are presented. This research indicates that (1) the strain transfer theory-based optimization design method is well suited to sensors that depend on interfacial shear stress to transfer deformation; (2) the priority design parameters are bonded (sensing) length, interfacial bond strength, elastic modulus and radius of the protective layer, and thickness of the adhesive layer; (3) the optimization design of sensors with anchor pieces at the two ends is independent of strain transfer theory, as the strain transfer coefficient can be conveniently calibrated by test, and this kind of sensor has no obvious priority design parameters. An improved calibration test is put forward to enhance the accuracy of the calibration coefficient of end-expanding sensors. By considering the practical state of sensors and the testing accuracy, comprehensive and systematic analyses of optical fiber sensors are provided from the perspective of mechanical actions, which could scientifically guide the application design and calibration testing of industrialized optical fiber sensors.

  10. OPTIMIZATION OF MUD HAMMER DRILLING PERFORMANCE - A PROGRAM TO BENCHMARK THE VIABILITY OF ADVANCED MUD HAMMER DRILLING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnis Judzis

    2003-07-01

This document details the progress to date on the ''OPTIMIZATION OF MUD HAMMER DRILLING PERFORMANCE--A PROGRAM TO BENCHMARK THE VIABILITY OF ADVANCED MUD HAMMER DRILLING'' contract for the quarter starting April 2003 through June 2003. The DOE and TerraTek continue to wait for Novatek on the optimization portion of the testing program (they are completely rebuilding their fluid hammer). Accomplishments included the following: (1) Hughes Christensen has recently expressed interest in the possibility of a program to examine cutter impact testing, which would be useful for a better understanding of the physics of rock impact. Their interest, however, is not necessarily in fluid hammers but in using the information for drilling bit development. (2) Novatek (cost-sharing supplier of tools) has informed the DOE project manager that their tool may not be ready for ''optimization'' testing in late summer 2003 (August-September timeframe) as originally anticipated. During 3Q, Novatek plans to meet with TerraTek to discuss progress with their tool for 4Q 2003 testing. (3) A task for an addendum to the hammer project related to cutter impact studies was written during 2Q 2003. (4) Smith International is internally upgrading their hammer for the optimization testing phase. One currently known area of improvement is their development program to significantly increase the hammer blow energy.

  11. Optimizing urine drug testing for monitoring medication compliance in pain management.

    PubMed

    Melanson, Stacy E F; Ptolemy, Adam S; Wasan, Ajay D

    2013-12-01

It can be challenging to successfully monitor medication compliance in pain management. Clinicians and laboratorians need to collaborate to optimize patient care and maximize operational efficiency. The test menu, assay cutoffs, and testing algorithms utilized in the urine drug testing panels should be periodically reviewed and tailored to the patient population to effectively assess compliance and avoid unnecessary testing and cost to the patient. Pain management and pathology collaborated on an important quality improvement initiative to optimize urine drug testing for monitoring medication compliance in pain management. We retrospectively reviewed 18 months of data from our pain management center. We gathered data on test volumes, positivity rates, and the frequency of false positive results. We also reviewed the clinical utility of our testing algorithms, assay cutoffs, and adulterant panel. In addition, the cost of each component was calculated. The positivity rates for ethanol and 3,4-methylenedioxymethamphetamine were <1%, so we eliminated this testing from our panel. We also lowered the screening cutoff for cocaine to meet the clinical needs of the pain management center. In addition, we changed our testing algorithm for 6-acetylmorphine, benzodiazepines, and methadone. For example, due to the high rate of false negative results using our immunoassay-based benzodiazepine screen, we removed the screening portion of the algorithm and now perform benzodiazepine confirmation up front in all specimens by liquid chromatography-tandem mass spectrometry. Conducting an interdisciplinary quality improvement project allowed us to optimize our testing panel for monitoring medication compliance in pain management and reduce cost. Wiley Periodicals, Inc.

  12. Swarm based mean-variance mapping optimization (MVMOS) for solving economic dispatch

    NASA Astrophysics Data System (ADS)

    Khoa, T. H.; Vasant, P. M.; Singh, M. S. Balbir; Dieu, V. N.

    2014-10-01

The economic dispatch (ED) problem is an essential optimization task in power generation systems. It is defined as the process of allocating the real power output of generation units to meet the required load demand so that their total operating cost is minimized while all physical and operational constraints are satisfied. This paper introduces a novel optimization technique named swarm-based mean-variance mapping optimization (MVMOS). The technique is an extension of the original single-particle mean-variance mapping optimization (MVMO), and its features make it a potentially attractive algorithm for solving optimization problems. The proposed method is implemented for three test power systems, comprising 3, 13, and 20 thermal generation units with quadratic cost functions, and the results obtained are compared with many other methods available in the literature. Test results indicate that the proposed method can be efficiently applied to solving economic dispatch.

  13. Guided particle swarm optimization method to solve general nonlinear optimization problems

    NASA Astrophysics Data System (ADS)

    Abdelhalim, Alyaa; Nakata, Kazuhide; El-Alem, Mahmoud; Eltawil, Amr

    2018-04-01

    The development of hybrid algorithms is becoming an important topic in the global optimization research area. This article proposes a new technique in hybridizing the particle swarm optimization (PSO) algorithm and the Nelder-Mead (NM) simplex search algorithm to solve general nonlinear unconstrained optimization problems. Unlike traditional hybrid methods, the proposed method hybridizes the NM algorithm inside the PSO to improve the velocities and positions of the particles iteratively. The new hybridization considers the PSO algorithm and NM algorithm as one heuristic, not in a sequential or hierarchical manner. The NM algorithm is applied to improve the initial random solution of the PSO algorithm and iteratively in every step to improve the overall performance of the method. The performance of the proposed method was tested over 20 optimization test functions with varying dimensions. Comprehensive comparisons with other methods in the literature indicate that the proposed solution method is promising and competitive.
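The hybridization idea, a population-based global search whose incumbent is refined by a local search inside every iteration rather than in a sequential pipeline, can be sketched as below. This is a simplified illustration, not the authors' algorithm: the full Nelder-Mead simplex step is replaced here by a plain coordinate-perturbation polish (accept if better) to keep the example short, and all parameter values are conventional defaults.

```python
import random

def sphere(x):
    """Classic unimodal test function: f(x) = sum(x_i^2), minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def hybrid_pso(f, dim, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5,
               bounds=(-5.0, 5.0), seed=0):
    """Toy PSO with a local polish of the global best each iteration."""
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]            # per-particle best positions
    pval = [f(x) for x in xs]
    gi = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[gi][:], pval[gi]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            v = f(xs[i])
            if v < pval[i]:
                pbest[i], pval[i] = xs[i][:], v
                if v < gval:
                    gbest, gval = xs[i][:], v
        # Local polish of the incumbent each iteration (stand-in for the
        # Nelder-Mead step the paper embeds inside the PSO loop).
        for d in range(dim):
            for delta in (0.1, -0.1):
                trial = gbest[:]
                trial[d] += delta
                tv = f(trial)
                if tv < gval:
                    gbest, gval = trial, tv
    return gbest, gval
```

The design point the abstract emphasizes is visible in the loop structure: the local search runs inside every PSO iteration and feeds its improvement back into `gbest`, which in turn steers the whole swarm, rather than being applied once after the global search finishes.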

  14. Selection of optimal sensors for predicting performance of polymer electrolyte membrane fuel cell

    NASA Astrophysics Data System (ADS)

    Mao, Lei; Jackson, Lisa

    2016-10-01

    In this paper, sensor selection algorithms are investigated based on a sensitivity analysis, and the capability of optimal sensors in predicting PEM fuel cell performance is also studied using test data. The fuel cell model is developed for generating the sensitivity matrix relating sensor measurements and fuel cell health parameters. From the sensitivity matrix, two sensor selection approaches, including the largest gap method, and exhaustive brute force searching technique, are applied to find the optimal sensors providing reliable predictions. Based on the results, a sensor selection approach considering both sensor sensitivity and noise resistance is proposed to find the optimal sensor set with minimum size. Furthermore, the performance of the optimal sensor set is studied to predict fuel cell performance using test data from a PEM fuel cell system. Results demonstrate that with optimal sensors, the performance of PEM fuel cell can be predicted with good quality.

  15. Dispositional optimism and sleep quality: a test of mediating pathways

    PubMed Central

    Cribbet, Matthew; Kent de Grey, Robert G.; Cronan, Sierra; Trettevik, Ryan; Smith, Timothy W.

    2016-01-01

    Dispositional optimism has been related to beneficial influences on physical health outcomes. However, its links to global sleep quality and the psychological mediators responsible for such associations are less studied. This study thus examined if trait optimism predicted global sleep quality, and if measures of subjective well-being were statistical mediators of such links. A community sample of 175 participants (93 men, 82 women) completed measures of trait optimism, depression, and life satisfaction. Global sleep quality was assessed using the Pittsburgh Sleep Quality Index. Results indicated that trait optimism was a strong predictor of better PSQI global sleep quality. Moreover, this association was mediated by depression and life satisfaction in both single and multiple mediator models. These results highlight the importance of optimism for the restorative process of sleep, as well as the utility of multiple mediator models in testing distinct psychological pathways. PMID:27592128

  16. Dispositional optimism and sleep quality: a test of mediating pathways.

    PubMed

    Uchino, Bert N; Cribbet, Matthew; de Grey, Robert G Kent; Cronan, Sierra; Trettevik, Ryan; Smith, Timothy W

    2017-04-01

    Dispositional optimism has been related to beneficial influences on physical health outcomes. However, its links to global sleep quality and the psychological mediators responsible for such associations are less studied. This study thus examined if trait optimism predicted global sleep quality, and if measures of subjective well-being were statistical mediators of such links. A community sample of 175 participants (93 men, 82 women) completed measures of trait optimism, depression, and life satisfaction. Global sleep quality was assessed using the Pittsburgh Sleep Quality Index. Results indicated that trait optimism was a strong predictor of better PSQI global sleep quality. Moreover, this association was mediated by depression and life satisfaction in both single and multiple mediator models. These results highlight the importance of optimism for the restorative process of sleep, as well as the utility of multiple mediator models in testing distinct psychological pathways.

  17. Intervention to Match Young Black Men and Transwomen Who Have Sex With Men or Transwomen to HIV Testing Options (All About Me): Protocol for a Randomized Controlled Trial.

    PubMed

    Koblin, Beryl; Hirshfield, Sabina; Chiasson, Mary Ann; Wilton, Leo; Usher, DaShawn; Nandi, Vijay; Hoover, Donald R; Frye, Victoria

    2017-12-19

HIV testing is a critical component of HIV prevention and care. Interventions to increase HIV testing rates among young black men who have sex with men (MSM) and black transgender women (transwomen) are needed. Personalized recommendations for an individual's optimal HIV testing approach may increase testing. This randomized trial tests the hypothesis that a personalized recommendation of an optimal HIV testing approach will increase HIV testing more than standard HIV testing information. A randomized trial among 236 young black men and transwomen who have sex with men or transwomen is being conducted. Participants complete a computerized baseline assessment and are randomized to electronically receive a personalized HIV testing recommendation or standard HIV testing information. Follow-up surveys are conducted online at 3 and 6 months after baseline. The All About Me randomized trial was launched in June 2016. Enrollment is completed, and 3-month retention is 92.4% (218/236), which has exceeded study target goals. The All About Me intervention is an innovative approach to increase HIV testing by providing a personalized recommendation of a person's optimal HIV testing approach. If successful, optimizing this intervention for mobile devices will widen access to large numbers of individuals. ClinicalTrials.gov NCT02834572; https://clinicaltrials.gov/ct2/show/NCT02834572 (Archived by WebCite at http://www.webcitation.org/6vLJWOS1B). ©Beryl Koblin, Sabina Hirshfield, Mary Ann Chiasson, Leo Wilton, DaShawn Usher, Vijay Nandi, Donald R Hoover, Victoria Frye. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 19.12.2017.

  18. A test of ecological optimality for semiarid vegetation. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Salvucci, Guido D.; Eagleson, Peter S.; Turner, Edmund K.

    1992-01-01

    Three ecological optimality hypotheses which have utility in parameter reduction and estimation in a climate-soil-vegetation water balance model are reviewed and tested. The first hypothesis involves short term optimization of vegetative canopy density through equilibrium soil moisture maximization. The second hypothesis involves vegetation type selection again through soil moisture maximization, and the third involves soil genesis through plant induced modification of soil hydraulic properties to values which result in a maximum rate of biomass productivity.

  19. A Rational Analysis of the Selection Task as Optimal Data Selection.

    ERIC Educational Resources Information Center

    Oaksford, Mike; Chater, Nick

    1994-01-01

    Experimental data on human reasoning in hypothesis-testing tasks is reassessed in light of a Bayesian model of optimal data selection in inductive hypothesis testing. The rational analysis provided by the model suggests that reasoning in such tasks may be rational rather than subject to systematic bias. (SLD)

  20. SU-F-T-187: Quantifying Normal Tissue Sparing with 4D Robust Optimization of Intensity Modulated Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Newpower, M; Ge, S; Mohan, R

Purpose: To report an approach to quantify the normal tissue sparing for 4D robustly-optimized versus PTV-optimized IMPT plans. Methods: We generated two sets of 90 DVHs from a patient’s 10-phase 4D CT set; one by conventional PTV-based optimization done in the Eclipse treatment planning system, and the other by an in-house robust optimization algorithm. The 90 DVHs were created for the following scenarios in each of the ten phases of the 4DCT: ± 5mm shift along x, y, z; ± 3.5% range uncertainty and a nominal scenario. A Matlab function written by Gay and Niemierko was modified to calculate EUD for each DVH for the following structures: esophagus, heart, ipsilateral lung and spinal cord. An F-test determined whether or not the variances of each structure’s DVHs were statistically different. Then a t-test determined if the average EUDs for each optimization algorithm were statistically significantly different. Results: T-test results showed each structure had a statistically significant difference in average EUD when comparing robust optimization versus PTV-based optimization. Under robust optimization all structures except the spinal cord received lower EUDs than PTV-based optimization. Using robust optimization the average EUDs decreased 1.45% for the esophagus, 1.54% for the heart and 5.45% for the ipsilateral lung. The average EUD to the spinal cord increased 24.86% but was still well below tolerance. Conclusion: This work has helped quantify a qualitative relationship noted earlier in our work: that robust optimization leads to plans with greater normal tissue sparing compared to PTV-based optimization. Except in the case of the spinal cord all structures received a lower EUD under robust optimization and these results are statistically significant. While the average EUD to the spinal cord increased to 25.06 Gy under robust optimization it is still well under the TD50 value of 66.5 Gy from Emami et al. Supported in part by the NCI U19 CA021239.
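The generalized EUD used above collapses a DVH into a single biologically weighted dose, EUD = (Σ v_i D_i^a)^(1/a). A minimal sketch of the standard Niemierko formulation (not the authors' modified Matlab function):

```python
import numpy as np

def eud(doses, volumes, a):
    """Generalized EUD: (sum of v_i * D_i^a) ** (1/a).

    `doses` are DVH bin doses (Gy), `volumes` the fractional volumes
    in each bin (summing to 1), and `a` the tissue-specific parameter:
    a = 1 gives the mean dose, while large a approaches the maximum
    dose, appropriate for serial organs such as the spinal cord.
    """
    v = np.asarray(volumes, dtype=float)
    d = np.asarray(doses, dtype=float)
    return float(np.sum(v * d ** a) ** (1.0 / a))
```

The choice of `a` is what lets a single EUD comparison capture clinically different endpoints: the same two-bin DVH scores near its mean dose for a parallel organ but near its hottest bin for a serial one.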

  1. Dispositional and Explanatory Style Optimism as Potential Moderators of the Relationship between Hopelessness and Suicidal Ideation

    ERIC Educational Resources Information Center

    Hirsch, Jameson K.; Conner, Kenneth R.

    2006-01-01

    To test the hypothesis that higher levels of optimism reduce the association between hopelessness and suicidal ideation, 284 college students completed self-report measures of optimism and Beck scales for hopelessness, suicidal ideation, and depression. A statistically significant interaction between hopelessness and one measure of optimism was…

  2. Geometrical optimization of sensors for eddy currents nondestructive testing and evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thollon, F.; Burais, N.

    1995-05-01

    Design of Non Destructive Testing (NDT) and Non Destructive Evaluation (NDE) sensors is possible by solving Maxwell's equations with FEM or BIM. But the large number of geometrical and electrical parameters of the sensor and tested material yields many results that do not necessarily give a well-adapted sensor. The authors have used a genetic algorithm for automatic optimization. After testing this algorithm against an analytical solution of Maxwell's equations for cladding thickness measurement, the method was implemented in a finite element package.

  3. UHPC for Blast and Ballistic Protection, Explosion Testing and Composition Optimization

    NASA Astrophysics Data System (ADS)

    Bibora, P.; Drdlová, M.; Prachař, V.; Sviták, O.

    2017-10-01

    The realization of high performance concrete resistant to detonation is the aim and expected outcome of the presented project, which is oriented to the development of construction materials for larger structures such as protective walls and bunkers. The use of high-strength / high-performance concrete (HSC/HPC) and ultra-high-performance fiber-reinforced concrete (UHPC/UHPFRC) seems optimal for this research purpose. The paper describes the research phase of the project, in which we focused on the selection of specific raw materials and chemical additives, including determining the most suitable type and amount of distributed fiber reinforcement. The composition of the UHPC was optimized during laboratory manufacture of test specimens to obtain the best desired physical-mechanical properties of the developed high performance concretes. In connection with the laboratory testing, field explosion tests of UHPC specimens were performed and the explosion resistance of the laboratory-produced UHPC test boards was investigated.

  4. The optimal power puzzle: scrutiny of the monotone likelihood ratio assumption in multiple testing.

    PubMed

    Cao, Hongyuan; Sun, Wenguang; Kosorok, Michael R

    2013-01-01

    In single hypothesis testing, power is a non-decreasing function of type I error rate; hence it is desirable to test at the nominal level exactly to achieve optimal power. The puzzle lies in the fact that for multiple testing, under the false discovery rate paradigm, such a monotonic relationship may not hold. In particular, exact false discovery rate control may lead to a less powerful testing procedure if a test statistic fails to fulfil the monotone likelihood ratio condition. In this article, we identify different scenarios wherein the condition fails and give caveats for conducting multiple testing in practical settings.
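    As background, the false discovery rate paradigm discussed here is most commonly operationalized by the Benjamini-Hochberg step-up procedure. A minimal sketch of that standard procedure (not the authors' corrected method) follows:

```python
def benjamini_hochberg(pvals, q=0.05):
    """Benjamini-Hochberg step-up: reject the k smallest p-values,
    where k is the largest rank with p_(k) <= k*q/m."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * q / m:
            k = rank
    return sorted(order[:k])  # indices of rejected hypotheses

print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.5]))  # → [0, 1]
```

    Note the step-up character: a p-value can be rejected even when it fails its own threshold, as long as a larger-ranked p-value passes; this is one place the monotone likelihood ratio condition matters.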

  5. Chapter 8 optimized test design for identification of the variation of elastic stiffness properties of Loblolly Pine (Pinus taeda) pith to bark

    Treesearch

    David Kretschmann; John Considine; F. Pierron

    2016-01-01

    This article presents the design optimization of an un-notched Iosipescu test specimen whose goal is the characterization of the elastic material stiffnesses of a Loblolly pine (Pinus taeda) or Lodgepole pine (Pinus contorta) sample in a single test. A series of finite element (FE) and grid simulations were conducted to determine displacement and strain fields for various...

  6. An enzyme-linked immunosorbent assay for detection of botulinum toxin-antibodies.

    PubMed

    Dressler, Dirk; Gessler, Frank; Tacik, Pawel; Bigalke, Hans

    2014-09-01

    Antibodies against botulinum neurotoxin (BNT-AB) can be detected by the mouse protection assay (MPA), the hemidiaphragm assay (HDA), and by enzyme-linked immunosorbent assays (ELISA). Both MPA and HDA require sacrifice of experimental animals, and they are technically delicate and labor-intensive. We introduce a specially developed ELISA for detection of BNT-A-AB and evaluate it against the HDA. Thirty serum samples were tested by HDA and by the new ELISA. Results were compared, and receiver operating characteristic analyses were used to optimize the ELISA parameter constellation to obtain either maximal overall accuracy, maximal test sensitivity, or maximal test specificity. When the ELISA is optimized for sensitivity, a sensitivity of 100% and a specificity of 55% can be reached. When it is optimized for specificity, a specificity of 100% and a sensitivity of 90% can be obtained. We present an ELISA for BNT-AB detection that can, for the first time, be customized for special purposes. Adjusted for optimal sensitivity, it reaches the best sensitivity of all BNT-AB tests available. Using the new ELISA together with the HDA as a confirmation test allows testing for BNT-AB in large numbers of patients receiving BT drugs in an economical, fast, and more animal-friendly way. © 2014 International Parkinson and Movement Disorder Society.

  7. Anatomical Thin Titanium Mesh Plate Structural Optimization for Zygomatic-Maxillary Complex Fracture under Fatigue Testing.

    PubMed

    Wang, Yu-Tzu; Huang, Shao-Fu; Fang, Yu-Ting; Huang, Shou-Chieh; Cheng, Hwei-Fang; Chen, Chih-Hao; Wang, Po-Fang; Lin, Chun-Li

    2018-01-01

    This study performs a structural optimization of an anatomical thin titanium mesh (ATTM) plate and fabricates the optimally designed ATTM plate using additive manufacturing (AM) to verify its stabilization under fatigue testing. Finite element (FE) analysis was used to simulate the structural bending resistance of a regular ATTM plate. The Taguchi method was employed to identify the significance of each design factor in controlling the deflection and to determine an optimal combination of design factors. The optimally designed ATTM plate with a patient-matched facial contour was fabricated using AM and applied to a zygomatic-maxillary complex (ZMC) comminuted fracture to evaluate the resulting maxillary micromotion/strain under fatigue testing. The Taguchi analysis found that the ATTM plate required an internal hole distance of 0.9 mm, an internal hole diameter of 1 mm, a plate thickness of 0.8 mm, and a plate height of 10 mm. The plate thickness factor primarily dominated the bending resistance, with up to 78% importance. The averaged micromotion (displacement) and strain of the maxillary bone showed that ZMC fracture fixation using the miniplate was significantly higher than with the AM optimally designed ATTM plate. This study concluded that an optimally designed ATTM plate with enough strength to resist bending can be obtained by combining FE and Taguchi analyses. The optimally designed ATTM plate with a patient-matched facial contour fabricated using AM provides superior stabilization for ZMC comminuted fractured bone segments.
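    The Taguchi screening step above amounts to estimating each design factor's main effect from an orthogonal array of factor combinations. A sketch follows; the L4 array and deflection responses are hypothetical illustrations, not the paper's measurements:

```python
def main_effects(array, responses):
    """Taguchi-style main effects: for each factor (column of the
    orthogonal array), the difference between the mean response at
    its high level and at its low level."""
    n_factors = len(array[0])
    effects = []
    for j in range(n_factors):
        lo = [r for row, r in zip(array, responses) if row[j] == 0]
        hi = [r for row, r in zip(array, responses) if row[j] == 1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

# L4 orthogonal array for three two-level factors, with made-up deflections
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
print(main_effects(L4, [2.0, 3.0, 4.0, 5.0]))  # → [2.0, 1.0, 0.0]
```

    Ranking the absolute effects is how one factor (here, plate thickness in the study) is identified as dominating the response.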

  8. Supercritical tests of a self-optimizing, variable-camber wind tunnel model

    NASA Technical Reports Server (NTRS)

    Levinsky, E. S.; Palko, R. L.

    1979-01-01

    A testing procedure was used in the 16-foot Transonic Propulsion Wind Tunnel that leads to optimum wing airfoil sections without stopping the tunnel for model changes. Because the procedure is experimental, the optimum shapes obtained incorporate various three-dimensional and nonlinear viscous and transonic effects not included in analytical optimization methods. The method is a closed-loop, computer-controlled, interactive procedure and employs a Self-Optimizing Flexible Technology wing semispan model that conformally adapts the airfoil section at two spanwise control stations to maximize or minimize various prescribed merit functions subject to both equality and inequality constraints. The model, which employed twelve independent hydraulic actuator systems and flexible skins, was also used for conventional testing. Although six of the seven optimizations attempted were at least partially convergent, further improvements in model skin smoothness and hydraulic reliability are required to make the technique fully operational.

  9. Inverse problem of flame surface properties of wood using a repulsive particle swarm optimization algorithm

    NASA Astrophysics Data System (ADS)

    Yoon, Kyung-Beom; Park, Won-Hee

    2015-04-01

    The convective heat transfer coefficient and surface emissivity before and after flame occurrence on a wood specimen surface and the flame heat flux were estimated using the repulsive particle swarm optimization algorithm and cone heater test results. The cone heater specified in the ISO 5660 standards was used, and six cone heater heat fluxes were tested. Preservative-treated Douglas fir 21 mm in thickness was used as the wood specimen in the tests. This study confirmed that the surface temperature of the specimen, which was calculated using the convective heat transfer coefficient, surface emissivity and flame heat flux on the wood specimen by a repulsive particle swarm optimization algorithm, was consistent with the measured temperature. Considering the measurement errors in the surface temperature of the specimen, the applicability of the optimization method considered in this study was evaluated.

  10. CONORBIT: constrained optimization by radial basis function interpolation in trust regions

    DOE PAGES

    Regis, Rommel G.; Wild, Stefan M.

    2016-09-26

    This paper presents CONORBIT (CONstrained Optimization by Radial Basis function Interpolation in Trust regions), a derivative-free algorithm for constrained black-box optimization where the objective and constraint functions are computationally expensive. CONORBIT employs a trust-region framework that uses interpolating radial basis function (RBF) models for the objective and constraint functions, and is an extension of the ORBIT algorithm. It uses a small margin for the RBF constraint models to facilitate the generation of feasible iterates, and extensive numerical tests confirm that such a margin is helpful in improving performance. CONORBIT is compared with other algorithms on 27 test problems, a chemical process optimization problem, and an automotive application. Numerical results show that CONORBIT performs better than COBYLA, a sequential penalty derivative-free method, an augmented Lagrangian method, a direct search method, and another RBF-based algorithm on the test problems and on the automotive application.

  11. New knowledge-based genetic algorithm for excavator boom structural optimization

    NASA Astrophysics Data System (ADS)

    Hua, Haiyan; Lin, Shuwen

    2014-03-01

    Because existing genetic algorithms make insufficient use of knowledge to guide the complex optimal search, they fail to solve the excavator boom structural optimization problem effectively. To improve optimization efficiency and quality, a new knowledge-based real-coded genetic algorithm is proposed. A dual evolution mechanism combining knowledge evolution with the genetic algorithm is established to extract, handle and utilize the shallow and deep implicit constraint knowledge to guide the optimal search cyclically. Based on this dual evolution mechanism, knowledge evolution and population evolution are connected by knowledge influence operators to improve the configurability of knowledge and genetic operators. New knowledge-based selection, crossover and mutation operators are then proposed to integrate optimal process knowledge and domain culture to guide the excavator boom structural optimization. Eight test algorithms, comprising different combinations of genetic operators, are taken as examples to solve the structural optimization of a medium-sized excavator boom. A comparison of the optimization results shows that the algorithm including all the new knowledge-based genetic operators improves the evolutionary rate and searching ability more markedly than the other test algorithms, which demonstrates the effectiveness of knowledge for guiding the optimal search. The proposed knowledge-based genetic algorithm, combining multi-level knowledge evolution with numerical optimization, provides a new effective method for solving complex engineering optimization problems.

  12. On the suitability of different representations of solid catalysts for combinatorial library design by genetic algorithms.

    PubMed

    Gobin, Oliver C; Schüth, Ferdi

    2008-01-01

    Genetic algorithms are widely used to solve and optimize combinatorial problems and are increasingly applied to library design in combinatorial chemistry. Because of their flexibility, however, their implementation can be challenging. In this study, the influence of the representation of solid catalysts on the performance of genetic algorithms was systematically investigated on the basis of a new, constrained, multiobjective, combinatorial test problem with properties common to problems in combinatorial materials science. Constraints were satisfied by penalty functions, repair algorithms, or special representations. The tests were performed using three state-of-the-art evolutionary multiobjective algorithms by performing 100 optimization runs for each algorithm and test case. Experimental data obtained during the optimization of a noble metal-free solid catalyst system active in the selective catalytic reduction of nitric oxide with propene was used to build up a predictive model to validate the results of the theoretical test problem. A significant influence of the representation on the optimization performance was observed. Binary encodings were found to be the preferred encoding in most of the cases, and depending on the experimental test unit, repair algorithms or penalty functions performed best.

  13. Establishment of a new in vitro test method for evaluation of eye irritancy using a reconstructed human corneal epithelial model, LabCyte CORNEA-MODEL.

    PubMed

    Katoh, Masakazu; Hamajima, Fumiyasu; Ogasawara, Takahiro; Hata, Ken-ichiro

    2013-12-01

    Finding in vitro eye irritation testing alternatives to animal tests such as the Draize eye test, which uses rabbits, is essential from the standpoint of animal welfare. We have developed a reconstructed human corneal epithelial model, the LabCyte CORNEA-MODEL, which has a representative corneal epithelium-like structure. Protocol optimization (a pre-validation study) was carried out in order to establish a new alternative method for eye irritancy evaluation with this model. From the results of the optimization experiments, the application periods for chemicals were set at 1 min for liquid chemicals or 24 h for solid chemicals, and the post-exposure incubation periods were set at 24 h for liquids or zero for solids. If the viability was less than 50%, the chemical was judged to be an eye irritant. Sixty-one chemicals were applied in the optimized protocol using the LabCyte CORNEA-MODEL and the results were evaluated for correlation with in vivo results. The predictions of the optimized LabCyte CORNEA-MODEL eye irritation test method were highly correlated with in vivo eye irritation (sensitivity 100%, specificity 80.0%, and accuracy 91.8%). These results suggest that the LabCyte CORNEA-MODEL eye irritation test could be useful as an alternative to the Draize eye test. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. Efficient logistic regression designs under an imperfect population identifier.

    PubMed

    Albert, Paul S; Liu, Aiyi; Nansel, Tonja

    2014-03-01

    Motivated by actual study designs, this article considers efficient logistic regression designs where the population is identified with a binary test that is subject to diagnostic error. We consider the case where the imperfect test is obtained on all participants, while the gold standard test is measured on a small chosen subsample. Under maximum-likelihood estimation, we evaluate the optimal design in terms of sample selection as well as verification. We show that there may be substantial efficiency gains by choosing a small percentage of individuals who test negative on the imperfect test for inclusion in the sample (e.g., verifying 90% test-positive cases). We also show that a two-stage design may be a good practical alternative to a fixed design in some situations. Under optimal and nearly optimal designs, we compare maximum-likelihood and semi-parametric efficient estimators under correct and misspecified models with simulations. The methodology is illustrated with an analysis from a diabetes behavioral intervention trial. © 2013, The International Biometric Society.

  15. Influence of cellulose derivative and ethylene glycol on optimization of lornoxicam transdermal formulation.

    PubMed

    Shahzad, Yasser; Khan, Qalandar; Hussain, Talib; Shah, Syed Nisar Hussain

    2013-10-01

    Lornoxicam-containing topically applied lotions were formulated and optimized with the aim of delivering the drug transdermally. The formulated lotions were evaluated for pH, viscosity and in vitro permeation through silicone membrane using Franz diffusion cells. Data were fitted to linear, quadratic and cubic models, and the best-fit model was selected to investigate the influence of the variables, namely hydroxypropyl methylcellulose (HPMC) and ethylene glycol (EG), on permeation of lornoxicam from topically applied lotion formulations. The best-fit quadratic model revealed that a low level of HPMC and an intermediate level of EG in the formulation were optimal for enhancing drug flux across the silicone membrane. FT-IR analysis confirmed the absence of drug-polymer interactions. The selected optimized lotion formulation was then subjected to accelerated stability testing, sensory perception testing and in vitro permeation across rabbit skin. The drug flux from the optimized lotion across rabbit skin was significantly better than that from the control formulation. Furthermore, the sensory perception test rated acceptability as high, and the lotion was stable over the stability testing period. The Box-Wilson statistical design therefore successfully elaborated the influence of formulation variables on permeation of lornoxicam from topical formulations and thus helped in optimization of the lotion formulation. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. A principled approach to setting optimal diagnostic thresholds: where ROC and indifference curves meet.

    PubMed

    Irwin, R John; Irwin, Timothy C

    2011-06-01

    Making clinical decisions on the basis of diagnostic tests is an essential feature of medical practice and the choice of the decision threshold is therefore crucial. A test's optimal diagnostic threshold is the threshold that maximizes expected utility. It is given by the product of the prior odds of a disease and a measure of the importance of the diagnostic test's sensitivity relative to its specificity. Choosing this threshold is the same as choosing the point on the Receiver Operating Characteristic (ROC) curve whose slope equals this product. We contend that a test's likelihood ratio is the canonical decision variable and contrast diagnostic thresholds based on likelihood ratio with two popular rules of thumb for choosing a threshold. The two rules are appealing because they have clear graphical interpretations, but they yield optimal thresholds only in special cases. The optimal rule can be given similar appeal by presenting indifference curves, each of which shows a set of equally good combinations of sensitivity and specificity. The indifference curve is tangent to the ROC curve at the optimal threshold. Whereas ROC curves show what is feasible, indifference curves show what is desirable. Together they show what should be chosen. Copyright © 2010 European Federation of Internal Medicine. Published by Elsevier B.V. All rights reserved.
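    The slope rule described above (choose the ROC point whose slope equals the prior odds of non-disease times the relative importance of specificity versus sensitivity) can be applied directly to an empirical ROC curve. A minimal sketch, with illustrative numbers rather than clinical data:

```python
def optimal_roc_index(fpr, tpr, prevalence, utility_ratio):
    """Index of the ROC point maximizing expected utility.
    The target slope is (1 - prevalence)/prevalence * utility_ratio,
    where utility_ratio weighs specificity relative to sensitivity;
    maximizing tpr - slope*fpr locates the tangency point."""
    slope = (1.0 - prevalence) / prevalence * utility_ratio
    return max(range(len(fpr)), key=lambda i: tpr[i] - slope * fpr[i])

# Illustrative empirical ROC curve
fpr = [0.0, 0.1, 0.4, 1.0]
tpr = [0.0, 0.7, 0.9, 1.0]
print(optimal_roc_index(fpr, tpr, prevalence=0.5, utility_ratio=1.0))  # → 1
print(optimal_roc_index(fpr, tpr, prevalence=0.1, utility_ratio=1.0))  # → 0
```

    The second call shows the effect of prior odds: for a rare disease the required slope is steep, and the optimal operating point slides toward the conservative corner of the curve.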

  17. OPTIMIZATION OF MUD HAMMER DRILLING PERFORMANCE - A PROGRAM TO BENCHMARK THE VIABILITY OF ADVANCED MUD HAMMER DRILLING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnis Judzis

    2002-10-01

    This document details the progress to date on the OPTIMIZATION OF MUD HAMMER DRILLING PERFORMANCE -- A PROGRAM TO BENCHMARK THE VIABILITY OF ADVANCED MUD HAMMER DRILLING contract for the quarter starting July 2002 through September 2002. Even though we are awaiting the optimization portion of the testing program, accomplishments include the following: (1) Smith International agreed to participate in the DOE Mud Hammer program. (2) Smith International chromed collars for upcoming benchmark tests at TerraTek, now scheduled for 4Q 2002. (3) ConocoPhillips had a field trial of the Smith fluid hammer offshore Vietnam. The hammer functioned properly, though the well encountered hole conditions and reaming problems. ConocoPhillips plans another field trial as a result. (4) DOE/NETL extended the contract for the fluid hammer program to allow Novatek to ''optimize'' their much-delayed tool in 2003 and to allow Smith International to add ''benchmarking'' tests in light of SDS Digger Tools' current financial inability to participate. (5) ConocoPhillips joined the Industry Advisors for the mud hammer program. (6) TerraTek acknowledges Smith International, BP America, PDVSA, and ConocoPhillips for cost-sharing the Smith benchmarking tests, allowing extension of the contract to complete the optimizations.

  18. Affordable Development and Optimization of CERMET Fuels for NTP Ground Testing

    NASA Technical Reports Server (NTRS)

    Hickman, Robert R.; Broadway, Jeramie W.; Mireles, Omar R.

    2014-01-01

    CERMET fuel materials for Nuclear Thermal Propulsion (NTP) are currently being developed at NASA's Marshall Space Flight Center. The work is part of NASA's Advanced Space Exploration Systems Nuclear Cryogenic Propulsion Stage (NCPS) Project. The goal of the FY12-14 project is to address critical NTP technology challenges and programmatic issues to establish confidence in the affordability and viability of an NTP system. A key enabling technology for an NCPS system is the fabrication of a stable high temperature nuclear fuel form. Although much of the technology was demonstrated during previous programs, there are currently no qualified fuel materials or processes. The work at MSFC is focused on developing critical materials and process technologies for manufacturing robust, full-scale CERMET fuels. Prototypical samples are being fabricated and tested in flowing hot hydrogen to understand processing and performance relationships. As part of this initial demonstration task, a final full-scale element test will be performed to validate robust designs. The next phase of the project will focus on continued development and optimization of the fuel materials to enable future ground testing. The purpose of this paper is to provide a detailed overview of the CERMET fuel materials development plan. The overall CERMET fuel development path is shown in Figure 2. The activities begin prior to ATP for a ground reactor or engine system test and include materials and process optimization, hot hydrogen screening, material property testing, and irradiation testing. The goal of the development is to increase the maturity of the fuel form and reduce risk. One of the main accomplishments of the current AES FY12-14 project was to develop dedicated laboratories at MSFC for the fabrication and testing of full-length fuel elements. This capability will enable affordable, near-term development and optimization of the CERMET fuels for future ground testing.
Figure 2 provides a timeline of the development and optimization tasks for the AES FY15-17 follow on program.

  19. Comparison of Genetic Algorithm and Hill Climbing for Shortest Path Optimization Mapping

    NASA Astrophysics Data System (ADS)

    Fronita, Mona; Gernowo, Rahmat; Gunawan, Vincencius

    2018-02-01

    The Traveling Salesman Problem (TSP) is an optimization problem: find the shortest route that reaches several destinations in one trip without passing through the same city twice and returns to the departure city; the process is applied to delivery systems. This comparison uses two methods, a genetic algorithm and hill climbing. Hill climbing works by directly selecting a new path, exchanged with a neighboring one, whenever the resulting track distance is smaller than that of the previous track. Genetic algorithms depend on their input parameters: the population size, the crossover probability, the mutation probability and the number of generations. To simplify the process of determining the shortest path, software was developed that uses the Google Maps API. Tests were carried out 20 times each with 8, 16, 24 and 32 cities to see which method is optimal in terms of distance and computation time. Experiments conducted with 3, 4, 5 and 6 cities produced the same optimal distance for the genetic algorithm and hill climbing; the distances begin to differ at 7 cities. The overall results show that hill climbing is more effective for small numbers of cities, while problems with more than 30 cities are better optimized using genetic algorithms.
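    The hill-climbing variant described (swap cities, keep the swap only if the tour shortens, otherwise revert) can be sketched as follows; the 4-city unit-square instance is illustrative:

```python
import random

def tour_length(tour, dist):
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def hill_climb(dist, iters=1000, seed=0):
    """TSP hill climbing: try a random city swap, keep it only if the
    tour gets shorter, otherwise revert (no uphill moves accepted)."""
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    best = tour_length(tour, dist)
    for _ in range(iters):
        i, j = rng.sample(range(n), 2)
        tour[i], tour[j] = tour[j], tour[i]
        length = tour_length(tour, dist)
        if length < best:
            best = length
        else:
            tour[i], tour[j] = tour[j], tour[i]  # revert the swap
    return tour, best

# Four cities on a unit square: the optimal tour is the perimeter, length 4
pts = [(0, 0), (0, 1), (1, 1), (1, 0)]
dist = [[((ax - bx) ** 2 + (ay - by) ** 2) ** 0.5 for bx, by in pts]
        for ax, ay in pts]
tour, best = hill_climb(dist)
print(best)  # → 4.0
```

    Because no worsening move is ever accepted, the search stalls at the first local optimum it reaches, which matches the paper's observation that hill climbing degrades on larger city counts.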

  20. Optimal design of minimum mean-square error noise reduction algorithms using the simulated annealing technique.

    PubMed

    Bai, Mingsian R; Hsieh, Ping-Ju; Hur, Kur-Nan

    2009-02-01

    The performance of the minimum mean-square error noise reduction (MMSE-NR) algorithm in conjunction with time-recursive averaging (TRA) for noise estimation is found to be very sensitive to the choice of two recursion parameters. To address this problem in a more systematic manner, this paper proposes an optimization method to efficiently search for the optimal parameters of the MMSE-TRA-NR algorithms. The objective function is based on a regression model, whereas the optimization process is carried out with the simulated annealing algorithm, which is well suited for problems with many local optima. Another NR algorithm proposed in the paper employs linear prediction coding as a preprocessor for extracting the correlated portion of human speech. Objective and subjective tests were undertaken to compare the optimized MMSE-TRA-NR algorithm with several conventional NR algorithms. The results of the subjective tests were processed using analysis of variance to justify the statistical significance. A post hoc test, Tukey's Honestly Significant Difference, was conducted to further assess the pairwise differences between the NR algorithms.
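    Simulated annealing, used above precisely because the parameter landscape has many local optima, occasionally accepts worse solutions with probability exp(-Δ/T) under a cooling schedule. A generic minimization sketch follows; the objective and hyperparameters are illustrative, not the paper's regression-based objective:

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995, iters=5000, seed=1):
    """Minimize f by simulated annealing: always accept improvements,
    and accept uphill moves with probability exp(-delta/T) so the
    search can escape local optima; T decays geometrically."""
    rng = random.Random(seed)
    x, fx, t = x0, f(x0), t0
    best_x, best_f = x, fx
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        fc = f(cand)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling  # cooling schedule
    return best_x, best_f

# Double-well objective with global minima at x = ±1
x_best, f_best = simulated_annealing(lambda x: (x * x - 1.0) ** 2, x0=5.0)
```

    The cooling rate trades exploration against convergence: cool too fast and the run behaves like plain hill climbing, too slowly and it wanders.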

  1. ODECS -- A computer code for the optimal design of S.I. engine control strategies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arsie, I.; Pianese, C.; Rizzo, G.

    1996-09-01

    The computer code ODECS (Optimal Design of Engine Control Strategies) for the design of spark ignition engine control strategies is presented. The code has been developed from the authors' activity in this field, drawing on original contributions on engine stochastic optimization and dynamical models. It has a modular structure and is composed of a user interface for the definition, execution and analysis of different computations performed with 4 independent modules. These modules allow the following calculations: (1) definition of the engine mathematical model from steady-state experimental data; (2) engine cycle test trajectory corresponding to a vehicle transient simulation test such as the ECE15 or FTP drive test schedule; (3) evaluation of the optimal engine control maps with a steady-state approach; (4) engine dynamic cycle simulation and optimization of static control maps and/or dynamic compensation strategies, taking into account dynamical effects due to the unsteady fluxes of air and fuel and the influence of combustion chamber wall thermal inertia on fuel consumption and emissions. Moreover, in the last two modules it is possible to account for errors generated by non-deterministic behavior of sensors and actuators and the related influences on global engine performance, and to compute robust strategies that are less sensitive to stochastic effects. In the paper the four modules are described together with significant results corresponding to the simulation and calculation of optimal control strategies for dynamic transient tests.

  2. Optimization of pressure settings during adaptive servo-ventilation support using real-time heart rate variability assessment: initial case report.

    PubMed

    Imamura, Teruhiko; Nitta, Daisuke; Kinugawa, Koichiro

    2017-01-05

    Adaptive servo-ventilation (ASV) therapy is a recent non-invasive positive pressure ventilation therapy developed for patients with heart failure (HF) refractory to optimal medical therapy. However, ASV therapy at relatively high pressure settings may worsen some patients' prognosis compared with optimal medical therapy alone. Identification of optimal pressure settings for ASV therapy is therefore warranted. We present the case of a 42-year-old male with HF caused by dilated cardiomyopathy, who was admitted to our institution for evaluation of his eligibility for heart transplantation. To identify the optimal pressure setting [a positive end-expiratory pressure (PEEP) ramp test], we performed an ASV support test, during which the PEEP settings were set at levels ranging from 4 to 8 mmHg, and a heart rate variability (HRV) analysis using the MemCalc power spectral density method. Clinical parameters varied dramatically during the PEEP ramp test. Over incremental PEEP levels, pulmonary capillary wedge pressure, cardiac index and the high-frequency level (reflecting parasympathetic activity) decreased; however, the low-frequency level increased along with an increase in plasma noradrenaline concentrations. An inappropriately high PEEP setting may stimulate sympathetic nerve activity accompanied by decreased cardiac output. This was the first report of a PEEP ramp test during ASV therapy. Further research is warranted to determine whether use of optimal pressure settings guided by HRV analyses may improve the long-term prognosis of such patients.

  3. Association between Optimism, Psychosocial Well Being and Oral Health: A Cross-Sectional Study.

    PubMed

    Thiruvenkadam, G; Asokan, Sharath; Baby John, J; Geetha Priya, P R

    The aim of the study was to assess the association of optimism and psychosocial well-being of school-going children with their oral health status. The study included 12- to 15-year-old school-going children (N = 2014) from Tamilnadu, India. Optimism was measured using the revised version of the Life Orientation Test (LOT-R). A questionnaire was sent to the parents regarding their child's psychosocial behavior, which included shyness, feelings of inferiority, unhappiness and friendliness. A clinical examination of each child was done to assess the DMFT score and OHI-S score. The data obtained were statistically analyzed using the Pearson chi-square test, Mann-Whitney test and Kruskal-Wallis test with the aid of SPSS software (version 17). Odds ratios (OR) were calculated with 95% confidence intervals (CI). A p value ≤ 0.05 was considered statistically significant. Boys with high optimism had a significantly lower DMFT score than boys with low optimism (p=0.001). Girls with high optimism had a significantly higher DMFT score (p=0.001). Among the psychosocial outcomes, inferiority (p=0.002) and friendliness (p=0.001) showed significant associations with DMFT score. Among the boys, children who felt less inferior (p=0.001), less unhappy (p=0.029) and more friendly (p=0.001) had lower DMFT scores. Of the psychosocial outcomes assessed, inferiority and friendliness had significant associations with the oral health of the children and hence can be used as proxy measures of oral health.

  4. PSO Algorithm Particle Filters for Improving the Performance of Lane Detection and Tracking Systems in Difficult Roads

    PubMed Central

    Cheng, Wen-Chang

    2012-01-01

    In this paper we propose a robust lane detection and tracking method that combines particle filters with the particle swarm optimization method. The method mainly uses particle filters to detect and track the local optimum of the lane model in the input image and then seeks the global optimum of the lane model with particle swarm optimization. The particle filter can effectively perform lane detection and tracking in complicated or variable lane environments; however, the result obtained is usually a local optimal system status rather than the global one. Thus, the particle swarm optimization method is used to search for the global optimal system status among all system statuses. Since particle swarm optimization is an iterative global optimization algorithm, it can find the global optimal lane model by simulating the food-searching behavior of fish schools or insect swarms through the mutual cooperation of all particles. In verification testing, the test environments included highways and ordinary roads as well as straight and curved lanes, uphill and downhill lanes, lane changes, etc. Our proposed method completes lane detection and tracking more accurately and effectively than existing options. PMID:23235453
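    The swarm-search stage described above (particles drawn toward their personal best and the swarm's global best) can be sketched in a few lines. This is a generic PSO on a toy objective, not the paper's lane-model fitness; the objective function, bounds, and coefficient values below are illustrative assumptions.

```python
import random

def pso(f, dim, bounds, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal particle swarm optimization minimizing f over a box:
    each particle is pulled toward its personal best and the swarm's
    global best, as in the global-search stage described above."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy stand-in for the lane-model fitness: a 2-D sphere function.
best, best_val = pso(lambda x: sum(t * t for t in x), dim=2, bounds=(-5.0, 5.0))
```

    In the paper's setting, the fitness would instead score how well a candidate lane model matches the input image.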

  5. Optimizing acoustical conditions for speech intelligibility in classrooms

    NASA Astrophysics Data System (ADS)

    Yang, Wonyoung

    High speech intelligibility is imperative in classrooms where verbal communication is critical. However, the optimal acoustical conditions to achieve a high degree of speech intelligibility have previously been investigated with inconsistent results, and practical room-acoustical solutions to optimize the acoustical conditions for speech intelligibility have not been developed. This experimental study validated auralization for speech-intelligibility testing, investigated the optimal reverberation for speech intelligibility for both normal and hearing-impaired listeners using more realistic room-acoustical models, and proposed an optimal sound-control design for speech intelligibility based on the findings. The auralization technique was used to perform subjective speech-intelligibility tests. The validation study, comparing auralization results with those of real classroom speech-intelligibility tests, found that if the room to be auralized is not very absorptive or noisy, speech-intelligibility tests using auralization are valid. The speech-intelligibility tests were done in two different auralized sound fields---approximately diffuse and non-diffuse---using the Modified Rhyme Test and both normal and hearing-impaired listeners. A hybrid room-acoustical prediction program was used throughout the work, and it and a 1/8 scale-model classroom were used to evaluate the effects of ceiling barriers and reflectors. For both subject groups, in approximately diffuse sound fields, when the speech source was closer to the listener than the noise source, the optimal reverberation time was zero. When the noise source was closer to the listener than the speech source, the optimal reverberation time was 0.4 s (with another peak at 0.0 s) with relative output power levels of the speech and noise sources SNS = 5 dB, and 0.8 s with SNS = 0 dB. 
In non-diffuse sound fields, when the noise source was between the speaker and the listener, the optimal reverberation time was 0.6 s with SNS = 4 dB and increased to 0.8 and 1.2 s with decreased SNS = 0 dB, for both normal and hearing-impaired listeners. Hearing-impaired listeners required more early energy than normal-hearing listeners. Reflective ceiling barriers and ceiling reflectors---in particular, parallel front-back rows of semi-circular reflectors---achieved the goal of decreasing reverberation with the least speech-level reduction.

  6. Multiobjective Optimization Using a Pareto Differential Evolution Approach

    NASA Technical Reports Server (NTRS)

    Madavan, Nateri K.; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    Differential Evolution is a simple, fast, and robust evolutionary algorithm that has proven effective in determining the global optimum for several difficult single-objective optimization problems. In this paper, the Differential Evolution algorithm is extended to multiobjective optimization problems by using a Pareto-based approach. The algorithm performs well when applied to several test optimization problems from the literature.
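    A minimal sketch of the idea, assuming a dominance-based acceptance rule in place of the paper's full Pareto-based selection machinery: standard DE/rand/1/bin mutation and crossover, with a trial vector replacing its parent unless the parent Pareto-dominates it. The test problem (Schaffer's two-objective function) and all parameter values are illustrative.

```python
import random

def dominates(a, b):
    """Pareto dominance (minimization): a is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_de(objs, dim, bounds, pop_size=40, gens=100, F=0.5, CR=0.9, seed=1):
    """DE/rand/1/bin with dominance-based selection: a trial vector
    replaces its parent unless the parent dominates it (a simplified
    stand-in for the paper's Pareto-based approach)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    vals = [[f(x) for f in objs] for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jr = rng.randrange(dim)  # guarantees at least one mutated gene
            trial = [min(hi, max(lo, pop[a][d] + F * (pop[b][d] - pop[c][d])))
                     if (rng.random() < CR or d == jr) else pop[i][d]
                     for d in range(dim)]
            tvals = [f(trial) for f in objs]
            if not dominates(vals[i], tvals):
                pop[i], vals[i] = trial, tvals
    return pop, vals

# Schaffer's two-objective problem: f1 = x^2, f2 = (x - 2)^2.
# Its Pareto-optimal set is the interval x in [0, 2].
pop, vals = pareto_de([lambda x: x[0] ** 2, lambda x: (x[0] - 2.0) ** 2],
                      dim=1, bounds=(-10.0, 10.0))
```

    After a few generations the population migrates toward the Pareto front; a real implementation would add non-dominated sorting and diversity maintenance.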

  7. Optimism and Pessimism of Physical Education and Non-Physical Education Students: Invariance of Structure

    ERIC Educational Resources Information Center

    Abu-Hilal, Maher M.; Zayed, Kashef

    2011-01-01

    Introduction: Optimism and pessimism are two psychological constructs that play a significant role in human mental and psychological hygiene. The two constructs are strongly but negatively correlated. Optimism and pessimism can be influenced by culture and the environment. The present study attempts to test the structure of optimism and pessimism…

  8. An optimized implementation of a fault-tolerant clock synchronization circuit

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    1995-01-01

    A fault-tolerant clock synchronization circuit was designed and tested. A comparison to a previous design and the procedure followed to achieve the current optimization are included. The report also includes a description of the system and the results of tests performed to study the synchronization and fault-tolerant characteristics of the implementation.

  9. Nuclear fuel management optimization using genetic algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeChaine, M.D.; Feltus, M.A.

    1995-07-01

    The code independent genetic algorithm reactor optimization (CIGARO) system has been developed to optimize nuclear reactor loading patterns. It uses genetic algorithms (GAs) and a code-independent interface, so any reactor physics code (e.g., CASMO-3/SIMULATE-3) can be used to evaluate the loading patterns. The system is compared to other GA-based loading pattern optimizers. Tests were carried out to maximize the beginning-of-cycle k_eff for a pressurized water reactor core loading, with a penalty function to limit power peaking. The CIGARO system performed well, increasing the k_eff after lowering the peak power. Tests of a prototype parallel evaluation method showed the potential for a significant speedup.
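    The penalty-function idea used here, maximizing a figure of merit while subtracting a penalty for constraint violations, can be sketched with a toy binary-string GA. This is not the CIGARO code: the fitness, penalty, and operators below are generic, illustrative stand-ins.

```python
import random

def ga_penalized(fitness, penalty, n_genes, pop_size=30, gens=60, pm=0.1, seed=2):
    """Toy binary GA maximizing fitness(x) - penalty(x): tournament
    selection, one-point crossover, bit-flip mutation."""
    rng = random.Random(seed)
    score = lambda x: fitness(x) - penalty(x)
    pop = [[rng.randint(0, 1) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(gens):
        def pick():
            a, b = rng.sample(range(pop_size), 2)  # binary tournament
            return pop[a] if score(pop[a]) >= score(pop[b]) else pop[b]
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, n_genes)                      # one-point crossover
            child = [g ^ (rng.random() < pm) for g in p1[:cut] + p2[cut:]]
            nxt.append(child)
        pop = nxt
    return max(pop, key=score)

# Toy stand-in for the loading-pattern problem: "k_eff" grows with the
# number of set bits, while the penalty (standing in for the power-peaking
# limit) punishes solutions with more than 12 set bits.
fitness = lambda x: float(sum(x))
penalty = lambda x: 10.0 * max(0, sum(x) - 12)
best = ga_penalized(fitness, penalty, n_genes=20)
```

    In the real system, evaluating `fitness` means running a reactor physics code on the candidate loading pattern, which is why a code-independent interface matters.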

  10. Neural Network Prediction of New Aircraft Design Coefficients

    NASA Technical Reports Server (NTRS)

    Norgaard, Magnus; Jorgensen, Charles C.; Ross, James C.

    1997-01-01

    This paper discusses a neural network tool for more effective aircraft design evaluations during wind tunnel tests. Using a hybrid neural network optimization method, we have produced fast and reliable predictions of aerodynamic coefficients and found optimal flap settings and flap schedules. For validation, the tool was tested on a 55% scale model of the USAF/NASA Subsonic High Alpha Research Concept aircraft (SHARC). Four different networks were trained to predict the coefficients of lift, drag, pitching moment, and the lift-drag ratio (C_L, C_D, C_M, and L/D) from angle of attack and flap settings. The latter network was then used to determine an overall optimal flap setting and to find optimal flap schedules.

  11. A probabilistic method for the estimation of residual risk in donated blood.

    PubMed

    Bish, Ebru K; Ragavan, Prasanna K; Bish, Douglas R; Slonim, Anthony D; Stramer, Susan L

    2014-10-01

    The residual risk (RR) of transfusion-transmitted infections, including the human immunodeficiency virus and hepatitis B and C viruses, is typically estimated by the incidence × window-period model, which relies on the following restrictive assumptions: each screening test, with probability 1, (1) detects an infected unit outside of the test's window period; (2) fails to detect an infected unit within the window period; and (3) correctly identifies an infection-free unit. These assumptions need not hold in practice due to random or systemic errors and individual variations in the window period. We develop a probability model that estimates the RR accurately by relaxing these assumptions, and quantify their impact using a published cost-effectiveness study and an optimization model. The assumptions lead to inaccurate estimates in cost-effectiveness studies and to sub-optimal solutions in the optimization model. The testing solution generated by the optimization model translates into fewer expected infections without an increase in testing cost.
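    The contrast between the classic model and a relaxed one can be made concrete with a small probability calculation. The formula below is our own illustration of the idea, not the paper's model: it introduces separate test sensitivities inside and outside the window period, with the classic incidence × window-period estimate recovered as a special case. All numbers are hypothetical.

```python
def residual_risk(p_infected, p_window_given_infected, sens_in=0.0, sens_out=1.0):
    """P(an infected donation is released), allowing imperfect test
    sensitivity both inside the window period (sens_in) and outside it
    (sens_out). The classic incidence x window-period model is the
    special case sens_in = 0, sens_out = 1."""
    q = p_window_given_infected
    return p_infected * (q * (1.0 - sens_in) + (1.0 - q) * (1.0 - sens_out))

# Hypothetical numbers: 1 in 10,000 donations infected; 5% of infected
# donations fall inside the window period.
classic = residual_risk(1e-4, 0.05)                  # classic model
relaxed = residual_risk(1e-4, 0.05, sens_out=0.999)  # adds post-window misses
```

    Even a 99.9% post-window sensitivity makes `relaxed` strictly larger than `classic`, which is the direction of bias the paper's relaxed assumptions expose.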

  12. Materials and process optimization for dual-shell satellite antenna reflectors

    NASA Astrophysics Data System (ADS)

    Balaski, Darcy R.; van Oyen, Hans J.; Nissan, Sorin J.

    A comprehensive, design-optimization test program was conducted for satellite antenna reflectors composed of two offset paraboloidal Kevlar-reinforced sandwich shells separated by a circular sandwich structure. In addition to standard mechanical properties testing, coefficient of thermal expansion and hygroscopic tests were conducted to predict reflector surface accuracy in the thermal cycling environment of orbital space. Attention was given to the relative placement of components during assembly, in view of reflector surface measurements.

  13. Evolutionary optimization methods for accelerator design

    NASA Astrophysics Data System (ADS)

    Poklonskiy, Alexey A.

    Many problems from the fields of accelerator physics and beam theory can be formulated as optimization problems and, as such, solved using optimization methods. Despite the growing efficiency of optimization methods, the adoption of modern optimization techniques in these fields is rather limited. Evolutionary Algorithms (EAs) form a relatively new and actively developed family of optimization methods. They possess many attractive features such as ease of implementation, modest requirements on the objective function, good tolerance to noise, robustness, and the ability to perform a global search efficiently. In this work we study the application of EAs to problems from accelerator physics and beam theory. We review the most commonly used methods of unconstrained optimization and describe in detail GATool, the evolutionary algorithm and software package used in this work. Then we use a set of test problems to assess its performance in terms of computational resources, quality of the obtained result, and the tradeoff between them. We justify the choice of GATool as a heuristic method to generate cutoff values for the COSY-GO rigorous global optimization package of the COSY Infinity scientific computing package. We design a model of their mutual interaction and demonstrate that the quality of the result obtained by GATool increases as the information about the search domain is refined, which supports the usefulness of this model. We discuss GATool's performance on problems suffering from static and dynamic noise and study useful strategies of GATool parameter tuning for these and other difficult problems. We review the challenges of constrained optimization with EAs and the methods commonly used to overcome them. We describe in detail REPA, a new constrained optimization method based on repairing, including the properties of its two repairing techniques: REFIND and REPROPT. 
We assess REPROPT's performance on the standard constrained optimization test problems for EAs with a variety of different configurations and suggest optimal default parameter values based on the results. Then we study the performance of the REPA method on the same set of test problems and compare the obtained results with those of several commonly used constrained optimization methods for EAs. Based on the obtained results, particularly the outstanding performance of REPA on a test problem that presents significant difficulty for the other reviewed EAs, we conclude that the proposed method is useful and competitive. We discuss REPA parameter tuning for difficult problems and critically review some of the problems from the de facto standard test problem set for constrained optimization with EAs. In order to demonstrate the practical usefulness of the developed method, we study several problems of accelerator design and demonstrate how they can be solved with EAs. These problems include a simple accelerator design problem (design a quadrupole triplet to be stigmatically imaging and find all possible solutions), a complex real-life accelerator design problem (optimization of the front-end section for the future neutrino factory), and a problem of normal form defect function optimization, which is used to rigorously estimate the stability of beam dynamics in circular accelerators. The positive results we obtained suggest that the application of EAs to problems from accelerator theory can be very beneficial and has large potential. The developed optimization scenarios and tools can be used to approach similar problems.
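    A repair-based constrained EA in the spirit of this abstract can be sketched as follows. This is a generic repair-by-bisection toward a known feasible point, not the REFIND/REPROPT operators themselves; the quadratic test problem and all parameters are illustrative.

```python
import random

def repair(x, feasible_ref, is_feasible, steps=40):
    """Repair an infeasible point by bisecting the blend factor toward a
    known feasible reference point (a simple stand-in for dedicated
    repair operators such as those described in the thesis)."""
    if is_feasible(x):
        return x
    lo, hi = 0.0, 1.0  # blend = 1.0 reproduces feasible_ref itself
    for _ in range(steps):
        mid = 0.5 * (lo + hi)
        cand = [xi + mid * (ri - xi) for xi, ri in zip(x, feasible_ref)]
        if is_feasible(cand):
            hi = mid
        else:
            lo = mid
    return [xi + hi * (ri - xi) for xi, ri in zip(x, feasible_ref)]

# Toy constrained problem: minimize f subject to x0 + x1 <= 2
# (the unconstrained minimum at (2, 2) is infeasible; optimum is (1, 1), f = 2).
f = lambda x: (x[0] - 2.0) ** 2 + (x[1] - 2.0) ** 2
feasible = lambda x: x[0] + x[1] <= 2.0

rng = random.Random(3)
pop = [[rng.uniform(-3, 3), rng.uniform(-3, 3)] for _ in range(30)]
for _ in range(80):
    pop = [repair(p, [0.0, 0.0], feasible) for p in pop]  # enforce feasibility
    pop.sort(key=f)
    parents = pop[:10]                                    # truncation selection
    pop = parents + [[p[d] + rng.gauss(0.0, 0.2) for d in range(2)]
                     for p in parents for _ in range(2)]  # Gaussian mutation
best = min((repair(p, [0.0, 0.0], feasible) for p in pop), key=f)
```

    The appeal of repairing is visible even in this sketch: selection only ever compares feasible points, so no penalty weights need tuning.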

  14. Fundamental differences between optimization code test problems in engineering applications

    NASA Technical Reports Server (NTRS)

    Eason, E. D.

    1984-01-01

    The purpose here is to suggest that there is at least one fundamental difference between the problems used for testing optimization codes and the problems that engineers often need to solve; in particular, the level of precision that can be practically achieved in the numerical evaluation of the objective function, derivatives, and constraints. This difference affects the performance of optimization codes, as illustrated by two examples. Two classes of optimization problem were defined. Class One functions and constraints can be evaluated to a high precision that depends primarily on the word length of the computer. Class Two functions and/or constraints can only be evaluated to a moderate or a low level of precision for economic or modeling reasons, regardless of the computer word length. Optimization codes have not been adequately tested on Class Two problems. There are very few Class Two test problems in the literature, while there are literally hundreds of Class One test problems. The relative performance of two codes may be markedly different for Class One and Class Two problems. Less sophisticated direct search type codes may be less likely to be confused or to waste many function evaluations on Class Two problems. The analysis accuracy and minimization performance are related in a complex way that probably varies from code to code. On a problem where the analysis precision was varied over a range, the simple Hooke and Jeeves code was more efficient at low precision while the Powell code was more efficient at high precision.
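    The Class Two behavior is easy to reproduce in a few lines: a direct-search code in the style of Hooke and Jeeves only compares function values, so simulated analysis noise merely limits how far the step size can usefully shrink. A minimal sketch, with an assumed noise level standing in for limited analysis precision; this is an illustration of the paper's point, not its experiments.

```python
import random

def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-3, max_evals=2000):
    """Minimal Hooke-and-Jeeves-style pattern search. It only *compares*
    function values, which is why limited evaluation precision (a
    "Class Two" objective) degrades it gracefully."""
    x = list(x0)
    fx = f(x)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for d in range(len(x)):
            for s in (step, -step):
                y = list(x)
                y[d] += s
                fy = f(y)
                evals += 1
                if fy < fx:
                    x, fx, improved = y, fy, True
                    break
        if not improved:
            step *= shrink  # no exploratory move helped: refine the mesh
    return x, fx

# "Class Two" stand-in: a smooth quadratic plus simulated analysis noise.
rng = random.Random(4)
noisy = lambda z: (z[0] - 1.0) ** 2 + (z[1] + 2.0) ** 2 + rng.uniform(-1e-3, 1e-3)
x, fx = hooke_jeeves(noisy, [0.0, 0.0])
```

    Once the step size shrinks to where the true change in f is comparable to the noise amplitude, comparisons become unreliable and progress stalls, exactly the regime where gradient-based codes can fail outright.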

  15. Design of multi-energy fields coupling testing system of vertical axis wind power system

    NASA Astrophysics Data System (ADS)

    Chen, Q.; Yang, Z. X.; Li, G. S.; Song, L.; Ma, C.

    2016-08-01

    As wind is one of the renewable energy sources, the conversion efficiency of wind energy is a focus of research and concern. Present methods of enhancing the conversion efficiency mostly involve improving the wind rotor structure, optimizing the generator parameters and the energy storage controller, and so on. Because the conversion process involves energy conversion among multiple energy fields, such as wind energy, mechanical energy and electrical energy, the coupling effects between them influence the overall conversion efficiency. In this paper, using system integration analysis technology, a testing system based on multi-energy field coupling (MEFC) of a vertical axis wind power system is proposed. When the maximum efficiency of the wind rotor is satisfied, the system can match the generator parameters to the output performance of the wind rotor. The voltage controller can transfer the unstable electric power to the battery on the basis of optimizing parameters such as charging times and charging voltage. Through the communication connection and regulation of the upper computer system (UCS), the coupling parameters can be configured to an optimal state, improving the overall conversion efficiency. This method can test the whole wind turbine (WT) performance systematically and evaluate the design parameters effectively. It not only provides a testing method for system structure design and parameter optimization of the wind rotor, generator and voltage controller, but also a new testing method for whole-performance optimization of a vertical axis wind energy conversion system (WECS).

  16. Numerical and Experimental Validation of the Optimization Methodologies for a Wing-Tip Structure Equipped with Conventional and Morphing Ailerons

    NASA Astrophysics Data System (ADS)

    Koreanschi, Andreea

    In order to answer the problem of 'how to reduce the aerospace industry's environmental footprint?' new morphing technologies were developed. These technologies were aimed at reducing the aircraft's fuel consumption through reduction of the wing drag. The morphing concept used in the present research consists of replacing the conventional aluminium upper surface of the wing with a flexible composite skin for morphing abilities. For the ATR-42 'Morphing wing' project, the wing models were manufactured entirely from composite materials and the morphing region was optimized for flexibility. In this project two rigid wing models and an active morphing wing model were designed, manufactured and wind tunnel tested. For the CRIAQ MDO 505 project, a full scale wing-tip equipped with two types of ailerons, conventional and morphing, was designed, optimized, manufactured, bench and wind tunnel tested. The morphing concept was applied on a real wing internal structure and incorporated aerodynamic, structural and control constraints specific to a multidisciplinary approach. Numerical optimization, aerodynamic analysis and experimental validation were performed for both the CRIAQ MDO 505 full scale wing-tip demonstrator and the ATR-42 reduced scale wing models. In order to improve the aerodynamic performances of the ATR-42 and CRIAQ MDO 505 wing airfoils, three global optimization algorithms were developed, tested and compared: the genetic algorithm, the artificial bee colony and the gradient descent. The algorithms were coupled with the two-dimensional aerodynamic solver XFoil. XFoil is known for its rapid convergence, robustness and use of the semi-empirical e^N method for determining the position of the flow transition from laminar to turbulent. Based on the performance comparison between the algorithms, the genetic algorithm was chosen for the optimization of the ATR-42 and CRIAQ MDO 505 wing airfoils. 
The optimization algorithm was improved during the CRIAQ MDO 505 project for convergence speed by introducing a two-step cross-over function. Structural constraints were introduced in the algorithm at each aero-structural optimization iteration, allowing a better manipulation of the algorithm and giving it more capabilities of morphing combinations. The CRIAQ MDO 505 project envisioned a morphing aileron concept for the morphing upper surface wing. For this morphing aileron concept, two optimization methods were developed. The methods used the already developed genetic algorithm and each method had a different design concept. The first method was based on the morphing upper surface concept, using actuation points to achieve the desired shape. The second method was based on the hinge rotation concept of the conventional aileron but applied at multiple nodes along the aileron camber to achieve the desired shape. Both methods were constrained by manufacturing and aerodynamic requirements. The purpose of the morphing aileron methods was to obtain an aileron shape with a smoother pressure distribution gradient during deflection than the conventional aileron. The aerodynamic optimization results were used for the structural optimization and design of the wing, particularly the flexible composite skin. Due to the structural changes performed on the initial wing-tip structure, an aeroelastic behaviour analysis, focused more specifically on the flutter phenomenon, was performed. The analyses were done to ensure the structural integrity of the wing-tip demonstrator during wind tunnel tests. Three wind tunnel tests were performed for the CRIAQ MDO 505 wing-tip demonstrator at the IAR-NRC subsonic wind tunnel facility in Ottawa. The first two tests were performed for the wing-tip equipped with conventional aileron. 
The purpose of these tests was to validate the control system designed for the morphing upper surface, the numerical optimization and aerodynamic analysis and to evaluate the optimization efficiency on the boundary layer behaviour and the wing drag. The third set of wind tunnel tests was performed on the wing-tip equipped with a morphing aileron. The purpose of this test was to evaluate the performances of the morphing aileron, in conjunction with the active morphing upper surface, and their effect on the lift, drag and boundary layer behaviour. Transition data, obtained from Infrared Thermography, and pressure data, extracted from Kulite and pressure taps recordings, were used to validate the numerical optimization and aerodynamic performances of the wing-tip demonstrator. A set of wind tunnel tests was performed on the ATR-42 rigid wing models at the Price-Paidoussis subsonic wind tunnel at Ecole de technologie Superieure. The results from the pressure taps recordings were used to validate the numerical optimization. A second derivative of the pressure distribution method was applied to evaluate the transition region on the upper surface of the wing models for comparison with the numerical transition values. (Abstract shortened by ProQuest.).

  17. B-ALL minimal residual disease flow cytometry: an application of a novel method for optimization of a single-tube model.

    PubMed

    Shaver, Aaron C; Greig, Bruce W; Mosse, Claudio A; Seegmiller, Adam C

    2015-05-01

    Optimizing a clinical flow cytometry panel can be a subjective process dependent on experience. We develop a quantitative method to make this process more rigorous and apply it to B lymphoblastic leukemia/lymphoma (B-ALL) minimal residual disease (MRD) testing. We retrospectively analyzed our existing three-tube, seven-color B-ALL MRD panel and used our novel method to develop an optimized one-tube, eight-color panel, which was tested prospectively. The optimized one-tube, eight-color panel resulted in greater efficiency of time and resources with no loss in diagnostic power. Constructing a flow cytometry panel using a rigorous, objective, quantitative method permits optimization and avoids problems of interdependence and redundancy in a large, multiantigen panel.

  18. Collagen gel droplet-embedded culture drug sensitivity testing in squamous cell carcinoma cell lines derived from human oral cancers: Optimal contact concentrations of cisplatin and fluorouracil.

    PubMed

    Sakuma, Kaname; Tanaka, Akira; Mataga, Izumi

    2016-12-01

    The collagen gel droplet-embedded culture drug sensitivity test (CD-DST) is an anticancer drug sensitivity test that uses a method of three-dimensional culture of extremely small samples, and it is suited to primary cultures of human cancer cells. It is a useful method for oral squamous cell carcinoma (OSCC), in which the cancer tissues available for testing are limited. However, since the optimal contact concentrations of anticancer drugs have yet to be established in OSCC, CD-DST for detecting drug sensitivities of OSCC is currently performed by applying the optimal contact concentrations for stomach cancer. In the present study, squamous carcinoma cell lines from human oral cancer were used to investigate the optimal contact concentrations of cisplatin (CDDP) and fluorouracil (5-FU) during CD-DST for OSCC. CD-DST was performed in 7 squamous cell carcinoma cell lines derived from human oral cancers (Ca9-22, HSC-3, HSC-4, HO-1-N-1, KON, OSC-19 and SAS) using CDDP (0.15, 0.3, 1.25, 2.5, 5.0 and 10.0 µg/ml) and 5-FU (0.4, 0.9, 1.8, 3.8, 7.5, 15.0 and 30.0 µg/ml), and the optimal contact concentrations were calculated from the clinical response rate of OSCC to single-drug treatment and the in vitro efficacy rate curve. The optimal concentrations were 0.5 µg/ml for CDDP and 0.7 µg/ml for 5-FU. The antitumor efficacy of CDDP at this optimal contact concentration in CD-DST was compared to the antitumor efficacy in the nude mouse method. The T/C values, which were calculated as the ratio of the colony volume of the treatment group and the colony volume of the control group, at the optimal contact concentration of CDDP and of the nude mouse method were almost in agreement (P<0.05) and predicted clinical efficacy, indicating that the calculated optimal contact concentration is valid. 
Therefore, chemotherapy for OSCC based on anticancer drug sensitivity tests offers patients a greater freedom of choice and is likely to assume a greater importance in the selection of treatment from the perspectives of function preservation and quality of life, as well as representing a treatment option for unresectable, intractable or recurrent cases.

  19. Recent advances in stellarator optimization

    DOE PAGES

    Gates, D. A.; Boozer, A. H.; Brown, T.; ...

    2017-10-27

    Computational optimization has revolutionized the field of stellarator design. To date, optimizations have focused primarily on neoclassical confinement and ideal MHD stability, although limited optimization of other parameters has also been performed. Here, we outline a select set of new concepts for stellarator optimization that, when taken as a group, present a significant step forward in the stellarator concept. One of the criticisms that has been leveled at existing methods of design is the complexity of the resultant field coils. Recently, a new coil optimization code, COILOPT++, which uses a spline instead of a Fourier representation of the coils, was written and included in the STELLOPT suite of codes. The advantage of this method is that it allows the addition of real-space constraints on the locations of the coils. The code has been tested by generating coil designs for optimized quasi-axisymmetric stellarator plasma configurations of different aspect ratios. As an initial exercise, a constraint that the windings be vertical was placed on the large-major-radius half of the non-planar coils. Further constraints were also imposed that guaranteed that sector blanket modules could be removed from between the coils, enabling a sector maintenance scheme. Results of this exercise will be presented. New ideas on methods for the optimization of turbulent transport have garnered much attention, since these methods have led to design concepts that are calculated to have reduced turbulent heat loss. We have explored possibilities for generating an experimental database to test whether the reduction in transport that is predicted is consistent with experimental observations. Thus, a series of equilibria that can be made in the now latent QUASAR experiment have been identified that will test the predicted transport scalings. Fast particle confinement studies aimed at developing a generalized optimization algorithm are also discussed. 
A new algorithm developed for the design of the scraper element on W7-X is presented along with ideas for automating the optimization approach.

  20. Recent advances in stellarator optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gates, D. A.; Boozer, A. H.; Brown, T.

    Computational optimization has revolutionized the field of stellarator design. To date, optimizations have focused primarily on neoclassical confinement and ideal MHD stability, although limited optimization of other parameters has also been performed. Here, we outline a select set of new concepts for stellarator optimization that, when taken as a group, present a significant step forward in the stellarator concept. One of the criticisms that has been leveled at existing methods of design is the complexity of the resultant field coils. Recently, a new coil optimization code, COILOPT++, which uses a spline instead of a Fourier representation of the coils, was written and included in the STELLOPT suite of codes. The advantage of this method is that it allows the addition of real-space constraints on the locations of the coils. The code has been tested by generating coil designs for optimized quasi-axisymmetric stellarator plasma configurations of different aspect ratios. As an initial exercise, a constraint that the windings be vertical was placed on the large-major-radius half of the non-planar coils. Further constraints were also imposed that guaranteed that sector blanket modules could be removed from between the coils, enabling a sector maintenance scheme. Results of this exercise will be presented. New ideas on methods for the optimization of turbulent transport have garnered much attention, since these methods have led to design concepts that are calculated to have reduced turbulent heat loss. We have explored possibilities for generating an experimental database to test whether the reduction in transport that is predicted is consistent with experimental observations. Thus, a series of equilibria that can be made in the now latent QUASAR experiment have been identified that will test the predicted transport scalings. Fast particle confinement studies aimed at developing a generalized optimization algorithm are also discussed. 
A new algorithm developed for the design of the scraper element on W7-X is presented along with ideas for automating the optimization approach.

  1. Wing-section optimization for supersonic viscous flow

    NASA Technical Reports Server (NTRS)

    Item, Cem C.; Baysal, Oktay (Editor)

    1995-01-01

    To improve the shape of a supersonic wing, an automated method that also includes higher fidelity to the flow physics is desirable. With this impetus, an aerodynamic optimization methodology incorporating thin-layer Navier-Stokes equations and sensitivity analysis had been previously developed. Prior to embarking upon the wing design task, the present investigation concentrated on testing the feasibility of the methodology, and the identification of adequate problem formulations, by defining two-dimensional, cost-effective test cases. Starting with two distinctly different initial airfoils, two independent shape optimizations resulted in shapes with similar features: slightly cambered, parabolic profiles with sharp leading and trailing edges. Secondly, the normal section to the subsonic portion of the leading edge, which had a high normal angle-of-attack, was considered. The optimization resulted in a shape with twist and camber which eliminated the adverse pressure gradient, hence exploiting the leading-edge thrust. The wing section shapes obtained in all the test cases had the features predicted by previous studies. Therefore, it was concluded that the flowfield analyses and sensitivity coefficients were computed and fed to the present gradient-based optimizer correctly. Also, as a result of the present two-dimensional study, suggestions were made for the problem formulations which should contribute to an effective wing shape optimization.

  2. Experimental Optimization of a Free-to-Rotate Wing for Small UAS

    NASA Technical Reports Server (NTRS)

    Logan, Michael J.; DeLoach, Richard; Copeland, Tiwana; Vo, Steven

    2014-01-01

This paper discusses an experimental investigation conducted to optimize a free-to-rotate wing for use on a small unmanned aircraft system (UAS). Although free-to-rotate wings have been used for decades on various small UAS and small manned aircraft, little is known about how to optimize these unusual wings for a specific application. The paper discusses some of the design rationale of the basic wing. In addition, three main parameters were selected for "optimization": wing camber, wing pivot location, and wing center of gravity (c.g.) location. A small apparatus was constructed to enable some simple experimental analysis of these parameters. A design-of-experiments series of tests was first conducted to discern which of the main optimization parameters were most likely to have the greatest impact on the outputs of interest, namely, some measure of "stability", some measure of the lift being generated at the neutral position, and how quickly the wing "recovers" from an upset. A second set of tests was conducted to develop a response-surface numerical representation of these outputs as functions of the three primary inputs. The response-surface numerical representations are then used to develop an "optimum" within the trade space investigated. The results of the optimization are then tested experimentally to validate the predictions.
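The two-stage DOE/response-surface procedure can be sketched as follows. This is a hedged stand-in, not the paper's apparatus: the "stability" response below is a synthetic quadratic in two hypothetical coded inputs (camber and pivot location), and the fitted surface's stationary point plays the role of the "optimum" within the trade space.

```python
import numpy as np

def response(camber, pivot):
    # hypothetical stand-in for a measured stability metric (invented)
    return -(camber - 0.4) ** 2 - 2.0 * (pivot - 0.25) ** 2 + 1.0

# 3-level full-factorial design over the two coded inputs
levels = np.linspace(0.0, 1.0, 3)
X = np.array([(c, p) for c in levels for p in levels])
y = np.array([response(c, p) for c, p in X])

# quadratic model  y ~ b0 + b1*c + b2*p + b3*c^2 + b4*p^2 + b5*c*p
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1],
                     X[:, 0] ** 2, X[:, 1] ** 2, X[:, 0] * X[:, 1]])
b, *_ = np.linalg.lstsq(A, y, rcond=None)

# stationary point of the fitted quadratic: solve grad = 0
H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
opt = np.linalg.solve(H, -b[1:3])
print("fitted optimum (camber, pivot):", opt)
```

Because the synthetic response is itself quadratic, the least-squares fit is exact and the stationary point lands on the true optimum; with noisy experimental data the same machinery returns the optimum of the fitted surface instead.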

  3. Two Point Exponential Approximation Method for structural optimization of problems with frequency constraints

    NASA Technical Reports Server (NTRS)

    Fadel, G. M.

    1991-01-01

The two point exponential approximation method was introduced by Fadel et al. (Fadel, 1990) and tested on structural optimization problems with stress and displacement constraints. Results reported in earlier papers were promising, and the method, which consists of correcting Taylor series approximations using previous design history, is tested in this paper on optimization problems with frequency constraints. The aim of the research is to verify the robustness and speed of convergence of the two point exponential approximation method when highly non-linear constraints are used.
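The correction idea can be made concrete. The sketch below implements a one-dimensional form of the two point exponential approximation as commonly described in the literature (the per-variable exponent is fitted from gradients at the current and previous design points); it is an illustration, not the authors' code. For a pure power law the fitted exponent recovers the true power and the approximation becomes exact.

```python
import numpy as np

def tpea(g1, dg1, x1, dg0, x0):
    """Build an approximation of g around the current point x1, using
    gradient information at the previous point x0 to fit the exponent p.
    When p = 1 this reduces to the ordinary linear Taylor expansion."""
    p = 1.0 + np.log(dg0 / dg1) / np.log(x0 / x1)
    def approx(x):
        return g1 + (x ** p - x1 ** p) * x1 ** (1.0 - p) / p * dg1
    return approx, p

# For g(x) = x^3, gradients at x0 = 1 and x1 = 2 give exponent p = 3,
# and the two-point model reproduces g exactly.
g = lambda x: x ** 3
dg = lambda x: 3.0 * x ** 2
model, p = tpea(g(2.0), dg(2.0), 2.0, dg(1.0), 1.0)
print(p, model(1.5))
```

This exactness on power-law behavior is the point of the method: many structural responses (stresses, displacements, frequencies versus sizing variables) are nearly power laws, so the fitted exponent captures them far better than a fixed-order Taylor series.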

  4. Optimal control of thermally coupled Navier Stokes equations

    NASA Technical Reports Server (NTRS)

    Ito, Kazufumi; Scroggs, Jeffrey S.; Tran, Hien T.

    1994-01-01

The optimal boundary temperature control of the stationary, thermally coupled, incompressible Navier-Stokes equations is considered. Well-posedness and existence of the optimal control and a necessary optimality condition are obtained. Optimization algorithms based on the augmented Lagrangian method with a second-order update are discussed. A test example motivated by control of the transport process in a high pressure vapor transport (HPVT) reactor is presented to demonstrate the applicability of our theoretical results and proposed algorithm.
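A minimal augmented-Lagrangian loop illustrates the structure of such algorithms, assuming a toy finite-dimensional problem in place of the thermally coupled PDE system, and a plain first-order multiplier update in place of the paper's second-order one.

```python
import numpy as np
from scipy.optimize import minimize

# Toy problem: minimize x0^2 + x1^2 subject to x0 + x1 = 1.
# Augmented Lagrangian: L(x) = f(x) + lam*c(x) + (mu/2)*c(x)^2,
# alternating an inner unconstrained solve with a multiplier update.

f = lambda x: x[0] ** 2 + x[1] ** 2
c = lambda x: x[0] + x[1] - 1.0

lam, mu, x = 0.0, 10.0, np.zeros(2)
for _ in range(20):
    aug = lambda z: f(z) + lam * c(z) + 0.5 * mu * c(z) ** 2
    x = minimize(aug, x).x          # inner unconstrained minimization
    lam += mu * c(x)                # first-order multiplier update
print(x)                            # tends to the optimum (0.5, 0.5)
```

The multiplier converges to the true Lagrange multiplier (-1 here) without driving the penalty parameter to infinity, which is the practical advantage of augmented Lagrangian methods over pure penalty methods; a second-order update as in the paper accelerates exactly this outer iteration.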

  5. Developing a Shuffled Complex-Self Adaptive Hybrid Evolution (SC-SAHEL) Framework for Water Resources Management and Water-Energy System Optimization

    NASA Astrophysics Data System (ADS)

    Rahnamay Naeini, M.; Sadegh, M.; AghaKouchak, A.; Hsu, K. L.; Sorooshian, S.; Yang, T.

    2017-12-01

Meta-heuristic optimization algorithms have gained a great deal of attention in a wide variety of fields. The simplicity and flexibility of these algorithms, along with their robustness, make them attractive tools for solving optimization problems. Different optimization methods, however, have algorithm-specific strengths and limitations. The performance of each individual algorithm obeys the "No-Free-Lunch" theorem: no single algorithm can consistently outperform all others across all possible optimization problems. From the user's perspective, it is a tedious process to compare, validate, and select the best-performing algorithm for a specific problem or a set of test cases. In this study, we introduce a new hybrid optimization framework, entitled Shuffled Complex-Self Adaptive Hybrid EvoLution (SC-SAHEL), which combines the strengths of different evolutionary algorithms (EAs) in a parallel computing scheme and allows users to select the most suitable algorithm tailored to the problem at hand. The concept of SC-SAHEL is to execute different EAs as separate parallel search cores, and to let all participating EAs compete during the course of the search. The newly developed SC-SAHEL algorithm automatically selects the best-performing algorithm for the given optimization problem. It is effective in finding the global optimum for several strenuous benchmark test functions, and computationally efficient compared to individual EAs. We benchmark the proposed SC-SAHEL algorithm over 29 conceptual test functions and two real-world case studies - one hydropower reservoir model and one hydrological model (SAC-SMA). Results show that the proposed framework outperforms individual EAs in an absolute majority of the test problems, and can provide results competitive with the fittest EA algorithm, along with more comprehensive information during the search. 
The proposed framework is also flexible for merging additional EAs, boundary-handling techniques, and sampling schemes, and has good potential to be used in Water-Energy system optimal operation and management.
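The competing-cores idea can be sketched with stand-in "EAs". Below, two hill climbers with different step sizes play the role of the participating algorithms, and the framework shifts the evaluation budget toward whichever core is currently winning; everything here is an illustrative assumption, not the SC-SAHEL implementation.

```python
import random

# Toy competition between two search "cores" on the sphere benchmark:
# each generation, every core spends its budget of improvement-only moves,
# and the winner is granted a larger share of the next generation's budget.

random.seed(1)
sphere = lambda x: sum(v * v for v in x)          # benchmark function

def evolve(x, step):                              # one core's update rule
    cand = [v + random.uniform(-step, step) for v in x]
    return cand if sphere(cand) < sphere(x) else x

best = [random.uniform(-5, 5) for _ in range(2)]
budget = {0.5: 10, 0.05: 10}                      # evaluations per core
for gen in range(200):
    scores = {}
    for step, n in budget.items():
        x = best
        for _ in range(n):
            x = evolve(x, step)
        scores[step] = sphere(x)
        if sphere(x) < sphere(best):
            best = x
    winner = min(scores, key=scores.get)          # competition step
    budget = {s: (14 if s == winner else 6) for s in budget}
print(sphere(best))                               # approaches 0
```

The design choice mirrored here is the one the abstract describes: rather than picking one algorithm up front, the framework lets the search itself reveal which core suits the problem, coarse moves dominating early and fine moves late.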

  6. MEMS resonant load cells for micro-mechanical test frames: feasibility study and optimal design

    NASA Astrophysics Data System (ADS)

    Torrents, A.; Azgin, K.; Godfrey, S. W.; Topalli, E. S.; Akin, T.; Valdevit, L.

    2010-12-01

This paper presents the design, optimization and manufacturing of a novel micro-fabricated load cell based on a double-ended tuning fork. The device geometry and operating voltages are optimized for maximum force resolution and range, subject to a number of manufacturing and electromechanical constraints. All optimizations are enabled by analytical modeling (verified by selected finite element analyses) coupled with an efficient C++ code based on the particle swarm optimization algorithm. This assessment indicates that force resolutions of ~0.5-10 nN are feasible in vacuum (~1-50 mTorr), with force ranges as large as 1 N. Importantly, the optimal design for vacuum operation is independent of the desired range, ensuring versatility. Experimental verifications on a sub-optimal device fabricated using silicon-on-glass technology demonstrate a resolution of ~23 nN at a vacuum level of ~50 mTorr. The device demonstrated in this article will be integrated in a hybrid micro-mechanical test frame for unprecedented combinations of force resolution and range, displacement resolution and range, optical (or SEM) access to the sample, versatility and cost.
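A bare-bones particle swarm optimizer of the kind mentioned can be written compactly; the objective and coefficients below are generic textbook values, not the load-cell design model.

```python
import numpy as np

# Standard global-best PSO: each particle is pulled toward its personal best
# and the swarm's global best, with inertia w damping the velocity.

rng = np.random.default_rng(0)
f = lambda x: np.sum(x ** 2, axis=-1)             # stand-in objective

n, dim, w, c1, c2 = 30, 2, 0.7, 1.5, 1.5
x = rng.uniform(-5, 5, (n, dim))                  # positions
v = np.zeros((n, dim))                            # velocities
pbest, pval = x.copy(), f(x)                      # personal bests
g = pbest[np.argmin(pval)]                        # global best

for _ in range(100):
    r1, r2 = rng.random((2, n, dim))
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
    x = x + v
    fx = f(x)
    improved = fx < pval
    pbest[improved], pval[improved] = x[improved], fx[improved]
    g = pbest[np.argmin(pval)]
print(f(g))                                       # near 0
```

PSO needs no gradients, which is why it pairs well with analytical or black-box device models as in the paper; the cost is many objective evaluations, which cheap analytical models can afford.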

  7. Multi-level optimization of a beam-like space truss utilizing a continuum model

    NASA Technical Reports Server (NTRS)

    Yates, K.; Gurdal, Z.; Thangjitham, S.

    1992-01-01

    A continuous beam model is developed for approximate analysis of a large, slender, beam-like truss. The model is incorporated in a multi-level optimization scheme for the weight minimization of such trusses. This scheme is tested against traditional optimization procedures for savings in computational cost. Results from both optimization methods are presented for comparison.

  8. Testing the sensitivity of pumpage to increases in surficial aquifer system heads in the Cypress Creek well-field area, West-Central Florida : an optimization technique

    USGS Publications Warehouse

    Yobbi, Dann K.

    2002-01-01

    Tampa Bay depends on ground water for most of the water supply. Numerous wetlands and lakes in Pasco County have been impacted by the high demand for ground water. Central Pasco County, particularly the area within the Cypress Creek well field, has been greatly affected. Probable causes for the decline in surface-water levels are well-field pumpage and a decade-long drought. Efforts are underway to increase surface-water levels by developing alternative sources of water supply, thus reducing the quantity of well-field pumpage. Numerical ground-water flow simulations coupled with an optimization routine were used in a series of simulations to test the sensitivity of optimal pumpage to desired increases in surficial aquifer system heads in the Cypress Creek well field. The ground-water system was simulated using the central northern Tampa Bay ground-water flow model. Pumping solutions for 1987 equilibrium conditions and for a transient 6-month timeframe were determined for five test cases, each reflecting a range of desired target recovery heads at different head control sites in the surficial aquifer system. Results are presented in the form of curves relating average head recovery to total optimal pumpage. Pumping solutions are sensitive to the location of head control sites formulated in the optimization problem and as expected, total optimal pumpage decreased when desired target head increased. The distribution of optimal pumpage for individual production wells also was significantly affected by the location of head control sites. A pumping advantage was gained for test-case formulations where hydraulic heads were maximized in cells near the production wells, in cells within the steady-state pumping center cone of depression, and in cells within the area of the well field where confining-unit leakance is the highest. 
More water was pumped and the ratio of head recovery per unit decrease in optimal pumpage was more than double for test cases where hydraulic heads are maximized in cells located at or near the production wells. Additionally, the ratio of head recovery per unit decrease in pumpage was about three times more for the area where confining-unit leakance is the highest than for other leakance zone areas of the well field. For many head control sites, optimal heads corresponding to optimal pumpage deviated from the desired target recovery heads. Overall, pumping solutions were constrained by the limiting recovery values, initial head conditions, and by upper boundary conditions of the ground-water flow model.
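A common reduced form of such coupled simulation-optimization problems is a linear "response matrix" program: drawdown at each head-control site is treated as linear in well pumpage, and total pumpage is maximized subject to drawdown limits. The sketch below uses invented numbers, not the Cypress Creek model.

```python
import numpy as np
from scipy.optimize import linprog

# Response-matrix LP stand-in: rows are head-control sites, columns are
# production wells; R[i, j] is drawdown at site i per unit pumpage at well j.
# All coefficients are illustrative assumptions.

R = np.array([[0.8, 0.2],
              [0.3, 0.7],
              [0.5, 0.5]])
allowed = np.array([4.0, 4.0, 4.5])  # drawdown limits implied by target heads

# maximize total pumpage q1 + q2  <=>  minimize -(q1 + q2)
res = linprog(c=[-1.0, -1.0], A_ub=R, b_ub=allowed, bounds=[(0, None)] * 2)
print(res.x, -res.fun)               # optimal pumpage per well and total
```

The study's sensitivity findings map directly onto this structure: moving a head-control site changes which rows of the response matrix bind, which is why optimal pumpage depends so strongly on control-site placement.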

  9. Optimal Discrete Event Supervisory Control of Aircraft Gas Turbine Engines

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan (Technical Monitor); Ray, Asok

    2004-01-01

    This report presents an application of the recently developed theory of optimal Discrete Event Supervisory (DES) control that is based on a signed real measure of regular languages. The DES control techniques are validated on an aircraft gas turbine engine simulation test bed. The test bed is implemented on a networked computer system in which two computers operate in the client-server mode. Several DES controllers have been tested for engine performance and reliability.

  10. An Evaluation of the Sniffer Global Optimization Algorithm Using Standard Test Functions

    NASA Astrophysics Data System (ADS)

    Butler, Roger A. R.; Slaminka, Edward E.

    1992-03-01

    The performance of Sniffer—a new global optimization algorithm—is compared with that of Simulated Annealing. Using the number of function evaluations as a measure of efficiency, the new algorithm is shown to be significantly better at finding the global minimum of seven standard test functions. Several of the test functions used have many local minima and very steep walls surrounding the global minimum. Such functions are intended to thwart global minimization algorithms.
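The comparison baseline can be illustrated with a plain simulated-annealing run on a standard multimodal test function (1-D Rastrigin, which has many local minima and steep walls around the global minimum), counting function evaluations as the paper's efficiency measure. Parameters and the cooling schedule are generic choices, not those of the study.

```python
import math
import random

# Metropolis-style simulated annealing with geometric cooling; the
# evaluation counter mirrors the efficiency metric used in the comparison.

random.seed(4)
evals = 0
def rastrigin(x):
    global evals
    evals += 1
    return x * x - 10.0 * math.cos(2.0 * math.pi * x) + 10.0

x = random.uniform(-5.12, 5.12)
fx = rastrigin(x)
best_x, best_f = x, fx
T = 10.0
for _ in range(4000):
    cand = x + random.gauss(0.0, 0.5)
    fc = rastrigin(cand)
    if fc < fx or random.random() < math.exp((fx - fc) / T):
        x, fx = cand, fc
        if fx < best_f:
            best_x, best_f = x, fx
    T *= 0.999                       # geometric cooling schedule
print(best_x, best_f, evals)         # global minimum is at x = 0
```

Comparing algorithms by evaluations-to-minimum, as the paper does, rewards methods that need fewer probes of such deceptive landscapes rather than methods that are merely fast per step.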

  11. OPTIMIZATION OF MUD HAMMER DRILLING PERFORMANCE - A PROGRAM TO BENCHMARK THE VIABILITY OF ADVANCED MUD HAMMER DRILLING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnis Judzis

    2003-01-01

This document details the progress to date on the ''OPTIMIZATION OF MUD HAMMER DRILLING PERFORMANCE -- A PROGRAM TO BENCHMARK THE VIABILITY OF ADVANCED MUD HAMMER DRILLING'' contract for the quarter starting October 2002 through December 2002. Even though we are awaiting the optimization portion of the testing program, accomplishments included the following: (1) Smith International participated in the DOE Mud Hammer program through full scale benchmarking testing during the week of 4 November 2003. (2) TerraTek acknowledges Smith International, BP America, PDVSA, and ConocoPhillips for cost-sharing the Smith benchmarking tests, allowing extension of the contract to add to the benchmarking testing program. (3) Following the benchmark testing of the Smith International hammer, representatives from DOE/NETL, TerraTek, Smith International and PDVSA met at TerraTek in Salt Lake City to review observations, performance and views on the optimization step for 2003. (4) The December 2002 issue of Journal of Petroleum Technology (Society of Petroleum Engineers) highlighted the DOE fluid hammer testing program and reviewed last year's paper on the benchmark performance of the SDS Digger and Novatek hammers. (5) TerraTek's Sid Green presented a technical review for DOE/NETL personnel in Morgantown on ''Impact Rock Breakage'' and its importance on improving fluid hammer performance. Much discussion has taken place on the issues surrounding mud hammer performance at depth conditions.

  12. A Confirmatory Factor Analysis of the Life Orientation Test-Revised with Competitive Athletes

    ERIC Educational Resources Information Center

    Appaneal, Renee N.

    2012-01-01

    Current reviews outside of sport indicate that the Life Orientation Test-Revised (LOT-R) items load on two separate factors (optimism and pessimism) and, therefore, should be treated as independent constructs. However, researchers in the sport sciences continue to use the single composite score reflecting a unidimensional definition of optimism.…

  13. Age-Related Differences in Goals: Testing Predictions from Selection, Optimization, and Compensation Theory and Socioemotional Selectivity Theory

    ERIC Educational Resources Information Center

    Penningroth, Suzanna L.; Scott, Walter D.

    2012-01-01

    Two prominent theories of lifespan development, socioemotional selectivity theory and selection, optimization, and compensation theory, make similar predictions for differences in the goal representations of younger and older adults. Our purpose was to test whether the goals of younger and older adults differed in ways predicted by these two…

  14. Optimal Scoring Methods of Hand-Strength Tests in Patients with Stroke

    ERIC Educational Resources Information Center

    Huang, Sheau-Ling; Hsieh, Ching-Lin; Lin, Jau-Hong; Chen, Hui-Mei

    2011-01-01

    The purpose of this study was to determine the optimal scoring methods for measuring strength of the more-affected hand in patients with stroke by examining the effect of reducing measurement errors. Three hand-strength tests of grip, palmar pinch, and lateral pinch were administered at two sessions in 56 patients with stroke. Five scoring methods…

  15. In Search of Optimal Cognitive Diagnostic Model(s) for ESL Grammar Test Data

    ERIC Educational Resources Information Center

    Yi, Yeon-Sook

    2017-01-01

    This study compares five cognitive diagnostic models in search of optimal one(s) for English as a Second Language grammar test data. Using a unified modeling framework that can represent specific models with proper constraints, the article first fit the full model (the log-linear cognitive diagnostic model, LCDM) and investigated which model…

  16. Optimization of Equation of State and Burn Model Parameters for Explosives

    NASA Astrophysics Data System (ADS)

    Bergh, Magnus; Wedberg, Rasmus; Lundgren, Jonas

    2017-06-01

    A reactive burn model implemented in a multi-dimensional hydrocode can be a powerful tool for predicting non-ideal effects as well as initiation phenomena in explosives. Calibration against experiment is, however, critical and non-trivial. Here, a procedure is presented for calibrating the Ignition and Growth Model utilizing hydrocode simulation in conjunction with the optimization program LS-OPT. The model is applied to the explosive PBXN-109. First, a cylinder expansion test is presented together with a new automatic routine for product equation of state calibration. Secondly, rate stick tests and instrumented gap tests are presented. Data from these experiments are used to calibrate burn model parameters. Finally, we discuss the applicability and development of this optimization routine.

  17. TestSTORM: Simulator for optimizing sample labeling and image acquisition in localization based super-resolution microscopy

    PubMed Central

    Sinkó, József; Kákonyi, Róbert; Rees, Eric; Metcalf, Daniel; Knight, Alex E.; Kaminski, Clemens F.; Szabó, Gábor; Erdélyi, Miklós

    2014-01-01

Localization-based super-resolution microscopy image quality depends on several factors: the choice of dye and labeling strategy, the quality of the microscope, user-defined parameters such as frame rate and frame count, and the image-processing algorithm. Experimental optimization of these parameters can be time-consuming and expensive, so we present TestSTORM, a simulator that can be used to optimize these steps. TestSTORM users can select from among four different structures with specific patterns, dye and acquisition parameters. Example results are shown, and the results of the vesicle pattern are compared with experimental data. Moreover, image stacks can be generated for further evaluation using localization algorithms, offering a tool for further software development. PMID:24688813

  18. Genetic evolutionary taboo search for optimal marker placement in infrared patient setup

    NASA Astrophysics Data System (ADS)

    Riboldi, M.; Baroni, G.; Spadea, M. F.; Tagaste, B.; Garibaldi, C.; Cambria, R.; Orecchia, R.; Pedotti, A.

    2007-09-01

In infrared patient setup, adequate selection of the external fiducial configuration is required for compensating inner target displacements (target registration error, TRE). Genetic algorithms (GA) and taboo search (TS) were applied in a newly designed approach to optimal marker placement: the genetic evolutionary taboo search (GETS) algorithm. In the GETS paradigm, multiple solutions are simultaneously tested in a stochastic evolutionary scheme, where taboo-based decision making and adaptive memory guide the optimization process. The GETS algorithm was tested on a group of ten prostate patients and compared to standard optimization and to randomly selected configurations. The changes in the optimal marker configuration, when TRE is minimized for organs at risk (OARs), were specifically examined. Optimal GETS configurations ensured a 26.5% mean decrease in the TRE value, versus 19.4% for conventional quasi-Newton optimization. Common features in GETS marker configurations were highlighted in the dataset of ten patients, even when multiple runs of the stochastic algorithm were performed. Including OARs in TRE minimization did not considerably affect the spatial distribution of GETS marker configurations. In conclusion, the GETS algorithm proved to be highly effective in solving the optimal marker placement problem. Further work is needed to embed site-specific deformation models in the optimization process.

  19. An Efficacious Multi-Objective Fuzzy Linear Programming Approach for Optimal Power Flow Considering Distributed Generation.

    PubMed

    Warid, Warid; Hizam, Hashim; Mariun, Norman; Abdul-Wahab, Noor Izzri

    2016-01-01

This paper proposes a new formulation for the multi-objective optimal power flow (MOOPF) problem for meshed power networks considering distributed generation. An efficacious multi-objective fuzzy linear programming optimization (MFLP) algorithm is proposed to solve the aforementioned problem with and without considering the distributed generation (DG) effect. A variant combination of objectives is considered for simultaneous optimization, including power loss, voltage stability, and shunt capacitors MVAR reserve. Fuzzy membership functions for these objectives are designed with extreme targets, whereas the inequality constraints are treated as hard constraints. The multi-objective fuzzy optimal power flow (OPF) formulation was converted into a crisp OPF in a successive linear programming (SLP) framework and solved using an efficient interior point method (IPM). To test the efficacy of the proposed approach, simulations are performed on the IEEE 30-bus and IEEE 118-bus test systems. The MFLP optimization is solved for several optimization cases. The obtained results are compared with those presented in the literature. A unique solution with a high satisfaction for the assigned targets is gained. Results demonstrate the effectiveness of the proposed MFLP technique in terms of solution optimality and rapid convergence. Moreover, the results indicate that using the optimal DG location with the MFLP algorithm provides the solution with the highest quality.
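The fuzzy max-min device at the heart of such formulations fits in a few lines: each objective gets a linear membership function (1 at its target, 0 at its worst acceptable value), and an LP maximizes the smallest membership. The two-objective toy below is an assumption for illustration, not an optimal-power-flow model.

```python
import numpy as np
from scipy.optimize import linprog

# Variables: x1, x2 (a resource split with x1 + x2 = 1) and s, the overall
# satisfaction level. Memberships: mu1 = 1 - x1, mu2 = 1 - x2.
# Max-min formulation: maximize s subject to s <= mu1 and s <= mu2.

c = [0.0, 0.0, -1.0]                         # maximize s
A_ub = [[1.0, 0.0, 1.0],                     # x1 + s <= 1  (s <= mu1)
        [0.0, 1.0, 1.0]]                     # x2 + s <= 1  (s <= mu2)
A_eq, b_eq = [[1.0, 1.0, 0.0]], [1.0]
res = linprog(c, A_ub=A_ub, b_ub=[1.0, 1.0], A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * 3)
print(res.x)                                 # balanced split, satisfaction 0.5
```

This is the same "crisp LP" conversion the abstract describes: the fuzzy multi-objective problem becomes a single-objective LP in the satisfaction variable, which a standard solver (here HiGHS via `linprog`, an interior point method in the paper) handles directly.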

  20. An Efficacious Multi-Objective Fuzzy Linear Programming Approach for Optimal Power Flow Considering Distributed Generation

    PubMed Central

    Warid, Warid; Hizam, Hashim; Mariun, Norman; Abdul-Wahab, Noor Izzri

    2016-01-01

This paper proposes a new formulation for the multi-objective optimal power flow (MOOPF) problem for meshed power networks considering distributed generation. An efficacious multi-objective fuzzy linear programming optimization (MFLP) algorithm is proposed to solve the aforementioned problem with and without considering the distributed generation (DG) effect. A variant combination of objectives is considered for simultaneous optimization, including power loss, voltage stability, and shunt capacitors MVAR reserve. Fuzzy membership functions for these objectives are designed with extreme targets, whereas the inequality constraints are treated as hard constraints. The multi-objective fuzzy optimal power flow (OPF) formulation was converted into a crisp OPF in a successive linear programming (SLP) framework and solved using an efficient interior point method (IPM). To test the efficacy of the proposed approach, simulations are performed on the IEEE 30-bus and IEEE 118-bus test systems. The MFLP optimization is solved for several optimization cases. The obtained results are compared with those presented in the literature. A unique solution with a high satisfaction for the assigned targets is gained. Results demonstrate the effectiveness of the proposed MFLP technique in terms of solution optimality and rapid convergence. Moreover, the results indicate that using the optimal DG location with the MFLP algorithm provides the solution with the highest quality. PMID:26954783

  1. Optimizing the design of a reproduction toxicity test with the pond snail Lymnaea stagnalis.

    PubMed

    Charles, Sandrine; Ducrot, Virginie; Azam, Didier; Benstead, Rachel; Brettschneider, Denise; De Schamphelaere, Karel; Filipe Goncalves, Sandra; Green, John W; Holbech, Henrik; Hutchinson, Thomas H; Faber, Daniel; Laranjeiro, Filipe; Matthiessen, Peter; Norrgren, Leif; Oehlmann, Jörg; Reategui-Zirena, Evelyn; Seeland-Fremer, Anne; Teigeler, Matthias; Thome, Jean-Pierre; Tobor Kaplon, Marysia; Weltje, Lennart; Lagadic, Laurent

    2016-11-01

    This paper presents the results from two ring-tests addressing the feasibility, robustness and reproducibility of a reproduction toxicity test with the freshwater gastropod Lymnaea stagnalis (RENILYS strain). Sixteen laboratories (from inexperienced to expert laboratories in mollusc testing) from nine countries participated in these ring-tests. Survival and reproduction were evaluated in L. stagnalis exposed to cadmium, tributyltin, prochloraz and trenbolone according to an OECD draft Test Guideline. In total, 49 datasets were analysed to assess the practicability of the proposed experimental protocol, and to estimate the between-laboratory reproducibility of toxicity endpoint values. The statistical analysis of count data (number of clutches or eggs per individual-day) leading to ECx estimation was specifically developed and automated through a free web-interface. Based on a complementary statistical analysis, the optimal test duration was established and the most sensitive and cost-effective reproduction toxicity endpoint was identified, to be used as the core endpoint. This validation process and the resulting optimized protocol were used to consolidate the OECD Test Guideline for the evaluation of reproductive effects of chemicals in L. stagnalis. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.

    PubMed

    Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory

    2017-01-01

Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes it difficult to use them to examine the performance of given testing procedures or associations between investigated factors. We turn our focus to the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g. partial EPVs; (2) developing optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures, their constructions and properties with an eye towards practical applications.
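The EPV/ROC connection can be checked numerically: for a one-sided test, the expected p-value under the alternative equals P(T0 >= T1) = 1 - AUC, where T0 and T1 are the test statistic under the null and the alternative. The normal-shift example below is a generic illustration, not the paper's myocardial infarction application.

```python
import numpy as np
from scipy.stats import norm

# Monte Carlo check of EPV = P(T0 >= T1) for a one-sided Z-type test:
# T0 ~ N(0, 1) under the null, T1 ~ N(delta, 1) under the alternative,
# so the closed form is Phi(-delta / sqrt(2)).

rng = np.random.default_rng(7)
delta = 1.0                                  # effect size (assumed)
t0 = rng.normal(0.0, 1.0, 200_000)           # statistic under H0
t1 = rng.normal(delta, 1.0, 200_000)         # statistic under H1

epv = np.mean(t0 >= t1)                      # Monte Carlo P(T0 >= T1)
exact = norm.cdf(-delta / np.sqrt(2.0))      # closed form for a normal shift
print(epv, exact)
```

A smaller EPV means p-values concentrate near zero under the alternative, so minimizing EPV over candidate test statistics, as the paper proposes, is equivalent to maximizing the AUC of the statistic viewed as a classifier of null versus alternative samples.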

  3. OPTIMIZATION OF MUD HAMMER DRILLING PERFORMANCE--A PROGRAM TO BENCHMARK THE VIABILITY OF ADVANCED MUD HAMMER DRILLING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnis Judzis

    2004-04-01

This document details the progress to date on the OPTIMIZATION OF MUD HAMMER DRILLING PERFORMANCE--A PROGRAM TO BENCHMARK THE VIABILITY OF ADVANCED MUD HAMMER DRILLING contract for the quarter starting January 2004 through March 2004. The DOE and TerraTek continue to wait for Novatek on the optimization portion of the testing program (they are completely rebuilding their fluid hammer). The latest indication is that the Novatek tool would be ready for retesting only in 3Q 2004. Smith International's hammer will be tested in April of 2004 (2Q 2004 report). Accomplishments included the following: (1) TerraTek presented a paper for publication in conjunction with a peer review at the GTI Natural Gas Technologies Conference February 10, 2004. Manuscripts and associated presentation material were delivered on schedule. The paper was entitled ''Mud Hammer Performance Optimization''. (2) Shell Exploration and Production continued to express high interest in the ''cutter impact'' testing program Task 8. Hughes Christensen supplied inserts for this testing program. (3) TerraTek hosted an Industry/DOE planning meeting to finalize a testing program for ''Cutter Impact Testing--Understanding Rock Breakage with Bits'' on February 13, 2004. (4) Formal dialogue with Terralog was initiated. Terralog has recently been awarded a DOE contract to model hammer mechanics with TerraTek as a sub-contractor. (5) Novatek provided the DOE with a schedule to complete their new fluid hammer and test it at TerraTek.

  4. Optimization of PSA screening policies: a comparison of the patient and societal perspectives.

    PubMed

    Zhang, Jingyu; Denton, Brian T; Balasubramanian, Hari; Shah, Nilay D; Inman, Brant A

    2012-01-01

    To estimate the benefit of PSA-based screening for prostate cancer from the patient and societal perspectives. A partially observable Markov decision process model was used to optimize PSA screening decisions. Age-specific prostate cancer incidence rates and the mortality rates from prostate cancer and competing causes were considered. The model trades off the potential benefit of early detection with the cost of screening and loss of patient quality of life due to screening and treatment. PSA testing and biopsy decisions are made based on the patient's probability of having prostate cancer. Probabilities are inferred based on the patient's complete PSA history using Bayesian updating. The results of all PSA tests and biopsies done in Olmsted County, Minnesota, from 1993 to 2005 (11,872 men and 50,589 PSA test results). Patients' perspective: to maximize expected quality-adjusted life years (QALYs); societal perspective: to maximize the expected monetary value based on societal willingness to pay for QALYs and the cost of PSA testing, prostate biopsies, and treatment. From the patient perspective, the optimal policy recommends stopping PSA testing and biopsy at age 76. From the societal perspective, the stopping age is 71. The expected incremental benefit of optimal screening over the traditional guideline of annual PSA screening with threshold 4.0 ng/mL for biopsy is estimated to be 0.165 QALYs per person from the patient perspective and 0.161 QALYs per person from the societal perspective. PSA screening based on traditional guidelines is found to be worse than no screening at all. PSA testing done with traditional guidelines underperforms and therefore underestimates the potential benefit of screening. Optimal screening guidelines differ significantly depending on the perspective of the decision maker.
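The Bayesian-updating step of such a model reduces to Bayes' rule applied per test result. The sensitivity, specificity, and prior below are illustrative placeholders, not the Olmsted County estimates.

```python
# Posterior probability of cancer after a sequence of PSA results, updated
# by Bayes' rule. All numbers (sensitivity, specificity, prior) are
# hypothetical stand-ins for the model's calibrated values.

def update(prior, positive, sens=0.21, spec=0.91):
    """Posterior P(cancer) after one PSA test (threshold 4.0 ng/mL assumed)."""
    if positive:
        num = sens * prior
        den = sens * prior + (1 - spec) * (1 - prior)
    else:
        num = (1 - sens) * prior
        den = (1 - sens) * prior + spec * (1 - prior)
    return num / den

belief = 0.05                     # prior from age-specific incidence (assumed)
for result in [False, False, True]:
    belief = update(belief, result)
print(round(belief, 4))
```

In the partially observable Markov decision process, this belief is the state: the screening policy maps the current probability (not the raw PSA value) to a continue/biopsy/stop decision, which is how the complete PSA history enters the recommendation.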

  5. Computational study of engine external aerodynamics as a part of multidisciplinary optimization procedure

    NASA Astrophysics Data System (ADS)

    Savelyev, Andrey; Anisimov, Kirill; Kazhan, Egor; Kursakov, Innocentiy; Lysenkov, Alexandr

    2016-10-01

The paper is devoted to the development of a methodology for optimizing the external aerodynamics of the engine. The optimization procedure is based on numerical solution of the Reynolds-averaged Navier-Stokes equations and uses a surrogate-based optimization method. The optimal shape design of a turbofan nacelle is considered as a test problem. The results of the first stage, which investigates a classic airplane configuration with the engine located under the wing, are presented. The described optimization procedure forms part of the third-generation multidisciplinary optimization framework developed in the AGILE project.

  6. An enhancement of ROC curves made them clinically relevant for diagnostic-test comparison and optimal-threshold determination.

    PubMed

    Subtil, Fabien; Rabilloud, Muriel

    2015-07-01

    The receiver operating characteristic curves (ROC curves) are often used to compare continuous diagnostic tests or determine the optimal threshold of a test; however, they do not consider the costs of misclassifications or the disease prevalence. The ROC graph was extended to allow for these aspects. Two new lines are added to the ROC graph: a sensitivity line and a specificity line. Their slopes depend on the disease prevalence and on the ratio of the net benefit of treating a diseased subject to the net cost of treating a nondiseased one. First, these lines help researchers determine the range of specificities within which test comparisons of partial areas under the curves is clinically relevant. Second, the ROC curve point the farthest from the specificity line is shown to be the optimal threshold in terms of expected utility. This method was applied: (1) to determine the optimal threshold of ratio specific immunoglobulin G (IgG)/total IgG for the diagnosis of congenital toxoplasmosis and (2) to select, among two markers, the most accurate for the diagnosis of left ventricular hypertrophy in hypertensive subjects. The two additional lines transform the statistically valid ROC graph into a clinically relevant tool for test selection and threshold determination. Copyright © 2015 Elsevier Inc. All rights reserved.
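The threshold rule being geometrized can be sketched directly: with prevalence pi and cost/benefit ratio r (net cost of treating a non-diseased subject over net benefit of treating a diseased one), expected utility is maximized at the ROC point maximizing sens - r * (1 - pi) / pi * (1 - spec), which is the point farthest from the specificity line. The scores below are simulated, not the IgG-ratio or hypertrophy-marker data.

```python
import numpy as np

# Empirical ROC sweep with a utility-slope criterion. Prevalence pi and
# cost/benefit ratio r are assumed values for illustration.

rng = np.random.default_rng(3)
pi, r = 0.2, 0.5
scores = np.concatenate([rng.normal(0.0, 1.0, 500),    # non-diseased
                         rng.normal(1.5, 1.0, 500)])   # diseased
labels = np.concatenate([np.zeros(500), np.ones(500)])

thresholds = np.unique(scores)
sens = np.array([(scores[labels == 1] >= t).mean() for t in thresholds])
spec = np.array([(scores[labels == 0] < t).mean() for t in thresholds])
utility = sens - r * (1 - pi) / pi * (1 - spec)
best = thresholds[np.argmax(utility)]
print(best)
```

For these simulated normals the utility-optimal cut sits where the likelihood ratio equals r * (1 - pi) / pi, i.e. near 1.2 here, illustrating how prevalence and misclassification costs pull the threshold away from the naive Youden-style choice.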

  7. Quantum chi-squared and goodness of fit testing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Temme, Kristan; Verstraete, Frank

    2015-01-15

    A quantum mechanical hypothesis test is presented for the hypothesis that a certain setup produces a given quantum state. Although the classical and the quantum problems are closely related, the quantum problem is much richer due to the additional optimization over the measurement basis. A goodness of fit test for i.i.d. quantum states is developed and a max-min characterization of the optimal measurement is introduced. We find the quantum measurement that leads to both the maximal Pitman and Bahadur efficiencies, and determine the associated divergence rates. We discuss the relationship of the quantum goodness of fit test to the problem of estimating multiple parameters from a density matrix. These problems are found to be closely related, and we show that the largest error of an optimal strategy, determined by the smallest eigenvalue of the Fisher information matrix, is given by the divergence rate of the goodness of fit test.

  8. On the design of innovative heterogeneous tests using a shape optimization approach

    NASA Astrophysics Data System (ADS)

    Aquino, J.; Campos, A. Andrade; Souto, N.; Thuillier, S.

    2018-05-01

    The development of full-field measurement methods has enabled a new class of mechanical tests. By providing the inhomogeneous strain field of a test, these techniques are widely used in sheet-metal identification strategies through heterogeneous mechanical tests. This work aims at a heterogeneous mechanical test with an innovative tool/specimen shape capable of producing rich heterogeneous strain paths that provide extensive information on material behavior. The specimen is found through a shape optimization process using a dedicated indicator that evaluates the richness of the strain information. The methodology avoids dependence on a predefined specimen geometry and on the geometry parametrization through the use of the Ritz method for boundary value problems. Different curve models, such as splines, B-splines and NURBS, are used, and C1 continuity throughout the specimen is guaranteed. Moreover, various optimization methods, both deterministic and stochastic, are used in order to find the method, or combination of methods, able to effectively minimize the cost function.

  9. Optimizing area under the ROC curve using semi-supervised learning

    PubMed Central

    Wang, Shijun; Li, Diana; Petrick, Nicholas; Sahiner, Berkman; Linguraru, Marius George; Summers, Ronald M.

    2014-01-01

    Receiver operating characteristic (ROC) analysis is a standard methodology to evaluate the performance of a binary classification system. The area under the ROC curve (AUC) is a performance metric that summarizes how well a classifier separates two classes. Traditional AUC optimization techniques are supervised learning methods that utilize only labeled data (i.e., the true class is known for all data) to train the classifiers. In this work, inspired by semi-supervised and transductive learning, we propose two new AUC optimization algorithms hereby referred to as semi-supervised learning receiver operating characteristic (SSLROC) algorithms, which utilize unlabeled test samples in classifier training to maximize AUC. Unlabeled samples are incorporated into the AUC optimization process, and their ranking relationships to labeled positive and negative training samples are considered as optimization constraints. The introduced test samples will cause the learned decision boundary in a multidimensional feature space to adapt not only to the distribution of labeled training data, but also to the distribution of unlabeled test data. We formulate the semi-supervised AUC optimization problem as a semi-definite programming problem based on the margin maximization theory. The proposed methods SSLROC1 (1-norm) and SSLROC2 (2-norm) were evaluated using 34 (determined by power analysis) randomly selected datasets from the University of California, Irvine machine learning repository. Wilcoxon signed rank tests showed that the proposed methods achieved significant improvement compared with state-of-the-art methods. The proposed methods were also applied to a CT colonography dataset for colonic polyp classification and showed promising results. PMID:25395692
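    The AUC these methods maximize has a simple rank interpretation (the Mann-Whitney statistic): it is the probability that a randomly chosen positive sample scores higher than a randomly chosen negative one, with ties counted as one half. A minimal illustration with invented scores:

    ```python
    # AUC as the Mann-Whitney probability. This is only the metric being
    # optimized, not the paper's semi-definite programming method.

    def auc(pos_scores, neg_scores):
        total = 0.0
        for p in pos_scores:
            for n in neg_scores:
                if p > n:
                    total += 1.0
                elif p == n:
                    total += 0.5  # ties count as half
        return total / (len(pos_scores) * len(neg_scores))

    print(auc([0.9, 0.8, 0.4], [0.7, 0.3, 0.2]))  # 8 of 9 pairs ordered correctly
    ```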

  10. Application of Adaptive DP-optimality to Design a Pilot Study for a Clotting Time Test for Enoxaparin.

    PubMed

    Gulati, Abhishek; Faed, James M; Isbister, Geoffrey K; Duffull, Stephen B

    2015-10-01

    Dosing of enoxaparin, like other anticoagulants, may result in bleeding following excessive doses and clot formation if the dose is too low. We recently showed that a factor Xa based clotting time test could potentially assess the effect of enoxaparin on the clotting system. However, the test did not perform well in subsequent individuals and effectiveness of an exogenous phospholipid, Actin FS, in reducing the variability in the clotting time was assessed. The aim of this work was to conduct an adaptive pilot study to determine the range of concentrations of Xa and Actin FS to take forward into a proof-of-concept study. A nonlinear parametric function was developed to describe the response surface over the factors of interest. An adaptive method was used to estimate the parameters using a D-optimal design criterion. In order to provide a reasonable probability of observing a success of the clotting time test, a P-optimal design criterion was incorporated using a loss function to describe the hybrid DP-optimality. The use of adaptive DP-optimality method resulted in an efficient estimation of model parameters using data from only 6 healthy volunteers. The use of response surface modelling identified a range of sets of Xa and Actin FS concentrations, any of which could be used for the proof-of-concept study. This study shows that parsimonious adaptive DP-optimal designs may provide both precise parameter estimates for response surface modelling as well as clinical confidence in the potential benefits of the study.

  11. Optimizing area under the ROC curve using semi-supervised learning.

    PubMed

    Wang, Shijun; Li, Diana; Petrick, Nicholas; Sahiner, Berkman; Linguraru, Marius George; Summers, Ronald M

    2015-01-01

    Receiver operating characteristic (ROC) analysis is a standard methodology to evaluate the performance of a binary classification system. The area under the ROC curve (AUC) is a performance metric that summarizes how well a classifier separates two classes. Traditional AUC optimization techniques are supervised learning methods that utilize only labeled data (i.e., the true class is known for all data) to train the classifiers. In this work, inspired by semi-supervised and transductive learning, we propose two new AUC optimization algorithms hereby referred to as semi-supervised learning receiver operating characteristic (SSLROC) algorithms, which utilize unlabeled test samples in classifier training to maximize AUC. Unlabeled samples are incorporated into the AUC optimization process, and their ranking relationships to labeled positive and negative training samples are considered as optimization constraints. The introduced test samples will cause the learned decision boundary in a multidimensional feature space to adapt not only to the distribution of labeled training data, but also to the distribution of unlabeled test data. We formulate the semi-supervised AUC optimization problem as a semi-definite programming problem based on the margin maximization theory. The proposed methods SSLROC1 (1-norm) and SSLROC2 (2-norm) were evaluated using 34 (determined by power analysis) randomly selected datasets from the University of California, Irvine machine learning repository. Wilcoxon signed rank tests showed that the proposed methods achieved significant improvement compared with state-of-the-art methods. The proposed methods were also applied to a CT colonography dataset for colonic polyp classification and showed promising results.

  12. [Optimization of stir-baking with vinegar technology for Curcumae Radix by orthogonal test].

    PubMed

    Shi, Dianhua; Su, Benzheng; Sun, Lili; Zhang, Jun; Qu, Yongsheng

    2011-05-01

    To optimize the stir-baking with vinegar technology for Curcumae Radix. The intrinsic quality (curcumin content) and traditional outward appearance were chosen as indexes. The best technology was determined by an orthogonal test L9(3^4). The factors investigated were moistening time, stir-baking temperature and stir-baking time. The optimal technology was as follows: the quantity of vinegar was 10%, the moistening time was 10 min, the stir-baking temperature was 130 °C and the stir-baking time was 10 min. The optimized stir-baking with vinegar technology for Curcumae Radix is reasonable and can be used to guide the standardized production of Curcumae Radix stir-baked with vinegar.
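    For readers unfamiliar with the L9(3^4) design, the sketch below shows how such an orthogonal test is analyzed: nine runs cover four factors at three levels, and the best level of each factor is chosen by the mean response at that level. The array is the standard L9 layout; the response values are invented, not the study's data:

    ```python
    # Standard L9(3^4) orthogonal array: 9 runs, 4 factors, levels 0..2.
    L9 = [
        (0, 0, 0, 0), (0, 1, 1, 1), (0, 2, 2, 2),
        (1, 0, 1, 2), (1, 1, 2, 0), (1, 2, 0, 1),
        (2, 0, 2, 1), (2, 1, 0, 2), (2, 2, 1, 0),
    ]

    def best_levels(responses):
        """responses[i]: measured quality index for run i; higher is better."""
        chosen = []
        for factor in range(4):
            means = []
            for level in range(3):
                vals = [r for row, r in zip(L9, responses) if row[factor] == level]
                means.append(sum(vals) / len(vals))  # 3 runs per level
            chosen.append(max(range(3), key=lambda lv: means[lv]))
        return chosen

    # Invented quality-index responses for the 9 runs:
    print(best_levels([78, 85, 80, 90, 88, 84, 70, 75, 72]))
    ```

    Because each pair of columns contains every level combination equally often, each factor's level means can be compared without the other factors biasing the result.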

  13. Complaint-adaptive power density optimization as a tool for HTP-guided steering in deep hyperthermia treatment of pelvic tumors

    NASA Astrophysics Data System (ADS)

    Canters, R. A. M.; Franckena, M.; van der Zee, J.; Van Rhoon, G. C.

    2008-12-01

    For efficient clinical use of hyperthermia treatment planning (HTP), optimization methods are needed. In this study, a complaint-adaptive power density (PD) optimization as a tool for HTP-guided steering in deep hyperthermia of pelvic tumors is developed and tested. The PD distribution in patients is predicted using finite-element models. Two goal functions, Opt1 and Opt2, are applied to optimize PD distributions. Optimization consists of three steps: initial optimization, adaptive optimization after a first complaint, and increasing the weight of a region after recurring complaints. Opt1 initially considers only target PD, whereas Opt2 also takes hot spots into account. After patient complaints, though, both limit PD in the complaint region. Opt1 and Opt2 are evaluated in a phantom test, using patient models and during hyperthermia treatment. The phantom test and a sensitivity study in ten patient models show that HTP-guided steering is most effective in peripheral complaint regions. Clinical evaluation in two groups of five patients shows that the time between complaints is longer using Opt2 (p = 0.007). However, this does not lead to significantly different temperatures (T50 of 40.3 °C for Opt1 versus 40.1 °C for Opt2, p = 0.898). HTP-guided steering is feasible in terms of PD reduction in complaint regions and in time consumption. Opt2 is preferable for future use because of better complaint reduction and control.

  14. Examining Relationships among Enabling School Structures, Academic Optimism and Organizational Citizenship Behaviors

    ERIC Educational Resources Information Center

    Messick, Penelope Pope

    2012-01-01

    This study examined the relationships among enabling school structures, academic optimism, and organizational citizenship behaviors. Additionally, it sought to determine if academic optimism served as a mediator between enabling school structures and organizational citizenship behaviors. Three existing survey instruments, previously tested for…

  15. Development of optimized PPP insulated pipe-cable systems in the commercial voltage range

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allam, E.M.; McKean, A.L.

    1992-05-01

    The primary objectives of this project included the development of an alternate domestic source of Paper-Polypropylene-Paper (PPP) laminate and the development of optimized designs for PPP-insulated pipe-type cable systems in the commercial voltage range. The development of a domestic source of PPP laminate was successfully completed. This laminate was utilized throughout the program for fabrication of full-size prototype cables submitted for laboratory qualification tests. Selected cables at rated voltages of 138, 230 and 345 kV have been designed, fabricated and subjected to the series of qualification tests leading to full laboratory qualification. An optimized design of 2000 kcmil, 345 kV cable insulated with 600 mils of domestic PPP laminate was fabricated and successfully passed all laboratory qualification tests. This cable design was subsequently installed at Waltz Mill to undergo the series of field tests leading to full commercial qualification.

  16. Development of optimized PPP insulated pipe-cable systems in the commercial voltage range. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allam, E.M.; McKean, A.L.

    1992-05-01

    The primary objectives of this project included the development of an alternate domestic source of Paper-Polypropylene-Paper (PPP) laminate and the development of optimized designs for PPP-insulated pipe-type cable systems in the commercial voltage range. The development of a domestic source of PPP laminate was successfully completed. This laminate was utilized throughout the program for fabrication of full-size prototype cables submitted for laboratory qualification tests. Selected cables at rated voltages of 138, 230 and 345 kV have been designed, fabricated and subjected to the series of qualification tests leading to full laboratory qualification. An optimized design of 2000 kcmil, 345 kV cable insulated with 600 mils of domestic PPP laminate was fabricated and successfully passed all laboratory qualification tests. This cable design was subsequently installed at Waltz Mill to undergo the series of field tests leading to full commercial qualification.

  17. Voltage stability index based optimal placement of static VAR compensator and sizing using Cuckoo search algorithm

    NASA Astrophysics Data System (ADS)

    Venkateswara Rao, B.; Kumar, G. V. Nagesh; Chowdary, D. Deepak; Bharathi, M. Aruna; Patra, Stutee

    2017-07-01

    This paper presents a new metaheuristic algorithm, the Cuckoo Search Algorithm (CSA), for solving the optimal power flow (OPF) problem with minimization of real power generation cost. The CSA is found to be the most efficient algorithm for solving single-objective optimal power flow problems. Its performance is tested on the IEEE 57-bus test system with real power generation cost minimization as the objective function. The Static VAR Compensator (SVC) is one of the best shunt-connected devices in the Flexible Alternating Current Transmission System (FACTS) family; it is capable of controlling the voltage magnitudes of buses by injecting reactive power into the system. In this paper, an SVC is integrated into the CSA-based optimal power flow to optimize the real power generation cost and to improve the voltage profile of the system. CSA gives better results than the genetic algorithm (GA) both without and with the SVC.
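    The record gives no CSA details; as a rough illustration of the cuckoo-search idea (random walks from existing nests, greedy replacement, abandonment of the worst nests), here is a toy minimization of the sphere function. Gaussian steps stand in for Levy flights, and all parameter values are arbitrary:

    ```python
    import random

    # Toy cuckoo-search sketch, NOT the paper's OPF formulation.

    def sphere(x):
        return sum(v * v for v in x)

    def cuckoo_search(dim=2, nests=15, iters=200, pa=0.25, seed=1):
        rng = random.Random(seed)
        pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(nests)]
        best = min(pop, key=sphere)
        for _ in range(iters):
            # Lay a new egg: random walk from a randomly chosen nest.
            i = rng.randrange(nests)
            cand = [v + 0.3 * rng.gauss(0, 1) for v in pop[i]]
            if sphere(cand) < sphere(pop[i]):  # greedy replacement
                pop[i] = cand
            # Abandon a fraction pa of the worst nests; rebuild them randomly.
            pop.sort(key=sphere)
            for j in range(int(pa * nests)):
                pop[-(j + 1)] = [rng.uniform(-5, 5) for _ in range(dim)]
            best = min(pop + [best], key=sphere)  # best never worsens
        return best

    print(sphere(cuckoo_search()))
    ```

    The abandonment step is what distinguishes cuckoo search from a plain random-walk hill climber: it periodically injects fresh diversity while the tracked best solution improves monotonically.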

  18. Implementation and on-sky results of an optimal wavefront controller for the MMT NGS adaptive optics system

    NASA Astrophysics Data System (ADS)

    Powell, Keith B.; Vaitheeswaran, Vidhya

    2010-07-01

    The MMT observatory has recently implemented and tested an optimal wavefront controller for its NGS adaptive optics system. Open-loop atmospheric data collected at the telescope are used as the input to a MATLAB-based analytical model. The model uses nonlinear constrained minimization to determine controller gains and optimize system performance. The real-time controller performing the closed-loop adaptive optics operation is implemented on a dedicated high-performance PC-based quad-core server. The controller algorithm is written in C and uses the GNU Scientific Library for linear algebra. Tests at the MMT confirmed that the optimal controller significantly reduced the residual RMS wavefront error compared with the previous controller. Significant reductions in image FWHM and increased peak intensities were obtained in the J, H and K bands. The optimal PID controller is now operating as the baseline wavefront controller for the MMT NGS-AO system.

  19. Water-resources optimization model for Santa Barbara, California

    USGS Publications Warehouse

    Nishikawa, Tracy

    1998-01-01

    A simulation-optimization model has been developed for the optimal management of the city of Santa Barbara's water resources during a drought. The model, which links groundwater simulation with linear programming, has a planning horizon of 5 years. The objective is to minimize the cost of water supply subject to water demand constraints, hydraulic head constraints to control seawater intrusion, and water capacity constraints. The decision variables are monthly water deliveries from surface water and groundwater. The state variables are hydraulic heads. The drought of 1947-51 is the city's worst drought on record, and simulated surface-water supplies for this period were used as a basis for testing optimal management of current water resources under drought conditions. The simulation-optimization model was applied using three reservoir operation rules. In addition, the model's sensitivity to demand, carryover [the storage of water in one year for use in a later year], head constraints, and capacity constraints was tested.
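    The full model is a multi-period linear program; a single-period toy version with invented costs and capacities illustrates the cost-minimization structure. For two sources with simple capacity bounds, filling the cheaper source first is the optimal LP solution:

    ```python
    # Toy cost-minimizing allocation between two water sources.
    # All costs, capacities and the demand value are invented.

    def allocate(demand, cost_sw, cap_sw, cost_gw, cap_gw):
        sources = sorted([(cost_sw, cap_sw, "surface"), (cost_gw, cap_gw, "ground")])
        plan, remaining = {}, demand
        for cost, cap, name in sources:  # cheapest source first
            take = min(cap, remaining)
            plan[name] = take
            remaining -= take
        if remaining > 1e-9:
            raise ValueError("demand exceeds total capacity")
        return plan

    print(allocate(100.0, 1.0, 70.0, 3.0, 60.0))  # surface water exhausted first
    ```

    The real model additionally couples the periods through reservoir carryover and links groundwater pumping to simulated hydraulic heads, which is what requires a genuine LP solver rather than this greedy shortcut.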

  20. COPS: Large-scale nonlinearly constrained optimization problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bondarenko, A.S.; Bortz, D.M.; More, J.J.

    2000-02-10

    The authors have started the development of COPS, a collection of large-scale nonlinearly Constrained Optimization Problems. The primary purpose of this collection is to provide difficult test cases for optimization software. Problems in the current version of the collection come from fluid dynamics, population dynamics, optimal design, and optimal control. For each problem they provide a short description of the problem, notes on the formulation of the problem, and results of computational experiments with general optimization solvers. They currently have results for DONLP2, LANCELOT, MINOS, SNOPT, and LOQO.

  1. Parallel-vector computation for linear structural analysis and non-linear unconstrained optimization problems

    NASA Technical Reports Server (NTRS)

    Nguyen, D. T.; Al-Nasra, M.; Zhang, Y.; Baddourah, M. A.; Agarwal, T. K.; Storaasli, O. O.; Carmona, E. A.

    1991-01-01

    Several parallel-vector computational improvements to the unconstrained optimization procedure are described which speed up the structural analysis-synthesis process. A fast parallel-vector Choleski-based equation solver, pvsolve, is incorporated into the well-known SAP-4 general-purpose finite-element code. The new code, denoted PV-SAP, is tested for static structural analysis. Initial results on a four processor CRAY 2 show that using pvsolve reduces the equation solution time by a factor of 14-16 over the original SAP-4 code. In addition, parallel-vector procedures for the Golden Block Search technique and the BFGS method are developed and tested for nonlinear unconstrained optimization. A parallel version of an iterative solver and the pvsolve direct solver are incorporated into the BFGS method. Preliminary results on nonlinear unconstrained optimization test problems, using pvsolve in the analysis, show excellent parallel-vector performance indicating that these parallel-vector algorithms can be used in a new generation of finite-element based structural design/analysis-synthesis codes.
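    The Golden Block Search mentioned above builds on scalar golden section search; a minimal scalar version, with an illustrative quadratic objective, looks like this:

    ```python
    import math

    # Scalar golden section search for a unimodal minimum on [a, b].
    # The quadratic objective below is illustrative.

    def golden_section(f, a, b, tol=1e-8):
        inv_phi = (math.sqrt(5) - 1) / 2  # ~0.618, the interval shrink factor
        c = b - inv_phi * (b - a)
        d = a + inv_phi * (b - a)
        while abs(b - a) > tol:
            if f(c) < f(d):       # minimum lies in [a, d]
                b, d = d, c
                c = b - inv_phi * (b - a)
            else:                 # minimum lies in [c, b]
                a, c = c, d
                d = a + inv_phi * (b - a)
        return (a + b) / 2

    print(golden_section(lambda t: (t - 2.0) ** 2, 0.0, 5.0))  # ~2.0
    ```

    Each iteration shrinks the bracket by the golden ratio while reusing one interior point, which is what makes the method attractive as a derivative-free line search inside BFGS-style optimizers.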

  2. A study of data representation in Hadoop to optimize data storage and search performance for the ATLAS EventIndex

    NASA Astrophysics Data System (ADS)

    Baranowski, Z.; Canali, L.; Toebbicke, R.; Hrivnac, J.; Barberis, D.

    2017-10-01

    This paper reports on the activities aimed at improving the architecture and performance of the ATLAS EventIndex implementation in Hadoop. The EventIndex contains tens of billions of event records, each of which consists of ∼100 bytes, all having the same probability of being searched or counted. Data formats represent one important area for optimizing the performance and storage footprint of applications based on Hadoop. This work reports on the production usage and on tests using several data formats, including Map Files, Apache Parquet, Avro, and various compression algorithms. The query engine also plays a critical role in the architecture. We also report on the use of HBase for the EventIndex, focusing on the optimizations performed in production and on the scalability tests. Additional engines that have been tested include Cloudera Impala, in particular for its SQL interface and its optimizations for data warehouse workloads and reports.

  3. Grayscale Optical Correlator Workbench

    NASA Technical Reports Server (NTRS)

    Hanan, Jay; Zhou, Hanying; Chao, Tien-Hsin

    2006-01-01

    Grayscale Optical Correlator Workbench (GOCWB) is a computer program for use in automatic target recognition (ATR). GOCWB performs ATR with an accurate simulation of a hardware grayscale optical correlator (GOC). This simulation is performed to test filters that are created in GOCWB. Thus, GOCWB can be used as a stand-alone ATR software tool or in combination with GOC hardware for building (target training), testing, and optimization of filters. The software is divided into three main parts, denoted filter, testing, and training. The training part is used for assembling training images as input to a filter. The filter part is used for combining training images into a filter and optimizing that filter. The testing part is used for testing new filters and for general simulation of GOC output. The current version of GOCWB relies on MATLAB binaries for matrix operations and fast Fourier transforms. Optimization of filters is based on an algorithm known as OT-MACH, in which variables specified by the user are parameterized and the best filter is selected on the basis of the average result for correct identification of targets in multiple test images.

  4. Adaptive transmission disequilibrium test for family trio design.

    PubMed

    Yuan, Min; Tian, Xin; Zheng, Gang; Yang, Yaning

    2009-01-01

    The transmission disequilibrium test (TDT) is a standard method to detect association using the family trio design. It is optimal for an additive genetic model. Other TDT-type tests optimal for recessive and dominant models have also been developed. Association tests using family data, including the TDT-type statistics, have been unified into a class of more comprehensive and flexible family-based association tests (FBAT). TDT-type tests have high efficiency when the genetic model is known or correctly specified, but may lose power if the model is mis-specified. Hence tests that are robust to genetic model mis-specification yet efficient are preferred. The constrained likelihood ratio test (CLRT) and MAX-type tests have been shown to be efficiency robust. In this paper we propose a new efficiency-robust procedure, referred to as the adaptive TDT (aTDT). It uses the Hardy-Weinberg disequilibrium coefficient to identify the potential genetic model underlying the data and then applies the TDT-type test (or FBAT for general applications) corresponding to the selected model. Simulation demonstrates that aTDT is efficiency robust to model mis-specification and generally outperforms the MAX test and CLRT in terms of power. We also show that aTDT has power close to, but is much more robust than, the optimal TDT-type test based on a single genetic model. Applications to real and simulated data from the Genetic Analysis Workshop (GAW) illustrate the use of our adaptive TDT.
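    The basic additive-model TDT statistic that aTDT builds on is a McNemar-type chi-square: among transmissions from heterozygous parents, with b transmissions of the candidate allele and c non-transmissions, the statistic is (b − c)²/(b + c) on 1 degree of freedom. The counts below are invented:

    ```python
    # Classical TDT statistic; compare against the chi-square(1) critical
    # value 3.84 at the 5% level. Counts are illustrative.

    def tdt_statistic(b, c):
        """b: transmissions of the candidate allele from heterozygous
        parents; c: non-transmissions."""
        return (b - c) ** 2 / (b + c)

    print(tdt_statistic(60, 40))  # 4.0 > 3.84, nominally significant at 5%
    ```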

  5. A comparative review of methods for comparing means using partially paired data.

    PubMed

    Guo, Beibei; Yuan, Ying

    2017-06-01

    In medical experiments with the objective of testing the equality of two means, data are often partially paired by design or because of missing data. The partially paired data represent a combination of paired and unpaired observations. In this article, we review and compare nine methods for analyzing partially paired data, including the two-sample t-test, paired t-test, corrected z-test, weighted t-test, pooled t-test, optimal pooled t-test, multiple imputation method, mixed model approach, and the test based on a modified maximum likelihood estimate. We compare the performance of these methods through extensive simulation studies that cover a wide range of scenarios with different effect sizes, sample sizes, and correlations between the paired variables, as well as true underlying distributions. The simulation results suggest that when the sample size is moderate, the test based on the modified maximum likelihood estimator is generally superior to the other approaches when the data is normally distributed and the optimal pooled t-test performs the best when the data is not normally distributed, with well-controlled type I error rates and high statistical power; when the sample size is small, the optimal pooled t-test is to be recommended when both variables have missing data and the paired t-test is to be recommended when only one variable has missing data.
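    As a rough illustration of the pooling idea behind several of the reviewed methods (not any specific test from the paper), one can estimate the mean difference separately from the complete pairs and from the unpaired observations and combine the two by inverse-variance weighting. All data below are invented:

    ```python
    import statistics as st

    # Inverse-variance pooling of a paired and an unpaired estimate of the
    # mean difference. Illustrative only; not a test from the review.

    def pooled_difference(pairs, x_only, y_only):
        diffs = [x - y for x, y in pairs]
        d1 = st.mean(diffs)                      # paired estimate
        v1 = st.variance(diffs) / len(diffs)
        d2 = st.mean(x_only) - st.mean(y_only)   # unpaired estimate
        v2 = st.variance(x_only) / len(x_only) + st.variance(y_only) / len(y_only)
        w1, w2 = 1.0 / v1, 1.0 / v2
        return (w1 * d1 + w2 * d2) / (w1 + w2)

    pairs = [(5.1, 4.8), (6.0, 5.2), (5.5, 5.0), (6.2, 5.9)]
    print(pooled_difference(pairs, [5.8, 6.1, 5.6], [5.0, 5.3, 4.9]))
    ```

    The pooled estimate always lies between the two component estimates, weighted toward whichever has the smaller variance; the reviewed methods differ mainly in how they weight the two parts and in how they compute the reference distribution.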

  6. A sensitivity equation approach to shape optimization in fluid flows

    NASA Technical Reports Server (NTRS)

    Borggaard, Jeff; Burns, John

    1994-01-01

    A sensitivity equation method is applied to shape optimization problems. An algorithm is developed and tested on the problem of designing optimal forebody simulators for a 2D, inviscid supersonic flow. The algorithm uses a BFGS/trust-region optimization scheme with sensitivities computed by numerically approximating the linear partial differential equations that determine the flow sensitivities. Numerical examples are presented to illustrate the method.

  7. Implementation of Chaotic Gaussian Particle Swarm Optimization for Optimize Learning-to-Rank Software Defect Prediction Model Construction

    NASA Astrophysics Data System (ADS)

    Buchari, M. A.; Mardiyanto, S.; Hendradjaya, B.

    2018-03-01

    Finding the existence of software defects as early as possible is the purpose of research on software defect prediction. Software defect prediction should not only state the existence of defects, but also give a list of priorities indicating which modules require more intensive testing, so that test resources can be allocated efficiently. Learning to rank is one approach that can provide defect-module ranking data for the purposes of software testing. In this study, we propose a meta-heuristic chaotic Gaussian particle swarm optimization to improve the accuracy of the learning-to-rank software defect prediction approach. We used 11 public benchmark data sets as experimental data. Our overall results demonstrate that the prediction models constructed using chaotic Gaussian particle swarm optimization achieve better accuracy on 5 data sets, tie on 5 data sets, and do worse on 1 data set. Thus, we conclude that applying chaotic Gaussian particle swarm optimization in the learning-to-rank approach can improve the accuracy of defect-module ranking on data sets that have high-dimensional features.

  8. PopED lite: An optimal design software for preclinical pharmacokinetic and pharmacodynamic studies.

    PubMed

    Aoki, Yasunori; Sundqvist, Monika; Hooker, Andrew C; Gennemark, Peter

    2016-04-01

    Optimal experimental design approaches are seldom used in preclinical drug discovery. The objective is to develop an optimal design software tool specifically designed for preclinical applications in order to increase the efficiency of drug discovery in vivo studies. Several realistic experimental design case studies were collected and many preclinical experimental teams were consulted to determine the design goal of the software tool. The tool obtains an optimized experimental design by solving a constrained optimization problem, where each experimental design is evaluated using some function of the Fisher Information Matrix. The software was implemented in C++ using the Qt framework to assure a responsive user-software interaction through a rich graphical user interface while achieving the desired computational speed. In addition, a discrete global optimization algorithm was developed and implemented. The software design goals were simplicity, speed and intuition. Based on these design goals, we have developed the publicly available software PopED lite (http://www.bluetree.me/PopED_lite). Optimization computation was, on average over 14 test problems, 30 times faster in PopED lite than in an existing optimal design software tool. PopED lite is now used in real drug discovery projects and a few of these case studies are presented in this paper. PopED lite is designed to be simple, fast and intuitive. Simple, to give many users access to basic optimal design calculations. Fast, to fit a short design-execution cycle and allow interactive experimental design (test one design, discuss the proposed design, test another design, etc.). Intuitive, so that the input to and output from the software tool can easily be understood by users without knowledge of the theory of optimal design. In this way, PopED lite is highly useful in practice and complements existing tools.
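    The Fisher-information criterion such tools optimize can be illustrated on a toy problem (this is not PopED lite's algorithm): for a straight-line model y = b0 + b1·t, D-optimality picks the sampling times that maximize det(XᵀX), which for two points drives them to the extremes of the candidate set:

    ```python
    from itertools import combinations

    # Toy D-optimal design: choose 2 sampling times for y = b0 + b1*t
    # maximizing det(X'X). Candidate times are invented.

    def det2(m):
        return m[0][0] * m[1][1] - m[0][1] * m[1][0]

    def d_optimal_pair(times):
        def info(pair):
            x = [[1.0, t] for t in pair]          # design matrix rows
            xtx = [[sum(a[i] * a[j] for a in x) for j in range(2)]
                   for i in range(2)]
            return det2(xtx)                      # here equals (t2 - t1)**2
        return max(combinations(times, 2), key=info)

    print(d_optimal_pair([0.0, 1.0, 2.0, 3.0, 4.0]))  # the extreme times win
    ```

    For pharmacokinetic models the same determinant is computed from the model's parameter sensitivities rather than from a linear design matrix, but the selection principle is identical.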

  9. Design of clinical trials involving multiple hypothesis tests with a common control.

    PubMed

    Schou, I Manjula; Marschner, Ian C

    2017-07-01

    Randomized clinical trials comparing several treatments to a common control are often reported in the medical literature. For example, multiple experimental treatments may be compared with placebo, or in combination therapy trials, a combination therapy may be compared with each of its constituent monotherapies. Such trials are typically designed using a balanced approach in which equal numbers of individuals are randomized to each arm, however, this can result in an inefficient use of resources. We provide a unified framework and new theoretical results for optimal design of such single-control multiple-comparator studies. We consider variance optimal designs based on D-, A-, and E-optimality criteria, using a general model that allows for heteroscedasticity and a range of effect measures that include both continuous and binary outcomes. We demonstrate the sensitivity of these designs to the type of optimality criterion by showing that the optimal allocation ratios are systematically ordered according to the optimality criterion. Given this sensitivity to the optimality criterion, we argue that power optimality is a more suitable approach when designing clinical trials where testing is the objective. Weighted variance optimal designs are also discussed, which, like power optimal designs, allow the treatment difference to play a major role in determining allocation ratios. We illustrate our methods using two real clinical trial examples taken from the medical literature. Some recommendations on the use of optimal designs in single-control multiple-comparator trials are also provided.
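    The inefficiency of balanced allocation can be illustrated with the classic square-root rule, stated here under simplifying assumptions (equal unit variances, A-optimality over the k treatment-control contrasts, equal treatment-arm sizes) rather than the paper's general framework: minimizing the summed contrast variances gives n_control/n_treatment = √k. A brute-force check:

    ```python
    # A-optimal control-arm size for k treatments vs one shared control,
    # assuming unit variances. Numbers are illustrative.

    def best_control_size(k, n_total):
        """Integer control size minimizing the summed contrast variances."""
        def total_variance(n_c):
            n_t = (n_total - n_c) / k      # equal size per treatment arm
            return k / n_t + k / n_c       # sum over k contrasts of 1/n_t + 1/n_c
        return min(range(1, n_total), key=total_variance)

    k, n_total = 4, 120
    n_c = best_control_size(k, n_total)
    print(n_c, n_c / ((n_total - n_c) / k))  # 40, ratio 2.0 = sqrt(4)
    ```

    With four comparators the control arm optimally takes a third of the subjects rather than a fifth, which is the kind of systematic departure from balance the paper quantifies across optimality criteria.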

  10. Event Oriented Design and Adaptive Multiprocessing

    DTIC Science & Technology

    1991-08-31

    [Record excerpt garbled: the available text consists of table-of-contents fragments covering a classification of software systems (real-time, non-real-time, interactive, non-interactive), common characterizations of all software systems, a non-optimal guarantee test theorem, Chetto's optimal guarantee test theorem, and an extended multistate guarantee test theorem.]

  11. Inverse problems in the design, modeling and testing of engineering systems

    NASA Technical Reports Server (NTRS)

    Alifanov, Oleg M.

    1991-01-01

    Formulations, classification, areas of application, and approaches to solving different inverse problems are considered for the design of structures, modeling, and experimental data processing. Problems in the practical implementation of theoretical-experimental methods based on solving inverse problems are analyzed in order to identify mathematical models of physical processes, aid in input data preparation for design parameter optimization, help in design parameter optimization itself, and to model experiments, large-scale tests, and real tests of engineering systems.

  12. An optimal controller for an electric ventricular-assist device: theory, implementation, and testing.

    PubMed

    Klute, G K; Tasch, U; Geselowitz, D B

    1992-04-01

    This paper addresses the development and testing of an optimal position feedback controller for the Penn State electric ventricular-assist device (EVAD). The control law is designed to minimize the expected value of the EVAD's power consumption for a targeted patient population. The closed-loop control law is implemented on an Intel 8096 microprocessor and in vitro test runs show that this controller improves the EVAD's efficiency by 15-21%, when compared with the performance of the currently used feedforward control scheme.

  13. Academic Optimism and Organizational Climate: An Elementary School Effectiveness Test of Two Measures

    ERIC Educational Resources Information Center

    Reeves, Jonathan Bart

    2010-01-01

    This study examined the relationship of two climate constructs in academic optimism and organizational climate as each relates to school effectiveness. Academic optimism is an academic environment comprised of three dimensions: academic emphasis, collective efficacy, and faculty trust (Hoy, Tarter, & Hoy, 2006). The Organizational Climate…

  14. Advances on the constitutive characterization of composites via multiaxial robotic testing and design optimization

    Treesearch

    John G. Michopoulos; John Hermanson; Athanasios Iliopoulos

    2014-01-01

    The research areas of multiaxial robotic testing and design optimization have recently been utilized for the purpose of data-driven constitutive characterization of anisotropic material systems. This effort has been enabled both by progress in the areas of computers and information in engineering and by progress in computational automation. Although our...

  15. Test Design Optimization in CAT Early Stage with the Nominal Response Model

    ERIC Educational Resources Information Center

    Passos, Valeria Lima; Berger, Martijn P. F.; Tan, Frans E.

    2007-01-01

    The early stage of computerized adaptive testing (CAT) refers to the phase of the trait estimation during the administration of only a few items. This phase can be characterized by bias and instability of estimation. In this study, an item selection criterion is introduced in an attempt to lessen this instability: the D-optimality criterion. A…

  16. The Effects of an Emotion Strengthening Training Program on the Optimism Level of Nurses

    ERIC Educational Resources Information Center

    Balci Celik, Seher

    2008-01-01

    The aim of this study is to investigate the effects of an emotion-strengthening training programme on nurses' optimism. The experimental and control groups of this research together comprised 20 nurses. A pre-test post-test research model with a control group was used. Nurses' optimism levels have been measured by…

  17. Enabling School Structure, Collective Responsibility, and a Culture of Academic Optimism: Toward a Robust Model of School Performance in Taiwan

    ERIC Educational Resources Information Center

    Wu, Jason H.; Hoy, Wayne K.; Tarter, C. John

    2013-01-01

    Purpose: The purpose of this research is twofold: to test a theory of academic optimism in Taiwan elementary schools and to expand the theory by adding new variables, collective responsibility and enabling school structure, to the model. Design/methodology/approach: Structural equation modeling was used to test, refine, and expand an…

  18. Model-Based Thermal System Design Optimization for the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Cataldo, Giuseppe; Niedner, Malcolm B.; Fixsen, Dale J.; Moseley, Samuel H.

    2017-01-01

    Spacecraft thermal model validation is normally performed by comparing model predictions with thermal test data and reducing their discrepancies to meet the mission requirements. Based on thermal engineering expertise, the model input parameters are adjusted to tune the model output response to the test data. The end result is not guaranteed to be the best solution in terms of reduced discrepancy and the process requires months to complete. A model-based methodology was developed to perform the validation process in a fully automated fashion and provide mathematical bases to the search for the optimal parameter set that minimizes the discrepancies between model and data. The methodology was successfully applied to several thermal subsystems of the James Webb Space Telescope (JWST). Global or quasiglobal optimal solutions were found and the total execution time of the model validation process was reduced to about two weeks. The model sensitivities to the parameters, which are required to solve the optimization problem, can be calculated automatically before the test begins and provide a library for sensitivity studies. This methodology represents a crucial commodity when testing complex, large-scale systems under time and budget constraints. Here, results for the JWST Core thermal system will be presented in detail.
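    A toy sketch of the automated model-to-test-data tuning idea (illustrative scale only; the JWST work tunes many parameters of large thermal models with real optimization algorithms, and the numbers below are invented): a single conductive coupling G in a steady-state node model is adjusted by a one-dimensional grid search so the model reproduces a synthetic "test" temperature.

```python
def predict(G, Q=5.0, T_sink=40.0):
    """Steady-state node temperature from the heat balance Q = G * (T - T_sink)."""
    return T_sink + Q / G

measured = 65.0                            # synthetic "thermal balance test" datum

def discrepancy(G):
    return (predict(G) - measured) ** 2    # model/data gap to be minimized

# A crude stand-in for the paper's automated optimizer: scan candidate values.
candidates = [0.01 + 0.001 * i for i in range(1000)]
G_best = min(candidates, key=discrepancy)
print(f"best-fit coupling G = {G_best:.3f}")
```

    In the real problem the search runs over many coupled parameters, which is why gradient-based or global optimizers, together with precomputed sensitivities, replace the grid scan.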

  19. Model-based thermal system design optimization for the James Webb Space Telescope

    NASA Astrophysics Data System (ADS)

    Cataldo, Giuseppe; Niedner, Malcolm B.; Fixsen, Dale J.; Moseley, Samuel H.

    2017-10-01

    Spacecraft thermal model validation is normally performed by comparing model predictions with thermal test data and reducing their discrepancies to meet the mission requirements. Based on thermal engineering expertise, the model input parameters are adjusted to tune the model output response to the test data. The end result is not guaranteed to be the best solution in terms of reduced discrepancy and the process requires months to complete. A model-based methodology was developed to perform the validation process in a fully automated fashion and provide mathematical bases to the search for the optimal parameter set that minimizes the discrepancies between model and data. The methodology was successfully applied to several thermal subsystems of the James Webb Space Telescope (JWST). Global or quasiglobal optimal solutions were found and the total execution time of the model validation process was reduced to about two weeks. The model sensitivities to the parameters, which are required to solve the optimization problem, can be calculated automatically before the test begins and provide a library for sensitivity studies. This methodology represents a crucial commodity when testing complex, large-scale systems under time and budget constraints. Here, results for the JWST Core thermal system will be presented in detail.

  20. Accounting for Proof Test Data in a Reliability Based Design Optimization Framework

    NASA Technical Reports Server (NTRS)

    Venter, Gerhard; Scotti, Stephen J.

    2012-01-01

    This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.
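    The reliability gain from proof testing can be seen in a minimal Monte Carlo sketch (illustrative distributions and numbers, not the paper's): passing a proof load left-truncates the fielded strength distribution, so the surviving components are more reliable in service.

```python
import random

random.seed(1)

# Illustrative assumptions: component strength and in-service load are
# independent normals; a proof load screens out weak components.
mu_s, sd_s = 100.0, 10.0          # strength distribution
mu_l, sd_l = 80.0, 5.0            # in-service load distribution
proof_load = 90.0

n = 200_000
strengths = [random.gauss(mu_s, sd_s) for _ in range(n)]
loads = [random.gauss(mu_l, sd_l) for _ in range(n)]

# Reliability without proof testing: P(strength > load).
r_raw = sum(s > l for s, l in zip(strengths, loads)) / n

# Only components whose strength exceeds the proof load enter service, so
# the fielded strength distribution is left-truncated at the proof load.
fielded = [(s, l) for s, l in zip(strengths, loads) if s > proof_load]
r_proof = sum(s > l for s, l in fielded) / len(fielded)

print(f"reliability without proof test: {r_raw:.4f}, with: {r_proof:.4f}")
```

    The design trade the paper exploits follows directly: a lighter (weaker) component can meet the same in-service reliability target if the proof test removes the lower tail, at the cost of some probability of failing the proof test itself.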

  1. Simplified Numerical Analysis of ECT Probe - Eddy Current Benchmark Problem 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sikora, R.; Chady, T.; Gratkowski, S.

    2005-04-09

    In this paper a third eddy current benchmark problem is considered. The objective of the benchmark is to determine the optimal operating frequency and size of a pancake coil intended for testing tubes made of Inconel. This is achieved by maximizing the change in the coil's impedance due to a flaw. Approximation functions of the probe (coil) characteristic were developed and used to reduce the number of required calculations, which significantly speeds up the optimization process. The optimal testing frequency and probe size were obtained as the final result of the calculation.

  2. Viscoelastic material inversion using Sierra-SD and ROL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walsh, Timothy; Aquino, Wilkins; Ridzal, Denis

    2014-11-01

    In this report we derive frequency-domain methods for inverse characterization of the constitutive parameters of viscoelastic materials. The inverse problem is cast in a PDE-constrained optimization framework with efficient computation of gradients and Hessian vector products through matrix free operations. The abstract optimization operators for first and second derivatives are derived from first principles. Various methods from the Rapid Optimization Library (ROL) are tested on the viscoelastic inversion problem. The methods described herein are applied to compute the viscoelastic bulk and shear moduli of a foam block model, which was recently used in experimental testing for viscoelastic property characterization.

  3. Implementation and Performance Issues in Collaborative Optimization

    NASA Technical Reports Server (NTRS)

    Braun, Robert; Gage, Peter; Kroo, Ilan; Sobieski, Ian

    1996-01-01

    Collaborative optimization is a multidisciplinary design architecture that is well-suited to large-scale multidisciplinary optimization problems. This paper compares this approach with other architectures, examines the details of the formulation, and considers some aspects of its performance. A particular version of the architecture is proposed to better accommodate the occurrence of multiple feasible regions. The use of system level inequality constraints is shown to increase the convergence rate. A series of simple test problems, demonstrated to challenge related optimization architectures, is successfully solved with collaborative optimization.

  4. Optimization of transonic wind tunnel data acquisition and control systems for providing continuous mode tests

    NASA Astrophysics Data System (ADS)

    Petronevich, V. V.

    2016-10-01

    The paper addresses issues related to increasing the efficiency and information content of experimental research in transonic wind tunnels (WT). In particular, questions of optimizing the WT Data Acquisition and Control Systems (DACS) to support the continuous-mode test method are discussed. The problem of Mach number (M number) stabilization in the test section of large transonic compressor-type wind tunnels at subsonic flow conditions, with continuous change of the aircraft model angle of attack, is examined using the T-128 wind tunnel as an example. To minimize signal distortion in the T-128 DACS measurement channels, the optimal MGCplus filter settings of the data acquisition system used in the T-128 wind tunnel to measure loads were experimentally determined. The tests performed showed good agreement between balance measurements for the pitch/pause and continuous test modes. Balance tests in both modes were carried out with the standard data acquisition and control system of the T-128 wind tunnel and the unified software package POTOK. The architecture and functional capabilities of the POTOK software package are described.

  5. Automated Lead Optimization of MMP-12 Inhibitors Using a Genetic Algorithm.

    PubMed

    Pickett, Stephen D; Green, Darren V S; Hunt, David L; Pardoe, David A; Hughes, Ian

    2011-01-13

    Traditional lead optimization projects involve long synthesis and testing cycles, favoring extensive structure-activity relationship (SAR) analysis and molecular design steps, in an attempt to limit the number of cycles that a project must run to optimize a development candidate. Microfluidic-based chemistry and biology platforms, with cycle times of minutes rather than weeks, lend themselves to unattended autonomous operation. The bottleneck in the lead optimization process is therefore shifted from synthesis or test to SAR analysis and design. As such, the way is open to an algorithm-directed process, without the need for detailed user data analysis. Here, we present results of two synthesis and screening experiments, undertaken using traditional methodology, to validate a genetic algorithm optimization process for future application to a microfluidic system. The algorithm has several novel features that are important for the intended application. For example, it is robust to missing data and can suggest compounds for retest to ensure reliability of optimization. The algorithm is first validated on a retrospective analysis of an in-house library embedded in a larger virtual array of presumed inactive compounds. In a second, prospective experiment with MMP-12 as the target protein, 140 compounds are submitted for synthesis over 10 cycles of optimization. Comparison is made to the results from the full combinatorial library that was synthesized manually and tested independently. The results show that compounds selected by the algorithm are heavily biased toward the more active regions of the library, while the algorithm is robust to both missing data (compounds where synthesis failed) and inactive compounds. This publication places the full combinatorial library and biological data into the public domain with the intention of advancing research into algorithm-directed lead optimization methods.
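    The algorithm's key behaviors (elite-driven crossover over a combinatorial library, tolerance of missing assay data) can be sketched on a toy problem. Everything below is hypothetical: the library, the additive "activity" landscape, and the 10% synthesis-failure rate are invented stand-ins, not the MMP-12 data or the authors' GA.

```python
import random

random.seed(0)

# Toy combinatorial library: a compound is a pair of monomer indices, its
# activity a hidden additive landscape, and ~10% of syntheses fail.
A, B = 20, 20
act_a = [random.random() for _ in range(A)]
act_b = [random.random() for _ in range(B)]

def assay(comp):
    if random.random() < 0.1:            # synthesis failure -> missing data
        return None
    return act_a[comp[0]] + act_b[comp[1]]

def ga(pop_size=12, cycles=10, elite=4):
    pop = [(random.randrange(A), random.randrange(B)) for _ in range(pop_size)]
    scores = {}                          # only successfully assayed compounds
    for _ in range(cycles):
        for comp in pop:
            if comp not in scores:       # failed compounds may be retried later
                y = assay(comp)
                if y is not None:
                    scores[comp] = y
        ranked = sorted(scores, key=scores.get, reverse=True)[:elite]
        # Crossover among the elite, plus two random immigrants for diversity.
        pop = [(random.choice(ranked)[0], random.choice(ranked)[1])
               for _ in range(pop_size - 2)]
        pop += [(random.randrange(A), random.randrange(B)) for _ in range(2)]
    return max(scores, key=scores.get)

best = ga()
print("best compound found:", best)
```

    Missing data is handled simply by leaving failed compounds out of the score table, so they neither mislead selection nor block a later retry, which mirrors the robustness property the abstract emphasizes.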

  6. Automated Lead Optimization of MMP-12 Inhibitors Using a Genetic Algorithm

    PubMed Central

    2010-01-01

    Traditional lead optimization projects involve long synthesis and testing cycles, favoring extensive structure−activity relationship (SAR) analysis and molecular design steps, in an attempt to limit the number of cycles that a project must run to optimize a development candidate. Microfluidic-based chemistry and biology platforms, with cycle times of minutes rather than weeks, lend themselves to unattended autonomous operation. The bottleneck in the lead optimization process is therefore shifted from synthesis or test to SAR analysis and design. As such, the way is open to an algorithm-directed process, without the need for detailed user data analysis. Here, we present results of two synthesis and screening experiments, undertaken using traditional methodology, to validate a genetic algorithm optimization process for future application to a microfluidic system. The algorithm has several novel features that are important for the intended application. For example, it is robust to missing data and can suggest compounds for retest to ensure reliability of optimization. The algorithm is first validated on a retrospective analysis of an in-house library embedded in a larger virtual array of presumed inactive compounds. In a second, prospective experiment with MMP-12 as the target protein, 140 compounds are submitted for synthesis over 10 cycles of optimization. Comparison is made to the results from the full combinatorial library that was synthesized manually and tested independently. The results show that compounds selected by the algorithm are heavily biased toward the more active regions of the library, while the algorithm is robust to both missing data (compounds where synthesis failed) and inactive compounds. This publication places the full combinatorial library and biological data into the public domain with the intention of advancing research into algorithm-directed lead optimization methods. PMID:24900251

  7. Development of a chromatographic method with multi-criteria decision making design for simultaneous determination of nifedipine and atenolol in content uniformity testing.

    PubMed

    Ahmed, Sameh; Alqurshi, Abdulmalik; Mohamed, Abdel-Maaboud Ismail

    2018-07-01

    A new robust and reliable high-performance liquid chromatography (HPLC) method with a multi-criteria decision making (MCDM) approach was developed to allow simultaneous quantification of atenolol (ATN) and nifedipine (NFD) in content uniformity testing. Felodipine (FLD) was used as an internal standard (I.S.) in this study. A new interactive response optimizer was coupled with the HPLC method for multiple response optimization of the target responses, serving as a decision and prediction tool for their optimal settings according to specified criteria, based on Derringer's desirability. Four independent variables were considered: acetonitrile percentage, buffer pH, buffer concentration, and column temperature. Eight responses were optimized: the retention times of ATN, NFD, and FLD; the resolutions between ATN/NFD and NFD/FLD; and the plate numbers for ATN, NFD, and FLD. Multiple regression analysis was applied to screen the most significant variables for the regression models. The experimental design was set to give minimum retention times and maximum resolutions and plate numbers. The interactive response optimizer predicted optimum conditions according to these criteria with a good composite desirability value of 0.98156. The developed method was validated according to the International Conference on Harmonization (ICH) guidelines with the aid of the experimental design. The developed MCDM-HPLC method showed superior robustness and resolution in a short analysis time, allowing successful simultaneous content uniformity testing of ATN and NFD in marketed capsules. The current work presents an interactive response optimizer as an efficient platform to optimize, predict responses, and validate HPLC methodology with a tolerable design space for assays in quality control laboratories. Copyright © 2018 Elsevier B.V. All rights reserved.
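    Derringer's desirability, which underlies the composite value reported above, reduces multiple responses to a single score: each response is mapped to [0, 1] by a one-sided desirability function and the overall desirability is the geometric mean. A minimal sketch follows; the response values and acceptability limits are hypothetical, not the paper's.

```python
def d_smaller_is_better(y, low, high):
    """Desirability for a response to minimize (e.g. retention time):
    1 at or below `low`, 0 at or above `high`, linear in between."""
    if y <= low:
        return 1.0
    if y >= high:
        return 0.0
    return (high - y) / (high - low)

def d_larger_is_better(y, low, high):
    """Desirability for a response to maximize (e.g. resolution, plate count)."""
    if y >= high:
        return 1.0
    if y <= low:
        return 0.0
    return (y - low) / (high - low)

def composite_desirability(ds):
    """Derringer's overall desirability: geometric mean of the individual d_i."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# Hypothetical operating point: three retention times (min), two resolutions,
# one plate number; the limits are illustrative acceptability criteria.
ds = [d_smaller_is_better(t, 2.0, 10.0) for t in (3.1, 4.8, 6.2)]
ds += [d_larger_is_better(r, 1.5, 4.0) for r in (3.6, 3.9)]
ds += [d_larger_is_better(8500, 2000, 10000)]
D = composite_desirability(ds)
print(f"composite desirability D = {D:.3f}")
```

    Because the geometric mean is zero whenever any single desirability is zero, a candidate chromatographic condition that fails even one criterion is rejected outright, which is the property that makes the composite suitable for multi-criteria method optimization.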

  8. Launch Vehicle Propulsion Parameter Design Multiple Selection Criteria

    NASA Technical Reports Server (NTRS)

    Shelton, Joey Dewayne

    2004-01-01

    The optimization tool described herein addresses and emphasizes the use of computer tools to model a system and focuses on a concept development approach for a liquid hydrogen/liquid oxygen single-stage-to-orbit system, but more particularly the development of the optimized system using new techniques. This methodology uses new and innovative tools to run Monte Carlo simulations, genetic algorithm solvers, and statistical models in order to optimize a design concept. The concept launch vehicle and propulsion system were modeled and optimized to determine the best design for weight and cost by varying design and technology parameters. Uncertainty levels were applied using Monte Carlo simulations and the model output was compared to the National Aeronautics and Space Administration Space Shuttle Main Engine. Several key conclusions from the model results are summarized here. First, the Gross Liftoff Weight and Dry Weight were 67% higher for the case minimizing Design, Development, Test and Evaluation cost when compared to the weights determined by the Gross Liftoff Weight minimization case. In turn, the Design, Development, Test and Evaluation cost was 53% higher for the optimized Gross Liftoff Weight case when compared to the cost determined by the case minimizing Design, Development, Test and Evaluation cost. Thus, a 53% increase in Design, Development, Test and Evaluation cost buys a 67% reduction in Gross Liftoff Weight. Secondly, the tool outputs define the sensitivity of propulsion parameters, technology and cost factors, and how these parameters differ when cost and weight are optimized separately. A key finding was that, for a Space Shuttle Main Engine thrust level, an oxidizer/fuel ratio of 6.6 resulted in the lowest Gross Liftoff Weight rather than the ratio of 5.2 that maximizes specific impulse, demonstrating the relationships between specific impulse, engine weight, tank volume and tank weight. Lastly, the optimum chamber pressure for Gross Liftoff Weight minimization was 2713 pounds per square inch, as compared to 3162 for the Design, Development, Test and Evaluation cost optimization case; both are close to the 3000 pounds per square inch of the Space Shuttle Main Engine.

  9. Simulation Modeling to Compare High-Throughput, Low-Iteration Optimization Strategies for Metabolic Engineering

    PubMed Central

    Heinsch, Stephen C.; Das, Siba R.; Smanski, Michael J.

    2018-01-01

    Increasing the final titer of a multi-gene metabolic pathway can be viewed as a multivariate optimization problem. While numerous multivariate optimization algorithms exist, few are specifically designed to accommodate the constraints posed by genetic engineering workflows. We present a strategy for optimizing expression levels across an arbitrary number of genes that requires few design-build-test iterations. We compare the performance of several optimization algorithms on a series of simulated expression landscapes. We show that optimal experimental design parameters depend on the degree of landscape ruggedness. This work provides a theoretical framework for designing and executing numerical optimization on multi-gene systems. PMID:29535690

  10. Dispositional and explanatory style optimism as potential moderators of the relationship between hopelessness and suicidal ideation.

    PubMed

    Hirsch, Jameson K; Conner, Kenneth R

    2006-12-01

    To test the hypothesis that higher levels of optimism reduce the association between hopelessness and suicidal ideation, 284 college students completed self-report measures of optimism and Beck scales for hopelessness, suicidal ideation, and depression. A statistically significant interaction between hopelessness and one measure of optimism was obtained, consistent with the hypothesis that optimism moderates the relationship between hopelessness and suicidal ideation. Hopelessness is not inevitably associated with suicidal ideation. Optimism may be an important moderator of the association. The development of treatments to enhance optimism may complement standard treatments to reduce suicidality that target depression and hopelessness.

  11. Optimization and experimental validation of a thermal cycle that maximizes entropy coefficient fisher identifiability for lithium iron phosphate cells

    NASA Astrophysics Data System (ADS)

    Mendoza, Sergio; Rothenberger, Michael; Hake, Alison; Fathy, Hosam

    2016-03-01

    This article presents a framework for optimizing the thermal cycle to estimate a battery cell's entropy coefficient at 20% state of charge (SOC). Our goal is to maximize Fisher identifiability: a measure of the accuracy with which a parameter can be estimated. Existing protocols in the literature for estimating entropy coefficients demand excessive laboratory time. Identifiability optimization makes it possible to achieve comparable accuracy levels in a fraction of the time. This article demonstrates this result for a set of lithium iron phosphate (LFP) cells. We conduct a 24-h experiment to obtain benchmark measurements of their entropy coefficients. We optimize a thermal cycle to maximize parameter identifiability for these cells. This optimization proceeds with respect to the coefficients of a Fourier discretization of this thermal cycle. Finally, we compare the estimated parameters using (i) the benchmark test, (ii) the optimized protocol, and (iii) a 15-h test from the literature (by Forgez et al.). The results are encouraging for two reasons. First, they confirm the simulation-based prediction that the optimized experiment can produce accurate parameter estimates in 2 h, compared to 15-24 h. Second, the optimized experiment also estimates a thermal time constant representing the effects of thermal capacitance and convection heat transfer.
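    The intuition behind Fisher identifiability can be sketched with a scalar linear-Gaussian measurement model (an illustration, not the paper's battery model): if y_i = f_i + theta * s_i + noise with known sensitivities s_i and i.i.d. Gaussian noise of standard deviation sigma, then the Fisher information is FI = sum_i s_i^2 / sigma^2, and 1/sqrt(FI) is the Cramer-Rao lower bound on the standard deviation of any unbiased estimate of theta. A more aggressive excitation profile therefore pins the parameter down more tightly in the same amount of time. The profiles and numbers below are invented.

```python
import math

def fisher_information(sens, sigma):
    """Scalar Fisher information for theta in y_i = f_i + theta*s_i + noise."""
    return sum(s * s for s in sens) / sigma ** 2

sigma = 0.05
t = [0.01 * i for i in range(1000)]                # 1000 samples over [0, 10)
# Gentle benchmark-like cycle vs. an aggressive optimized-like cycle: same
# duration and sampling, but 4x the excitation amplitude.
gentle = [0.2 * math.sin(2 * math.pi * ti / 10) for ti in t]
aggressive = [0.8 * math.sin(2 * math.pi * ti / 2) for ti in t]

fi_gentle = fisher_information(gentle, sigma)
fi_aggr = fisher_information(aggressive, sigma)
crlb = lambda fi: 1.0 / math.sqrt(fi)              # Cramer-Rao lower bound
print(f"CRLB gentle: {crlb(fi_gentle):.4f}, aggressive: {crlb(fi_aggr):.4f}")
```

    Quadrupling the excitation amplitude multiplies the Fisher information by sixteen and halves the estimation error bound twice over, which is why optimizing the cycle can match a much longer benchmark test in a fraction of the time.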

  12. Modeling, Analysis, and Optimization Issues for Large Space Structures

    NASA Technical Reports Server (NTRS)

    Pinson, L. D. (Compiler); Amos, A. K. (Compiler); Venkayya, V. B. (Compiler)

    1983-01-01

    Topics concerning the modeling, analysis, and optimization of large space structures are discussed including structure-control interaction, structural and structural dynamics modeling, thermal analysis, testing, and design.

  13. Support Vector Machine Based on Adaptive Acceleration Particle Swarm Optimization

    PubMed Central

    Abdulameer, Mohammed Hasan; Othman, Zulaiha Ali

    2014-01-01

    Existing face recognition methods utilize particle swarm optimizer (PSO) and opposition based particle swarm optimizer (OPSO) to optimize the parameters of SVM. However, the utilization of random values in the velocity calculation decreases the performance of these techniques; that is, during the velocity computation, we normally use random values for the acceleration coefficients and this creates randomness in the solution. To address this problem, an adaptive acceleration particle swarm optimization (AAPSO) technique is proposed. To evaluate our proposed method, we employ both face and iris recognition based on AAPSO with SVM (AAPSO-SVM). In the face and iris recognition systems, performance is evaluated using two human face databases, YALE and CASIA, and the UBiris dataset. In this method, we initially perform feature extraction and then recognition on the extracted features. In the recognition process, the extracted features are used for SVM training and testing. During the training and testing, the SVM parameters are optimized with the AAPSO technique, and in AAPSO, the acceleration coefficients are computed using the particle fitness values. The parameters in SVM, which are optimized by AAPSO, perform efficiently for both face and iris recognition. A comparative analysis between our proposed AAPSO-SVM and the PSO-SVM technique is presented. PMID:24790584
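    The core idea of fitness-adaptive acceleration coefficients can be sketched on a standard toy objective. Everything below is an illustrative stand-in: the exact AAPSO coefficient formula in the paper differs, and the sphere function replaces the SVM-parameter objective.

```python
import random

random.seed(42)

def sphere(x):                            # standard test objective, minimum 0 at origin
    return sum(xi * xi for xi in x)

def aapso(f, dim=5, n=20, iters=200, w=0.7):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        fs = [f(p) for p in pos]
        fmin, fmax = min(fs), max(fs)
        for i in range(n):
            # Fitness-adaptive acceleration: poor particles lean on their own
            # memory, good particles exploit the swarm best (illustrative scaling).
            badness = (fs[i] - fmin) / (fmax - fmin + 1e-12)
            c1, c2 = 1.0 + badness, 2.0 - badness
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest_f

best_val = aapso(sphere)
print(f"best objective found: {best_val:.6f}")
```

    In the paper's application the objective would be SVM cross-validation accuracy over its hyperparameters rather than the sphere function; the swarm mechanics stay the same.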

  14. The impact of child's severity on quality-of-life among parents of children with autism spectrum disorder: the mediating role of optimism.

    PubMed

    Wisessathorn, Manika; Chanuantong, Tanasugarn; Fisher, Edwin B

    2013-10-01

    To investigate the impact of child severity and optimism on quality-of-life in parents of children with Autism Spectrum Disorder (ASD), and to evaluate the role of optimism as a mediator between the child's severity and parental quality-of-life, 303 parents of children with ASD were recruited from local autistic centers and schools in Bangkok, Thailand. A demographic information sheet, the Childhood Autism Rating Scale (CARS), the Life Orientation Test-Revised (LOT-R), and the WHOQOL-BREF were used to collect parental information. Using Pearson correlation, a significant negative association was found between the child's severity and parental quality-of-life, while optimism correlated positively with parental outcomes. The findings from path analysis confirmed that the ASD child's impairment of language and repetitive behavior were associated with optimism, which in turn predicted the level of parental quality-of-life in all domains. The current findings support the role of optimism as a mediator between the child's severity and parental quality-of-life. Implications for developing interventions focused on enhancing parents' optimism are discussed.

  15. Simulated parallel annealing within a neighborhood for optimization of biomechanical systems.

    PubMed

    Higginson, J S; Neptune, R R; Anderson, F C

    2005-09-01

    Optimization problems for biomechanical systems have become extremely complex. Simulated annealing (SA) algorithms have performed well in a variety of test problems and biomechanical applications; however, despite advances in computer speed, the time to converge to optimal solutions for systems of even moderate complexity has remained prohibitive. The objective of this study was to develop a portable parallel version of a SA algorithm for solving optimization problems in biomechanics. The algorithm for simulated parallel annealing within a neighborhood (SPAN) was designed to minimize interprocessor communication time and closely retain the heuristics of the serial SA algorithm. The computational speed of the SPAN algorithm scaled linearly with the number of processors on different computer platforms for a simple quadratic test problem and for a more complex forward dynamic simulation of human pedaling.
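    The serial SA heuristics that SPAN parallelizes can be shown in a minimal sketch on a quadratic test problem like the one mentioned in the abstract (the objective, cooling schedule, and step size below are illustrative choices, not the study's):

```python
import math
import random

random.seed(3)

def quadratic(x):                         # simple test problem, minimum 0 at (1,...,1)
    return sum((xi - 1.0) ** 2 for xi in x)

def anneal(f, dim=4, t0=10.0, cooling=0.995, iters=4000, step=0.5):
    x = [random.uniform(-10.0, 10.0) for _ in range(dim)]
    fx = f(x)
    best, best_f = x[:], fx
    t = t0
    for _ in range(iters):
        y = [xi + random.gauss(0.0, step) for xi in x]
        fy = f(y)
        # Metropolis criterion: always accept improvements; accept uphill moves
        # with probability exp(-delta/T), so early (hot) iterations can escape
        # local minima while late (cold) iterations refine the solution.
        if fy < fx or random.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < best_f:
                best, best_f = x[:], fx
        t *= cooling                      # geometric cooling schedule
    return best, best_f

best, best_f = anneal(quadratic)
print(f"best objective found: {best_f:.4f}")
```

    SPAN's contribution is to evaluate candidate moves on multiple processors within a neighborhood while keeping this acceptance logic intact, which is why its speedup can scale linearly with processor count.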

  16. Optimal quantum networks and one-shot entropies

    NASA Astrophysics Data System (ADS)

    Chiribella, Giulio; Ebler, Daniel

    2016-09-01

    We develop a semidefinite programming method for the optimization of quantum networks, including both causal networks and networks with indefinite causal structure. Our method applies to a broad class of performance measures, defined operationally in terms of interactive tests set up by a verifier. We show that the optimal performance is equal to a max relative entropy, which quantifies the informativeness of the test. Building on this result, we extend the notion of conditional min-entropy from quantum states to quantum causal networks. The optimization method is illustrated in a number of applications, including the inversion, charge conjugation, and controlization of an unknown unitary dynamics. In the non-causal setting, we show a proof-of-principle application to the maximization of the winning probability in a non-causal quantum game.

  17. Interactive optimization approach for optimal impulsive rendezvous using primer vector and evolutionary algorithms

    NASA Astrophysics Data System (ADS)

    Luo, Ya-Zhong; Zhang, Jin; Li, Hai-yang; Tang, Guo-Jin

    2010-08-01

    In this paper, a new optimization approach combining primer vector theory and evolutionary algorithms for fuel-optimal non-linear impulsive rendezvous is proposed. The optimization approach is designed to seek the optimal number of impulses as well as the optimal impulse vectors. In this optimization approach, adding a midcourse impulse is determined by an interactive method, i.e. observing the primer-magnitude time history. An improved version of simulated annealing is employed to optimize the rendezvous trajectory with a fixed number of impulses. This interactive approach is evaluated by three test cases: coplanar circle-to-circle rendezvous, same-circle rendezvous and non-coplanar rendezvous. The results show that the interactive approach is effective and efficient in fuel-optimal non-linear rendezvous design. It can guarantee solutions that satisfy Lawden's necessary optimality conditions.

  18. A Novel Consensus-Based Particle Swarm Optimization-Assisted Trust-Tech Methodology for Large-Scale Global Optimization.

    PubMed

    Zhang, Yong-Feng; Chiang, Hsiao-Dong

    2017-09-01

    A novel three-stage methodology, termed the "consensus-based particle swarm optimization (PSO)-assisted Trust-Tech methodology," to find global optimal solutions for nonlinear optimization problems is presented. It is composed of Trust-Tech methods, consensus-based PSO, and local optimization methods that are integrated to compute a set of high-quality local optimal solutions that can contain the global optimal solution. The proposed methodology compares very favorably with several recently developed PSO algorithms based on a set of small-dimension benchmark optimization problems and 20 large-dimension test functions from the CEC 2010 competition. The analytical basis for the proposed methodology is also provided. Experimental results demonstrate that the proposed methodology can rapidly obtain high-quality optimal solutions that can contain the global optimal solution. The scalability of the proposed methodology is promising.

  19. A programmable optimization environment using the GAMESS-US and MERLIN/MCL packages. Applications on intermolecular interaction energies

    NASA Astrophysics Data System (ADS)

    Kalatzis, Fanis G.; Papageorgiou, Dimitrios G.; Demetropoulos, Ioannis N.

    2006-09-01

    The Merlin/MCL optimization environment and the GAMESS-US package were combined to offer an extended and efficient quantum chemistry optimization system, capable of implementing complex optimization strategies for generic molecular modeling problems. A communication and data exchange interface was established between the two packages, exploiting all Merlin features such as multiple optimizers, box constraints, user extensions and a high-level programming language. An important feature of the interface is its ability to perform dimer computations by eliminating the basis set superposition error using the counterpoise (CP) method of Boys and Bernardi. Furthermore, it offers CP-corrected geometry optimizations using analytic derivatives. The unified optimization environment was applied to construct portions of the intermolecular potential energy surface of the weakly bound H-bonded complex C6H6-H2O by utilizing the high-level Merlin Control Language. The H-bonded dimer HF-H2O was also studied by CP-corrected geometry optimization. The ab initio electronic structure energies were calculated using the 6-31G** basis set at the Restricted Hartree-Fock and second-order Moller-Plesset levels, while all geometry optimizations were carried out using a quasi-Newton algorithm provided by Merlin.
    Program summary. Title of program: MERGAM. Catalogue identifier: ADYB_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADYB_v1_0. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computer for which the program is designed and others on which it has been tested: the program is designed for machines running the UNIX operating system; it has been tested on IA32 (Linux with gcc/g77 v.3.2.3), AMD64 (Linux with the Portland Group compilers v.6.0), SUN64 (SunOS 5.8 with the Sun Workshop compilers v.5.2) and SGI64 (IRIX 6.5 with the MIPSpro compilers v.7.4). Installations: University of Ioannina, Greece. Operating systems under which the program has been tested: UNIX. Programming languages used: ANSI C, ANSI Fortran-77. No. of lines in distributed program, including test data, etc.: 11 282. No. of bytes in distributed program, including test data, etc.: 49 458. Distribution format: tar.gz. Memory required to execute with typical data: depends mainly on the selected GAMESS-US basis set and the number of atoms. No. of bits in a word: 32. No. of processors used: 1. Has the code been vectorized or parallelized?: no.
    Nature of physical problem: multidimensional geometry optimization is of great importance in any ab initio calculation, since it is usually one of the most CPU-intensive tasks, especially on large molecular systems. For example, the geometric and energetic description of van der Waals and weakly bound H-bonded complexes requires the construction of the relevant portions of the multidimensional intermolecular potential energy surface (IPES), so that the various views held about the nature of these bonds can be quantitatively tested. Method of solution: the Merlin/MCL optimization environment was interconnected with the GAMESS-US package to facilitate geometry optimization in quantum chemistry problems; mapping the important portions of the IPES requires the capability to program optimization strategies, and the Merlin/MCL environment was used to implement them. In this work, a CP-corrected geometry optimization was performed on the HF-H2O complex and an MCL program was developed to study portions of the potential energy surface of the C6H6-H2O complex. Restrictions on the complexity of the problem: the Merlin optimization environment and the GAMESS-US package must be installed; the MERGAM interface requires GAMESS-US input files constructed in Cartesian coordinates, a design-time requirement that atomic coordinates not be reoriented (always satisfied when the COORD = UNIQUE keyword is used in a GAMESS-US input file). Typical running time: depends on the size of the molecular system, the size of the basis set and the method of electron correlation; execution of the test run took approximately 5 min on a 2.8 GHz Intel Pentium CPU.

  20. Efficacy of the Aussie Optimism Program: Promoting Pro-social Behavior and Preventing Suicidality in Primary School Students. A Randomised-Controlled Trial

    PubMed Central

    Roberts, Clare M.; Kane, Robert T.; Rooney, Rosanna M.; Pintabona, Yolanda; Baughman, Natalie; Hassan, Sharinaz; Cross, Donna; Zubrick, Stephen R.; Silburn, Sven R.

    2018-01-01

    The efficacy of an enhanced version of the Aussie Optimism Program (AOP) was investigated in a cluster randomized controlled trial. Grade 6 students aged 10–11 years (N = 2288) from 63 government primary schools in Perth, Western Australia, participated in the pre-test, post-test, and follow-up study. Schools were randomly assigned to one of three conditions: Aussie Optimism with teacher training, Aussie Optimism with teacher training plus coaching, or a usual-care condition that received the regular Western Australian Health Education Curriculum. Students in the Aussie Optimism conditions received twenty 1-h lessons on social and interpersonal skills and optimistic thinking skills over the last 2 years of primary school. Parents in the active conditions received a parent information booklet each year, plus a self-directed program in Grade 7. Students and parents completed the Extended Strengths and Difficulties Questionnaire. Students who scored in the clinical range on the Emotional Symptoms Scale were given the Diagnostic Interview for Children and Adolescents IV to assess suicidal ideation and behavior, and depressive and anxiety disorders. Results indicated that Aussie Optimism with teacher training plus coaching was associated with the best outcomes: a significant increase in student-reported pro-social behavior from pre-test to post-test 1 (maintained at post-test 2) and significantly lower incidence rates of suicidal ideation at post-test 2 and follow-up. No significant intervention effects on anxiety and depressive disorders or total difficulties were reported. These findings suggest that the AOP with teacher training plus coaching may have the potential to positively impact suicidality and pro-social behavior in the pre-adolescent years. PMID:29599729

  1. Test Scheduling for Core-Based SOCs Using Genetic Algorithm Based Heuristic Approach

    NASA Astrophysics Data System (ADS)

    Giri, Chandan; Sarkar, Soumojit; Chattopadhyay, Santanu

    This paper presents a Genetic Algorithm (GA) based solution that co-optimizes test scheduling and wrapper design for core-based SOCs. Core testing solutions are generated as a set of wrapper configurations, represented as rectangles with width equal to the number of TAM (Test Access Mechanism) channels and height equal to the corresponding testing time. A locally optimal best-fit bin-packing heuristic is used to place the rectangles so as to minimize the overall test time, while the GA evolves the sequence in which rectangles are considered for placement. Experimental results on the ITC'02 benchmark SOCs show that the proposed method provides better solutions than recent works reported in the literature.
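    The locally optimal best-fit placement described above might be sketched as follows; the rectangle data and TAM width are invented, and the GA that orders the rectangles is omitted (the function simply places a given sequence).

```python
def schedule(rects, total_tam):
    # Each core test is a rectangle (tam_width, test_time). Channels keep a
    # "busy-until" time; a rectangle goes on the contiguous channel window
    # whose current maximum busy-until time is smallest (locally best fit).
    busy = [0.0] * total_tam
    placements = []
    for w, t in rects:
        start = min(range(total_tam - w + 1),
                    key=lambda s: max(busy[s:s + w]))
        begin = max(busy[start:start + w])
        for ch in range(start, start + w):
            busy[ch] = begin + t
        placements.append((start, begin))
    return max(busy), placements

# Three hypothetical core tests on a 4-channel TAM; a GA would search over
# the order in which the rectangles are presented to this placer.
makespan, layout = schedule([(2, 10.0), (2, 7.0), (4, 3.0)], total_tam=4)
```

    Here the first two tests run in parallel on channels 0-1 and 2-3, and the full-width test starts once both finish, for a makespan of 13 time units.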

  2. Machining Parameters Optimization using Hybrid Firefly Algorithm and Particle Swarm Optimization

    NASA Astrophysics Data System (ADS)

    Farahlina Johari, Nur; Zain, Azlan Mohd; Haszlinna Mustaffa, Noorfa; Udin, Amirmudin

    2017-09-01

    The Firefly Algorithm (FA) is a metaheuristic inspired by the flashing behavior of fireflies and the phenomenon of bioluminescent communication; in this research it is used to optimize the machining parameters (feed rate, depth of cut, and spindle speed). The algorithm is hybridized with Particle Swarm Optimization (PSO) to discover better solutions when exploring the search space. The objective function from previous research is used to optimize the machining parameters in a turning operation. The optimal cutting parameters estimated by FA, which lead to a minimum surface roughness, are validated using an ANOVA test.
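    A bare-bones firefly algorithm can illustrate the attraction mechanism; the hybridization with PSO and the machining objective function from the paper are not reproduced, and the quadratic objective below is an invented stand-in for surface roughness.

```python
import math
import random

def firefly(cost, dim, bounds, n=20, iters=100, beta0=1.0, gamma=0.01,
            alpha=0.2, seed=7):
    # Every firefly moves toward each brighter (lower-cost) firefly with
    # attractiveness beta0*exp(-gamma*r^2); alpha adds a small random walk
    # and is decayed each sweep so the swarm settles.
    rng = random.Random(seed)
    lo, hi = bounds
    x = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    f = [cost(p) for p in x]
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if f[j] < f[i]:  # j is brighter: move i toward j
                    r2 = sum((x[i][d] - x[j][d]) ** 2 for d in range(dim))
                    beta = beta0 * math.exp(-gamma * r2)
                    for d in range(dim):
                        x[i][d] += (beta * (x[j][d] - x[i][d])
                                    + alpha * (rng.random() - 0.5))
                    f[i] = cost(x[i])
        alpha *= 0.97
    best = min(range(n), key=lambda k: f[k])
    return x[best], f[best]

# Invented quadratic stand-in for the roughness objective (minimum 0 at the
# origin); feed rate, depth of cut and spindle speed would each map to one
# coordinate in a real setup.
point, roughness = firefly(lambda v: sum(t * t for t in v), dim=3,
                           bounds=(-5, 5))
```

    Note that gamma is scaled to the search domain here; with a domain of width 10, a much larger gamma would make the attraction term vanish.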

  3. The WHOMEN’s Scale (Women’s HAART Optimism Monitoring and EvaluatioN Scale v.1) and the Association with Fertility Intentions and Sexual Behaviours Among HIV-Positive Women in Uganda

    PubMed Central

    Lima, Viviane Dias; Andia, Irene; Kabakyenga, Jerome; Mbabazi, Pamela; Emenyonu, Nneka; Patterson, Thomas L.; Hogg, Robert S.; Bangsberg, David R.

    2013-01-01

    The objective of this study was to develop a reliable HAART optimism scale among HIV-positive women in Uganda and to test the scale’s validity against measures of fertility intentions, sexual activity, and unprotected sexual intercourse. We used cross-sectional survey data from 540 women (18–50 years) attending Mbarara University’s HIV clinic in Uganda. Women were asked how much they agreed or disagreed with 23 statements about HAART. Data were subjected to principal component and factor analyses. Subsequently, we tested the association between the scale and fertility intentions and sexual behaviour using the Wilcoxon rank sum test. Factor analysis yielded three factors, one of which was an eight-item HAART optimism scale with moderately high internal consistency (α = 0.70). Women who reported that they intended to have (more) children had significantly higher HAART optimism scores (median = 13.5 [IQR: 12–16]) than women who did not intend to have (more) children (median = 10.5 [IQR: 8–12]; P < 0.0001). Similarly, women who were sexually active and who reported practicing unprotected sexual intercourse had significantly higher HAART optimism scores than women who were sexually abstinent or who practiced protected sexual intercourse. Our reliable and valid scale, termed the Women’s HAART Optimism Monitoring and EvaluatioN scale (WHOMEN’s scale), may be valuable to broader studies investigating the role of HAART optimism in the reproductive intentions and sexual behaviours of HIV-positive women in high HIV-prevalence settings. PMID:19387819
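    The internal-consistency figure reported above (α = 0.70) is a Cronbach's alpha. As a reminder of how such a coefficient is computed, here is a minimal sketch on invented data (not the study's): five hypothetical respondents scoring a three-item scale.

```python
from statistics import variance

def cronbach_alpha(items):
    # items: one list of respondent scores per scale item (columns).
    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    k = len(items)
    n = len(items[0])
    totals = [sum(col[r] for col in items) for r in range(n)]
    return k / (k - 1) * (1 - sum(variance(col) for col in items)
                          / variance(totals))

# Hypothetical responses: five respondents, three items.
data = [[3, 4, 5, 2, 4], [3, 5, 4, 2, 5], [2, 4, 5, 3, 4]]
alpha = cronbach_alpha(data)
```

    Alpha rises when items covary strongly relative to their individual variances, which is why correlated items on the eight-item scale push it toward the reported 0.70.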

  4. The Application of Optimal Defaults to Improve Elementary School Lunch Selections: Proof of Concept

    ERIC Educational Resources Information Center

    Loeb, Katharine L.; Radnitz, Cynthia; Keller, Kathleen L.; Schwartz, Marlene B.; Zucker, Nancy; Marcus, Sue; Pierson, Richard N.; Shannon, Michael; DeLaurentis, Danielle

    2018-01-01

    Background: In this study, we applied behavioral economics to optimize elementary school lunch choices via parent-driven decisions. Specifically, this experiment tested an optimal defaults paradigm, examining whether strategically manipulating the health value of a default menu could be co-opted to improve school-based lunch selections. Methods:…

  5. Aerodynamic design using numerical optimization

    NASA Technical Reports Server (NTRS)

    Murman, E. M.; Chapman, G. T.

    1983-01-01

    The procedure of using numerical optimization methods coupled with computational fluid dynamic (CFD) codes for the development of an aerodynamic design is examined. Several approaches that replace wind tunnel tests, develop pressure distributions and derive designs, or fulfill preset design criteria are presented. The method of Aerodynamic Design by Numerical Optimization (ADNO) is described and illustrated with examples.

  6. Semidefinite Relaxation-Based Optimization of Multiple-Input Wireless Power Transfer Systems

    NASA Astrophysics Data System (ADS)

    Lang, Hans-Dieter; Sarris, Costas D.

    2017-11-01

    An optimization procedure for multi-transmitter (MISO) wireless power transfer (WPT) systems based on tight semidefinite relaxation (SDR) is presented. This method ensures physical realizability of MISO WPT systems designed via convex optimization, a robust, semi-analytical and intuitive route to optimizing such systems. To that end, the nonconvex constraints requiring that power is fed into rather than drawn from the system via all transmitter ports are incorporated in a convex semidefinite relaxation, which is efficiently and reliably solvable by dedicated algorithms. A test of the solution then confirms that the modified problem is equivalent to the original nonconvex one (the relaxation is tight) and that the true global optimum has been found. This is a clear advantage over global optimization methods (e.g. genetic algorithms), where convergence to the true global optimum cannot be ensured or tested. Discussions of numerical results yielded by both the closed-form expressions and the refined technique illustrate the importance and practicability of the new method. It is shown that this technique offers a rigorous optimization framework for a broad range of current and emerging WPT applications.

  7. Brief report: Assessing dispositional optimism in adolescence--factor structure and concurrent validity of the Life Orientation Test--Revised.

    PubMed

    Monzani, Dario; Steca, Patrizia; Greco, Andrea

    2014-02-01

    Dispositional optimism is an individual difference promoting psychosocial adjustment and well-being during adolescence. Dispositional optimism was originally defined as a one-dimensional construct; however, empirical evidence suggests two correlated factors in the Life Orientation Test - Revised (LOT-R). The main aim of the study was to evaluate the dimensionality of the LOT-R. This study is the first attempt to identify the best factor structure, comparing congeneric, two correlated-factor, and two orthogonal-factor models in a sample of adolescents. Concurrent validity was also assessed. The results demonstrated the superior fit of the two orthogonal-factor model thus reconciling the one-dimensional definition of dispositional optimism with the bi-dimensionality of the LOT-R. Moreover, the results of correlational analyses proved the concurrent validity of this self-report measure: optimism is moderately related to indices of psychosocial adjustment and well-being. Thus, the LOT-R is a useful, valid, and reliable self-report measure to properly assess optimism in adolescence. Copyright © 2013 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  8. A Longitudinal Examination of Hope and Optimism and Their Role in Type 1 Diabetes in Youths

    PubMed Central

    Steele, Ric G.; Nelson, Michael B.; Peugh, James; Egan, Anna; Clements, Mark; Patton, Susana R.

    2016-01-01

    Objectives To test the longitudinal associations between hope and optimism and health outcomes (i.e., HbA1c and self-monitored blood glucose [SMBG]) among youths with Type 1 diabetes mellitus (T1DM) over a 6-month period. Methods A total of 110 participants (aged 10–16 years) completed study measures at Time 1, and 81 completed measures at Time 2. Analyses examined hope and optimism as predictors of change in health outcomes, and examined SMBG as a mediator of the relationship between hope and optimism, and HbA1c. Results Change in hope, but not optimism, was associated with change in SMBG and HbA1c. Change in SMBG mediated the relationship between change in hope and HbA1c, but not between optimism and HbA1c. Conclusions It may be beneficial to assess hope in pediatric T1DM patients to identify youths who may be at risk for poor diabetes management, and to test the benefit of hope-based intervention efforts in clinical studies. PMID:26628250

  9. Investigating the detection of multi-homed devices independent of operating systems

    DTIC Science & Technology

    2017-09-01

    Timestamp data was used to estimate clock skews using linear regression and linear optimization methods. Analysis revealed that detection depends on the consistency of the estimated clock skew. Through vertical testing, it was also shown that clock skew consistency depends on the installed...
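    The clock-skew estimation named in the abstract is, in its simplest form, an ordinary least-squares regression of timestamp offset against local time; the sketch below uses a synthetic noise-free trace, not the study's data.

```python
def clock_skew_ppm(local_times, remote_offsets):
    # Regress the observed timestamp offset against local receive time;
    # the slope, scaled to parts-per-million, is the skew estimate.
    n = len(local_times)
    mx = sum(local_times) / n
    my = sum(remote_offsets) / n
    num = sum((x - mx) * (y - my) for x, y in zip(local_times, remote_offsets))
    den = sum((x - mx) ** 2 for x in local_times)
    return num / den * 1e6

# Synthetic trace: a device whose clock drifts +50 ppm relative to ours.
t = [i * 1.0 for i in range(100)]       # local receive times (s)
offs = [0.001 + 50e-6 * x for x in t]   # offset grows linearly with time
skew = clock_skew_ppm(t, offs)
```

    Two interfaces of a multi-homed device would, under this model, exhibit the same slope, which is the consistency property the study relies on.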

  10. A Simulation of Readiness-Based Sparing Policies

    DTIC Science & Technology

    2017-06-01

    A variant of a greedy heuristic algorithm is used to set stock levels and estimate overall WS availability. Our discrete event simulation is then used to test the...available in the optimization tools. Subject terms: readiness-based sparing, discrete event simulation, optimization, multi-indenture...

  11. IPO: a tool for automated optimization of XCMS parameters.

    PubMed

    Libiseller, Gunnar; Dvorzak, Michaela; Kleb, Ulrike; Gander, Edgar; Eisenberg, Tobias; Madeo, Frank; Neumann, Steffen; Trausinger, Gert; Sinner, Frank; Pieber, Thomas; Magnes, Christoph

    2015-04-16

    Untargeted metabolomics generates a huge amount of data, and software packages for automated data processing are crucial to process these data successfully. A variety of such packages exist, but the outcome of data processing strongly depends on the algorithm parameter settings: if they are not carefully chosen, suboptimal settings can easily lead to biased results, so the parameter settings themselves require optimization. Several parameter optimization approaches have been proposed, but a software package for parameter optimization that is free of intricate experimental labeling steps, fast, and widely applicable has been missing. We implemented the software package IPO ('Isotopologue Parameter Optimization'), which is fast, free of labeling steps, and applicable to data from different kinds of samples, different liquid chromatography-high resolution mass spectrometry methods, and different instruments. IPO optimizes XCMS peak picking parameters by using natural, stable (13)C isotopic peaks to calculate a peak picking score. Retention time correction is optimized by minimizing relative retention time differences within peak groups. Grouping parameters are optimized by maximizing the number of peak groups that show one peak from each injection of a pooled sample. The different parameter settings are generated by design of experiments, and the resulting scores are evaluated using response surface models. IPO was tested on three different data sets, each consisting of a training set and a test set. IPO resulted in an increase of reliable groups (146%-361%), a decrease of non-reliable groups (3%-8%) and a reduction of the retention time deviation to one third. IPO was successfully applied to data derived from liquid chromatography coupled to high resolution mass spectrometry from three studies with different sample types and different chromatographic methods and devices.
    We were also able to show the potential of IPO to increase the reliability of metabolomics data. The source code is implemented in R, tested on Linux and Windows, and freely available for download at https://github.com/glibiseller/IPO . The training and test sets can be downloaded from https://health.joanneum.at/IPO .

  12. Optimization of the intravenous glucose tolerance test in T2DM patients using optimal experimental design.

    PubMed

    Silber, Hanna E; Nyberg, Joakim; Hooker, Andrew C; Karlsson, Mats O

    2009-06-01

    Intravenous glucose tolerance test (IVGTT) provocations are informative, but complex and laborious, for studying the glucose-insulin system. The objective of this study was to evaluate, through optimal design methodology, the possibilities for a more informative and/or less laborious study design of the insulin-modified IVGTT in type 2 diabetic patients. A previously developed model for glucose and insulin regulation was implemented in the optimal design software PopED 2.0. The following aspects of the study design of the insulin-modified IVGTT were evaluated: (1) glucose dose, (2) insulin infusion, (3) the combination of (1) and (2), (4) sampling times, (5) exclusion of labeled glucose. Constraints were incorporated to avoid prolonged hyper- and/or hypoglycemia, and a reduced design was used to decrease run times. Design efficiency was calculated as a measure of the improvement of an optimal design over the basic design. The results showed that the design of the insulin-modified IVGTT could be substantially improved by an optimized design and that it was possible to use a reduced number of samples. Optimization of sample times gave the largest improvement, followed by insulin dose. The results further showed that it was possible to reduce the total sample time with only a minor loss in efficiency. Simulations confirmed the predictions from PopED. The predicted uncertainty of parameter estimates (CV) was low in all tested cases, despite the reduction in the number of samples per subject; the best design had a predicted average CV of parameter estimates of 19.5%. We conclude that improvements can be made to the design of the insulin-modified IVGTT and that the most important design factor was the placement of sample times, followed by the use of an optimal insulin dose. This paper illustrates how complex provocation experiments can be improved by sequential modeling and optimal design.

  13. Design Optimization and Analysis of a Composite Honeycomb Intertank

    NASA Technical Reports Server (NTRS)

    Finckenor, Jeff; Spurrier, Mile

    1999-01-01

    Intertanks, the structure between tanks of launch vehicles, are prime candidates for weight reduction of rockets. This paper discusses the optimization and detailed follow up analysis and testing of a 96 in. diameter, 77 in. tall intertank. The structure has composite face sheets with an aluminum honeycomb core. The ends taper to a thick built up laminate for a double lap bolted splice joint interface. It is made in 8 full length panels joined with bonded double lap joints. The nominal load is 4000 lb/in. Optimization is by Genetic Algorithm and minimizes weight by varying core thickness, number and orientation of acreage and buildup plies, and the size, number and spacing of bolts. A variety of design cases were run with populations up to 2000 and chromosomes as long as 150 bits. Constraints were buckling; face stresses (normal, shear, wrinkling and dimpling); bolt stress; and bolt hole stresses (bearing, net tension, wedge splitting, shear out and tension/shear out). Analysis is by a combination of elasticity solutions and empirical data. After optimization, a series of coupon tests were performed in conjunction with a rigorous analysis involving a variety of finite element models. This analysis and testing resulted in several small changes to the optimized design. The equation used for hole bearing strength was found to be inadequate, resulting in thicker ends. The core thickness increased 0.05", and potting compound was added in the taper to strengthen the facesheet bond. The intertank has undergone a 250,000 lb limit load test and been mated with a composite liquid hydrogen tank. The tank/intertank unit is being installed in a test stand where it will see 200 thermal/load cycles. Afterwards the intertank will be demated and loaded in compression to failure.

  14. The WOMEN study: what is the optimal method for ischemia evaluation in women? A multi-center, prospective, randomized study to establish the optimal method for detection of coronary artery disease (CAD) risk in women at an intermediate-high pretest likelihood of CAD: study design.

    PubMed

    Mieres, Jennifer H; Shaw, Leslee J; Hendel, Robert C; Heller, Gary V

    2009-01-01

    Coronary artery disease (CAD) remains the leading cause of morbidity and mortality in women, and the optimal non-invasive test for evaluation of ischemic heart disease in women is unknown. Although current guidelines support the choice of the exercise tolerance test (ETT) as a first-line test for women with a normal baseline ECG and adequate exercise capabilities, supportive data for this recommendation are controversial. The "What is the optimal method for ischemia evaluation in women?" (WOMEN) study was designed to determine the optimal non-invasive strategy for CAD risk detection in intermediate-high risk women presenting with chest pain or equivalent symptoms suggestive of ischemic heart disease. The study will prospectively compare the 2-year event rates in women capable of performing exercise treadmill testing or Tc-99m tetrofosmin SPECT myocardial perfusion imaging (MPI). It will enroll women presenting for the evaluation of chest pain or anginal-equivalent symptoms who are capable of performing >5 METs of exercise and are at intermediate-high pretest risk for ischemic heart disease, randomized to either ETT testing alone or Tc-99m tetrofosmin SPECT MPI. The null hypothesis is that the exercise ECG has the same negative predictive value for risk detection as gated myocardial perfusion SPECT in women. The primary aim is to compare 2-year cardiac event rates in women randomized to SPECT MPI with those randomized to ETT. The WOMEN study seeks to provide objective information for guidelines on the evaluation of symptomatic women with an intermediate-high likelihood of CAD.

  15. Hybrid maize breeding with doubled haploids: I. One-stage versus two-stage selection for testcross performance.

    PubMed

    Longin, C Friedrich H; Utz, H Friedrich; Reif, Jochen C; Schipprack, Wolfgang; Melchinger, Albrecht E

    2006-03-01

    Optimum allocation of resources is of fundamental importance for the efficiency of breeding programs. The objectives of our study were to (1) determine the optimum allocation for the number of lines and test locations in hybrid maize breeding with doubled haploids (DHs) regarding two optimization criteria, the selection gain ΔG(k) and the probability P(k) of identifying superior genotypes, (2) compare both optimization criteria including their standard deviations (SDs), and (3) investigate the influence of production costs of DHs on the optimum allocation. For different budgets, number of finally selected lines, ratios of variance components, and production costs of DHs, the optimum allocation of test resources under one- and two-stage selection for testcross performance with a given tester was determined by using Monte Carlo simulations. In one-stage selection, lines are tested in field trials in a single year. In two-stage selection, optimum allocation of resources involves evaluation of (1) a large number of lines in a small number of test locations in the first year and (2) a small number of the selected superior lines in a large number of test locations in the second year, thereby maximizing both optimization criteria. Furthermore, to have a realistic chance of identifying a superior genotype, the probability P(k) of identifying superior genotypes should be greater than 75%. For budgets between 200 and 5,000 field plot equivalents, P(k) > 75% was reached only for genotypes belonging to the best 5% of the population. As the optimum allocation for P(k)(5%) was similar to that for ΔG(k), the choice of the optimization criterion was not crucial. The production costs of DHs had only a minor effect on the optimum number of locations and on the values of the optimization criteria.
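    The Monte Carlo determination of optimum allocation might be sketched along these lines; the variance components, plot budget and allocation below are invented for illustration and do not reproduce the study's simulations.

```python
import random

def simulate_gain(n_lines, locs1, n_keep, locs2, n_final,
                  var_g=1.0, var_e=4.0, reps=500, seed=3):
    # Stage 1 screens n_lines at locs1 locations; stage 2 retests the n_keep
    # survivors at locs2 locations; the n_final best are selected. Returns
    # the mean true genetic value of the selected lines (selection gain).
    rng = random.Random(seed)
    gain = 0.0
    for _ in range(reps):
        g = [rng.gauss(0.0, var_g ** 0.5) for _ in range(n_lines)]
        # phenotypic mean over locations: error variance shrinks as var_e/locs
        p1 = [gi + rng.gauss(0.0, (var_e / locs1) ** 0.5) for gi in g]
        keep = sorted(range(n_lines), key=lambda i: -p1[i])[:n_keep]
        p2 = {i: g[i] + rng.gauss(0.0, (var_e / locs2) ** 0.5) for i in keep}
        final = sorted(keep, key=lambda i: -p2[i])[:n_final]
        gain += sum(g[i] for i in final) / n_final
    return gain / reps

# 200 lines at 2 locations, then the best 20 at 10 locations: 600 plots.
two_stage = simulate_gain(200, 2, 20, 10, 3)
```

    Sweeping the allocation (n_lines, locs1, n_keep, locs2) under a fixed plot budget and comparing the resulting gains mirrors the optimization described in the abstract.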

  16. Optimization and Validation of a Plaque Reduction Neutralization Test for the Detection of Neutralizing Antibodies to Four Serotypes of Dengue Virus Used in Support of Dengue Vaccine Development

    PubMed Central

    Timiryasova, Tatyana M.; Bonaparte, Matthew I.; Luo, Ping; Zedar, Rebecca; Hu, Branda T.; Hildreth, Stephen W.

    2013-01-01

    A dengue plaque reduction neutralization test (PRNT) to measure dengue serotype–specific neutralizing antibodies for all four virus serotypes was developed, optimized, and validated in accordance with guidelines for validation of bioanalytical test methods using human serum samples from dengue-infected persons and persons receiving a dengue vaccine candidate. Production and characterization of dengue challenge viruses used in the assay was standardized. Once virus stocks were characterized, the dengue PRNT50 for each of the four serotypes was optimized according to a factorial design of experiments approach for critical test parameters, including days of cell seeding before testing, percentage of overlay carboxymethylcellulose medium, and days of incubation post-infection to generate a robust assay. The PRNT50 was then validated and demonstrated to be suitable to detect and measure dengue serotype-specific neutralizing antibodies in human serum samples with acceptable intra-assay and inter-assay precision, accuracy/dilutability, specificity, and with a lower limit of quantitation of 10. PMID:23458954

  17. Reduced Uncertainties in the Flutter Analysis of the Aerostructures Test Wing

    NASA Technical Reports Server (NTRS)

    Pak, Chan-gi; Lung, Shun-fat

    2010-01-01

    Tuning the finite element model using measured data to minimize the model uncertainties is a challenging task in the area of structural dynamics. A test-validated finite element model can provide a reliable flutter analysis to define the flutter placard speed to which the aircraft can be flown prior to flight flutter testing. Minimizing the difference between numerical and experimental results is a type of optimization problem. Through the use of the National Aeronautics and Space Administration Dryden Flight Research Center's (Edwards, California, USA) multidisciplinary design, analysis, and optimization tool to optimize the objective function and constraints, the mass properties, the natural frequencies, and the mode shapes are matched to the target data and the mass matrix orthogonality is retained. The approach in this study has been applied to minimize the model uncertainties for the structural dynamic model of the aerostructures test wing, which was designed, built, and tested at the National Aeronautics and Space Administration Dryden Flight Research Center. A 25-percent change in flutter speed has been shown after reducing the uncertainties.

  18. Enhancing High Temperature Anode Performance with 2° Anchoring Phases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Robert A.; Sofie, Stephen W.; Amendola, Roberta

    2015-10-01

    Project accomplishments included developing and optimizing strength testing of aluminum titanate (ALT)-doped Ni-YSZ materials and identifying the dopant levels that optimized mechanical strength and enhanced electrochemical performance. We also optimized our ability to fabricate electrolyte-supported button cells with anodes consisting of powders provided by Fuel Cell Energy. In several instances, those anodes were infiltrated with ALT and tested with hydrogen for 30 hours at 800°C at an applied potential of 0.4 V. Our research activities were focused in three areas: 1) mechanical strength testing on as-prepared and reduced nickel-YSZ structures that were either free of a dopant or prepared by mechanically mixing in ALT at various weight percents (up to 10 wt%); 2) 24-hour electrochemical testing of electrolyte-supported cells having Ni/YSZ and Ni/YSZ/ALT anodes, with specific attention focused on modeling degradation rates; and 3) operando EIS and optical testing of both in-house fabricated devices as well as membrane electrode assemblies that were acquired from commercial vendors.

  19. Reduced Uncertainties in the Flutter Analysis of the Aerostructures Test Wing

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi; Lung, Shun Fat

    2011-01-01

    Tuning the finite element model using measured data to minimize the model uncertainties is a challenging task in the area of structural dynamics. A test-validated finite element model can provide a reliable flutter analysis to define the flutter placard speed to which the aircraft can be flown prior to flight flutter testing. Minimizing the difference between numerical and experimental results is a type of optimization problem. Using the National Aeronautics and Space Administration Dryden Flight Research Center's (Edwards, California) multidisciplinary design, analysis, and optimization tool to optimize the objective function and constraints, the mass properties, the natural frequencies, and the mode shapes are matched to the target data, and the mass matrix orthogonality is retained. The approach in this study has been applied to minimize the model uncertainties for the structural dynamic model of the aerostructures test wing, which was designed, built, and tested at the National Aeronautics and Space Administration Dryden Flight Research Center. A 25 percent change in flutter speed has been shown after reducing the uncertainties.

  20. Controlled laboratory testing of arthroscopic shaver systems: do blades, contact pressure, and speed influence their performance?

    PubMed

    Wieser, Karl; Erschbamer, Matthias; Neuhofer, Stefan; Ek, Eugene T; Gerber, Christian; Meyer, Dominik C

    2012-10-01

    The purposes of this study were (1) to establish a reproducible, standardized testing protocol to evaluate the performance of different shaver systems and blades in a controlled, laboratory setting, and (2) to determine the optimal use of different blades with respect to the influence of contact pressure and speed of blade rotation. A holding device was developed for reproducible testing of soft-tissue (tendon and meniscal) resection performance in a submerged environment, after loading of the shaver with interchangeable weights. The Karl Storz Powershaver S2 (Karl Storz, Tuttlingen, Germany), the Stryker Power Shaver System (Stryker, Kalamazoo, MI), and the Dyonics Power Shaver System (Smith & Nephew, Andover, MA) were tested, with different 5.5-mm shaver blades and varied contact pressure and rotation speed. For quality testing, serrated shaver blades were evaluated at 40× image magnification. Overall, more than 150 test cycles were performed. No significant differences could be detected between comparable blade types from different manufacturers. Shavers with a serrated inner blade and smooth outer blade performed significantly better than the standard smooth resectors (P < .001). Teeth on the outer layer of the blade did not lead to any further improvement of resection (P = .482). Optimal contact pressure ranged between 6 and 8 N, and optimal speed was found to be 2,000 to 2,500 rpm. Minimal blunting of the shaver blades occurred after soft-tissue resection; however, with bone resection, progressive blunting of the shaver blades was observed. Arthroscopic shavers can be tested in a controlled setting. The performance of the tested shaver types appears to be fairly independent of the manufacturer. For tendon resection, a smooth outer blade and serrated inner blade were optimal. This is one of the first established independent and quantitative assessments of arthroscopic shaver systems and blades. 
We believe that this study will assist the surgeon in choosing the optimal tool for the desired effect. Copyright © 2012 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  1. On the Optimization of Aerospace Plane Ascent Trajectory

    NASA Astrophysics Data System (ADS)

    Al-Garni, Ahmed; Kassem, Ayman Hamdy

    A hybrid heuristic optimization technique based on genetic algorithms and particle swarm optimization has been developed and tested for trajectory optimization problems with multiple constraints and a multi-objective cost function. The technique is used to calculate control settings for two types of ascending trajectories (constant dynamic pressure and minimum-fuel-minimum-heat) for a two-dimensional model of an aerospace plane. A thorough statistical analysis of the hybrid technique is conducted to compare it with both basic genetic algorithms and particle swarm optimization with respect to convergence and execution time. Genetic algorithm optimization showed better execution time, while particle swarm optimization showed better convergence. The hybrid technique, benefiting from both, showed robust performance, striking a compromise between convergence behavior and execution time.
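The abstract does not spell out how the genetic and particle-swarm operators are combined. One common hybridization runs standard PSO velocity/position updates and injects GA-style random mutation into the swarm each generation; the sketch below applies it to a toy objective. The function name, bounds, and all parameter values are illustrative assumptions, not the authors' implementation.

```python
import random

def hybrid_ga_pso(f, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5,
                  mut_rate=0.1, seed=0):
    """Minimize f over [-5, 5]^dim with a PSO swarm refreshed by
    GA-style mutation each generation (one common hybridization)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                # PSO update: inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
                # GA-style mutation keeps diversity in the swarm
                if rng.random() < mut_rate:
                    pos[i][d] = rng.uniform(-5, 5)
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

sphere = lambda x: sum(v * v for v in x)
best, best_val = hybrid_ga_pso(sphere, dim=3)
```

Because personal and global bests are only replaced on improvement, the returned value decreases monotonically even with the disruptive mutation step.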

  2. Design and development of bio-inspired framework for reservoir operation optimization

    NASA Astrophysics Data System (ADS)

    Asvini, M. Sakthi; Amudha, T.

    2017-12-01

    Frameworks for optimal reservoir operation play an important role in the management of water resources and delivery of economic benefits. Effective utilization and conservation of water from reservoirs helps to manage water deficit periods. The main challenge in reservoir optimization is to design operating rules that can be used to inform real-time decisions on reservoir release. We develop a bio-inspired framework for the optimization of reservoir release to satisfy the diverse needs of various stakeholders. In this work, single-objective optimization and multiobjective optimization problems are formulated using an algorithm known as "strawberry optimization" and tested with actual reservoir data. Results indicate that well planned reservoir operations lead to efficient deployment of the reservoir water with the help of optimal release patterns.

  3. Optimal In-Hospital and Discharge Medical Therapy in Acute Coronary Syndromes in Kerala: Results from the Kerala ACS Registry

    PubMed Central

    Huffman, Mark D; Prabhakaran, Dorairaj; Abraham, AK; Krishnan, Mangalath Narayanan; Nambiar, C. Asokan; Mohanan, Padinhare Purayil

    2013-01-01

    Background In-hospital and post-discharge treatment rates for acute coronary syndrome (ACS) remain low in India. However, little is known about the prevalence and predictors of the package of optimal ACS medical care in India. Our objective was to define the prevalence, predictors, and impact of optimal in-hospital and discharge medical therapy in the Kerala ACS Registry of 25,718 admissions. Methods and Results We defined optimal in-hospital ACS medical therapy as receiving the following five medications: aspirin, clopidogrel, heparin, beta-blocker, and statin. We defined optimal discharge ACS medical therapy as receiving all of the above therapies except heparin. Comparisons by optimal vs. non-optimal ACS care were made via Student’s t test for continuous variables and chi-square test for categorical variables. We created random effects logistic regression models to evaluate the association between GRACE risk score variables and optimal in-hospital or discharge medical therapy. Optimal in-hospital and discharge medical care was delivered in 40% and 46% of admissions, respectively. Wide variability in both in-hospital and discharge medical care was present with few hospitals reaching consistently high (>90%) levels. Patients receiving optimal in-hospital medical therapy had an adjusted OR (95%CI)=0.93 (0.71, 1.22) for in-hospital death and an adjusted OR (95%CI)=0.79 (0.63, 0.99) for MACE. Patients who received optimal in-hospital medical care were far more likely to receive optimal discharge care (adjusted OR [95%CI]=10.48 [9.37, 11.72]). Conclusions Strategies to improve in-hospital and discharge medical therapy are needed to improve local process-of-care measures and improve ACS outcomes in Kerala. PMID:23800985

  4. Determination of MLC model parameters for Monaco using commercial diode arrays.

    PubMed

    Kinsella, Paul; Shields, Laura; McCavana, Patrick; McClean, Brendan; Langan, Brian

    2016-07-08

    Multileaf collimators (MLCs) need to be characterized accurately in treatment planning systems to facilitate accurate intensity-modulated radiation therapy (IMRT) and volumetric-modulated arc therapy (VMAT). The aim of this study was to examine the use of MapCHECK 2 and ArcCHECK diode arrays for optimizing MLC parameters in Monaco X-ray voxel Monte Carlo (XVMC) dose calculation algorithm. A series of radiation test beams designed to evaluate MLC model parameters were delivered to MapCHECK 2, ArcCHECK, and EBT3 Gafchromic film for comparison. Initial comparison of the calculated and ArcCHECK-measured dose distributions revealed it was unclear how to change the MLC parameters to gain agreement. This ambiguity arose due to an insufficient sampling of the test field dose distributions and unexpected discrepancies in the open parts of some test fields. Consequently, the XVMC MLC parameters were optimized based on MapCHECK 2 measurements. Gafchromic EBT3 film was used to verify the accuracy of MapCHECK 2 measured dose distributions. It was found that adjustment of the MLC parameters from their default values resulted in improved global gamma analysis pass rates for MapCHECK 2 measurements versus calculated dose. The lowest pass rate of any MLC-modulated test beam improved from 68.5% to 93.5% with 3% and 2 mm gamma criteria. Given the close agreement of the optimized model to both MapCHECK 2 and film, the optimized model was used as a benchmark to highlight the relatively large discrepancies in some of the test field dose distributions found with ArcCHECK. Comparison between the optimized model-calculated dose and ArcCHECK-measured dose resulted in global gamma pass rates which ranged from 70.0%-97.9% for gamma criteria of 3% and 2 mm. The simple square fields yielded high pass rates. The lower gamma pass rates were attributed to the ArcCHECK overestimating the dose in-field for the rectangular test fields whose long axis was parallel to the long axis of the ArcCHECK. 
Considering ArcCHECK measurement issues and the lower gamma pass rates for the MLC-modulated test beams, it was concluded that MapCHECK 2 was a more suitable detector than ArcCHECK for the optimization process. © 2016 The Authors
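The gamma pass rates above combine a dose-difference criterion (3%) with a distance-to-agreement criterion (2 mm). A minimal 1-D sketch of a global gamma comparison on a shared uniform grid looks like the following; this is illustrative only, not the algorithm as implemented in Monaco or the diode-array vendor software.

```python
import math

def gamma_pass_rate(meas, calc, spacing_mm, dd_pct=3.0, dta_mm=2.0):
    """1-D global gamma analysis (illustrative sketch).
    meas/calc: dose samples on the same uniform grid.
    The dose criterion is a percentage of the global maximum dose."""
    dose_tol = dd_pct / 100.0 * max(calc)
    passed = 0
    for i, dm in enumerate(meas):
        # gamma: minimum combined distance/dose metric over all calc points
        gamma = min(
            math.sqrt(((i - j) * spacing_mm / dta_mm) ** 2
                      + ((dm - dc) / dose_tol) ** 2)
            for j, dc in enumerate(calc))
        if gamma <= 1.0:
            passed += 1
    return 100.0 * passed / len(meas)

# identical profiles pass everywhere
profile = [0.0, 0.5, 1.0, 1.0, 0.5, 0.0]
rate = gamma_pass_rate(profile, profile, spacing_mm=1.0)
```

A point passes when some nearby calculated point agrees within the combined dose and distance tolerance (gamma ≤ 1), which is why a uniform in-field dose overestimate, as reported for ArcCHECK, drives pass rates down.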

  5. A Comparison of Trajectory Optimization Methods for the Impulsive Minimum Fuel Rendezvous Problem

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Mailhe, Laurie M.; Guzman, Jose J.

    2002-01-01

    In this paper we present a comparison of optimization approaches to the minimum-fuel rendezvous problem. Both indirect and direct methods are compared for a variety of test cases. The indirect approach is based on primer vector theory. The direct approaches are implemented numerically and include Sequential Quadratic Programming (SQP), Quasi-Newton, Simplex, Genetic Algorithms, and Simulated Annealing. Each method is applied to a variety of test cases including circular-to-circular coplanar orbits, LEO to GEO, and orbit phasing in highly elliptic orbits. We also compare different constrained optimization routines on complex orbit rendezvous problems with complicated, highly nonlinear constraints.
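For the circular-to-circular coplanar case, the classical two-impulse Hohmann transfer gives the analytic minimum-fuel baseline that such optimizers should recover. A quick sketch using the standard vis-viva relations; the radii in the example are typical LEO/GEO values chosen for illustration, not taken from the paper.

```python
import math

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def hohmann_dv(r1_km, r2_km):
    """Total delta-v (km/s) for a two-impulse Hohmann transfer
    between coplanar circular orbits of radii r1_km and r2_km."""
    a_t = 0.5 * (r1_km + r2_km)           # transfer ellipse semi-major axis
    v1 = math.sqrt(MU / r1_km)            # circular speed at departure
    v2 = math.sqrt(MU / r2_km)            # circular speed at arrival
    v_peri = math.sqrt(MU * (2.0 / r1_km - 1.0 / a_t))  # vis-viva at perigee
    v_apo = math.sqrt(MU * (2.0 / r2_km - 1.0 / a_t))   # vis-viva at apogee
    return (v_peri - v1) + (v2 - v_apo)

# LEO (about 6678 km radius) to GEO (42164 km): roughly 3.9 km/s total
dv = hohmann_dv(6678.0, 42164.0)
```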

  6. Sculpt test problem analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sweetser, John David

    2013-10-01

    This report details Sculpt's implementation from a user's perspective. Sculpt is an automatic hexahedral mesh generation tool developed at Sandia National Labs by Steve Owen. Fifty-four predetermined test cases are studied while varying the input parameters (Laplace iterations, optimization iterations, optimization threshold, number of processors) and measuring the quality of the resultant mesh. This information is used to determine the optimal input parameters to use for an unknown input geometry. The overall characteristics are covered in Chapter 1. The specific details of every case are then given in Appendix A. Finally, example Sculpt inputs are given in B.1 and B.2.

  7. Experimental study of optimal self compacting concrete with spent foundry sand as partial replacement for M-sand using Taguchi approach

    NASA Astrophysics Data System (ADS)

    Nirmala, D. B.; Raviraj, S.

    2016-06-01

    This paper presents the application of the Taguchi approach to obtain an optimal mix proportion for Self Compacting Concrete (SCC) containing spent foundry sand and M-sand. Spent foundry sand is used as a partial replacement for M-sand. The SCC mix has seven control factors, namely coarse aggregate, M-sand with spent foundry sand, cement, fly ash, water, superplasticizer, and viscosity modifying agent. The modified Nan Su method is used to proportion the initial SCC mix. An L18 (2¹×3⁷) orthogonal array (OA), with the seven control factors at three levels, is used in the Taguchi approach, resulting in 18 SCC mix proportions. All mixtures are extensively tested in both fresh and hardened states to verify whether they meet the practical and technical requirements of SCC. The "nominal the better" quality characteristic is applied to the test results to arrive at the optimal SCC mix proportion. Test results indicate that the optimal mix satisfies the requirements of fresh and hardened properties of SCC. The study reveals the feasibility of using spent foundry sand as a partial replacement for M-sand in SCC, and also that the Taguchi method is a reliable tool for arriving at an optimal SCC mix proportion.
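The "nominal the better" criterion is conventionally scored with a Taguchi signal-to-noise (S/N) ratio, one common form being SN = 10·log10(mean²/variance): higher S/N means the response sits near its target with little scatter. A sketch of that scoring; the response values below are invented illustrations, not the study's measurements.

```python
import math

def sn_nominal_the_better(values):
    """Taguchi S/N ratio for the 'nominal the better' case, using the
    common form SN = 10*log10(mean^2 / variance). Higher is better."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    return 10.0 * math.log10(mean * mean / var)

# e.g. hypothetical slump-flow readings (mm) from repeated trials of a mix
tight = sn_nominal_the_better([680.0, 685.0, 682.0])   # low scatter
loose = sn_nominal_the_better([650.0, 700.0, 710.0])   # high scatter
```

In a Taguchi study, the factor levels that maximize the mean S/N ratio across the orthogonal-array runs define the optimal mix.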

  8. The Relationship Between Heart Rate Reserve and Oxygen Uptake Reserve in Heart Failure Patients on Optimized and Non-Optimized Beta-Blocker Therapy

    PubMed Central

    Carvalho, Vitor Oliveira; Guimarães, Guilherme Veiga; Bocchi, Edimar Alcides

    2008-01-01

    BACKGROUND The relationship between the percentage of oxygen consumption reserve and percentage of heart rate reserve in heart failure patients either on non-optimized or off beta-blocker therapy is known to be unreliable. The aim of this study was to evaluate the relationship between the percentage of oxygen consumption reserve and the percentage of heart rate reserve in heart failure patients receiving optimized and non-optimized beta-blocker treatment during a treadmill cardiopulmonary exercise test. METHODS A total of 27 sedentary heart failure patients (86% male, 50±12 years) on optimized beta-blocker therapy with a left ventricle ejection fraction of 33±8% and 35 sedentary non-optimized heart failure patients (75% male, 47±10 years) with a left ventricle ejection fraction of 30±10% underwent the treadmill cardiopulmonary exercise test (Naughton protocol). Resting and peak effort values of both the percentage of oxygen consumption reserve and the percentage of heart rate reserve were, by definition, 0 and 100, respectively. RESULTS The heart rate slope for the non-optimized group was derived from the points 0.949±0.088 (0 intercept) and 1.055±0.128 (1 intercept), p<0.0001. The heart rate slope for the optimized group was derived from the points 1.026±0.108 (0 intercept) and 1.012±0.108 (1 intercept), p=0.47. Linear regression plots of the heart rate slope for each patient revealed a slope of 0.986 (almost perfect) for the optimized group, but a slope of 0.030 for the non-optimized group (far from the perfect value of 1). CONCLUSION The relationship between the percentage of oxygen consumption reserve and the percentage of heart rate reserve was reliable in patients on optimized beta-blocker therapy but unreliable in non-optimized heart failure patients. PMID:19060991
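Both reserve percentages map resting and peak values to 0 and 100 by definition, so a reliable relationship between them is a 1:1 line. A small sketch of that normalization; the readings in the example are hypothetical, not patient data from the study.

```python
def pct_reserve(value, rest, peak):
    """Percentage of reserve: 0 at rest and 100 at peak effort by
    definition; used identically for heart rate reserve (%HRR) and
    oxygen consumption reserve (%VO2R)."""
    return 100.0 * (value - rest) / (peak - rest)

# hypothetical submaximal reading for one patient
hrr = pct_reserve(110.0, rest=70.0, peak=150.0)    # %HRR from HR in bpm
vo2r = pct_reserve(14.0, rest=3.5, peak=24.5)      # %VO2R from VO2 in mL/kg/min
```

When %HRR tracks %VO2R one-to-one across submaximal workloads, as in the optimized group's slope of 0.986, heart rate can be used to prescribe exercise intensity in place of measured oxygen uptake.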

  9. Fully optimized shaped pupils: preparation for a test at the Subaru Telescope

    NASA Astrophysics Data System (ADS)

    Carlotti, Alexis; Kasdin, N. Jeremy; Martinache, Frantz; Vanderbei, Robert J.; Young, Elizabeth J.; Che, George; Groff, Tyler D.; Guyon, Olivier

    2012-09-01

    The SCExAO instrument at the Subaru Telescope, mainly based on a PIAA coronagraph, can benefit from the addition of a robust and simple shaped pupil coronagraph. New shaped pupils, fully optimized in two dimensions, make it possible to design optimal apodizers for arbitrarily complex apertures, for instance on-axis telescopes such as the Subaru Telescope. We have designed several masks with inner working angles as small as 2.5 λ/D and with high-contrast regions of different shapes. Using Princeton University nanofabrication facilities, we have manufactured two masks by photolithography. These masks have been tested in the laboratory, both in Princeton and in the facilities of the National Astronomical Observatory of Japan (NAOJ) in Hilo. The goal of this work is to prepare on-sky tests of a shaped pupil coronagraph in 2012.

  10. Effect of a road safety training program on drivers' comparative optimism.

    PubMed

    Perrissol, Stéphane; Smeding, Annique; Laumond, Francis; Le Floch, Valérie

    2011-01-01

    Reducing comparative optimism regarding risk perceptions in traffic accidents has been proven to be particularly difficult (Delhomme, 2000). This is unfortunate because comparative optimism is assumed to impede preventive action. The present study tested whether a road safety training course could reduce drivers' comparative optimism in high control situations. Results show that the training course efficiently reduced comparative optimism in high control, but not in low control situations. Mechanisms underlying this finding and implications for the design of road safety training courses are discussed. Copyright © 2010 Elsevier Ltd. All rights reserved.

  11. Optimization of wood plastic composite decks

    NASA Astrophysics Data System (ADS)

    Ravivarman, S.; Venkatesh, G. S.; Karmarkar, A.; Shivkumar N., D.; Abhilash R., M.

    2018-04-01

    Wood Plastic Composite (WPC) is a new class of natural fibre based composite material that contains plastic matrix reinforced with wood fibres or wood flour. In the present work, Wood Plastic Composite was prepared with 70-wt% of wood flour reinforced in polypropylene matrix. Mechanical characterization of the composite was done by carrying out laboratory tests such as tensile test and flexural test as per the American Society for Testing and Materials (ASTM) standards. Computer Aided Design (CAD) model of the laboratory test specimen (tensile test) was created and explicit finite element analysis was carried out on the finite element model in non-linear Explicit FE code LS - DYNA. The piecewise linear plasticity (MAT 24) material model was identified as a suitable model in LS-DYNA material library, describing the material behavior of the developed composite. The composite structures for decking application in construction industry were then optimized for cross sectional area and distance between two successive supports (span length) by carrying out various numerical experiments in LS-DYNA. The optimized WPC deck (Elliptical channel-2 E10) has 45% reduced weight than the baseline model (solid cross-section) considered in this study with the load carrying capacity meeting acceptance criterion (allowable deflection & stress) for outdoor decking application.
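The deck optimization screens candidate cross-sections and span lengths against allowable deflection and stress. As a hand-calculation stand-in (not the paper's LS-DYNA model), the classic simply supported beam formula δ = 5wL⁴/(384EI) with an assumed span/360 serviceability limit illustrates such an acceptance check; every number below is invented for illustration.

```python
def max_deflection_udl(w_n_per_mm, span_mm, e_mpa, i_mm4):
    """Midspan deflection (mm) of a simply supported beam under a
    uniformly distributed load: delta = 5*w*L^4 / (384*E*I)."""
    return 5.0 * w_n_per_mm * span_mm ** 4 / (384.0 * e_mpa * i_mm4)

def passes_deflection_limit(delta_mm, span_mm, limit_ratio=360.0):
    """Common serviceability criterion: deflection <= span / 360.
    (An assumed limit; the paper's acceptance values are not given.)"""
    return delta_mm <= span_mm / limit_ratio

# hypothetical deck board: 2 N/mm load, 400 mm span, E = 3000 MPa, I = 2e6 mm^4
delta = max_deflection_udl(w_n_per_mm=2.0, span_mm=400.0,
                           e_mpa=3000.0, i_mm4=2.0e6)
ok = passes_deflection_limit(delta, 400.0)
```

Reducing section area lowers I and raises δ, which is why the optimized elliptical-channel section must be re-checked against the limit at each candidate span.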

  12. Dispositional optimism, self-framing and medical decision-making.

    PubMed

    Zhao, Xu; Huang, Chunlei; Li, Xuesong; Zhao, Xin; Peng, Jiaxi

    2015-03-01

    Self-framing is an important but underinvestigated area in risk communication and behavioural decision-making, especially in medical settings. The present study aimed to investigate the relationship among dispositional optimism, self-frame and decision-making. Participants (N = 500) responded to the Life Orientation Test-Revised and self-framing test of medical decision-making problem. The participants whose scores were higher than the middle value were regarded as highly optimistic individuals. The rest were regarded as low optimistic individuals. The results showed that compared to the high dispositional optimism group, participants from the low dispositional optimism group showed a greater tendency to use negative vocabulary to construct their self-frame, and tended to choose the radiation therapy with high treatment survival rate, but low 5-year survival rate. Based on the current findings, it can be concluded that self-framing effect still exists in medical situation and individual differences in dispositional optimism can influence the processing of information in a framed decision task, as well as risky decision-making. © 2014 International Union of Psychological Science.

  13. Multidisciplinary Optimization of a Transport Aircraft Wing using Particle Swarm Optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Venter, Gerhard

    2002-01-01

    The purpose of this paper is to demonstrate the application of particle swarm optimization to a realistic multidisciplinary optimization test problem. The paper's new contributions to multidisciplinary optimization are the application of a new algorithm for dealing with the unique challenges associated with multidisciplinary optimization problems, and recommendations as to the utility of the algorithm in future multidisciplinary optimization applications. The selected example is a bi-level optimization problem that demonstrates severe numerical noise and has a combination of continuous and truly discrete design variables. The use of traditional gradient-based optimization algorithms is thus not practical. The numerical results presented indicate that the particle swarm optimization algorithm is able to reliably find the optimum design for the problem presented here. The algorithm is capable of dealing with the unique challenges posed by multidisciplinary optimization as well as the numerical noise and truly discrete variables present in the current example problem.
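The abstract highlights truly discrete design variables as a challenge gradient-based methods cannot handle. One simple way a particle swarm copes is to round flagged dimensions to integers before each evaluation; the sketch below uses that device on a toy mixed-variable problem. This is an illustrative assumption, not the authors' exact scheme.

```python
import random

def pso_mixed(f, bounds, discrete, n=20, iters=150,
              w=0.72, c1=1.49, c2=1.49, seed=1):
    """Minimize f with a basic particle swarm; dimensions flagged in
    `discrete` are rounded to integers before evaluation."""
    rng = random.Random(seed)
    dim = len(bounds)
    def snap(p):  # round discrete dims before evaluating
        return [round(v) if discrete[d] else v for d, v in enumerate(p)]
    pos = [[rng.uniform(*bounds[d]) for d in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(snap(p)) for p in pos]
    g = min(range(n), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                # clamp to the design-variable bounds
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            val = f(snap(pos[i]))
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return snap(gbest), gbest_val

# continuous x in [-10, 10], discrete k in [0, 10]; optimum at x=2.5, k=3
obj = lambda p: (p[0] - 2.5) ** 2 + (p[1] - 3) ** 2
best, val = pso_mixed(obj, bounds=[(-10, 10), (0, 10)],
                      discrete=[False, True])
```

Because the swarm only compares objective values, the rounding step costs nothing extra, which is what makes the method attractive for noisy, mixed-variable MDO problems.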

  14. Determination of the optimal mesh parameters for Iguassu centrifuge flow and separation calculations

    NASA Astrophysics Data System (ADS)

    Romanihin, S. M.; Tronin, I. V.

    2016-09-01

    We present the method and results of determining optimal computational mesh parameters for axisymmetric modeling of flow and separation in the Iguassu gas centrifuge. The aim of this work was to determine mesh parameters that provide relatively low computational cost without loss of accuracy. We use a direct search optimization algorithm to calculate optimal mesh parameters. The obtained parameters were tested by calculating the optimal working regime of the Iguassu GC. The separative power calculated using the optimal mesh parameters differs by less than 0.5% from the result obtained on the detailed mesh. The presented method can be used to determine optimal mesh parameters for the Iguassu GC at different rotor speeds.

  15. Testing the Limits of Optimizing Dual-Task Performance in Younger and Older Adults

    PubMed Central

    Strobach, Tilo; Frensch, Peter; Müller, Herrmann Josef; Schubert, Torsten

    2012-01-01

    Impaired dual-task performance in younger and older adults can be improved with practice. Optimal conditions even allow for a (near) elimination of this impairment in younger adults. However, it is unknown whether such (near) elimination is the limit of performance improvements in older adults. The present study tests this limit in older adults under conditions of (a) a high amount of dual-task training and (b) training with simplified component tasks in dual-task situations. The data showed that a high amount of dual-task training in older adults provided no evidence for an improvement of dual-task performance to the optimal dual-task performance level achieved by younger adults. However, training with simplified component tasks in dual-task situations exclusively in older adults provided a similar level of optimal dual-task performance in both age groups. Therefore through applying a testing the limits approach, we demonstrated that older adults improved dual-task performance to the same level as younger adults at the end of training under very specific conditions. PMID:22408613

  16. Extensions of D-optimal Minimal Designs for Symmetric Mixture Models.

    PubMed

    Li, Yanyan; Raghavarao, Damaraju; Chervoneva, Inna

    2017-01-01

    The purpose of mixture experiments is to explore the optimum blends of mixture components, which will provide desirable response characteristics in finished products. D-optimal minimal designs have been considered for a variety of mixture models, including Scheffé's linear, quadratic, and cubic models. Usually, these D-optimal designs are minimally supported since they have just as many design points as the number of parameters. Thus, they lack the degrees of freedom to perform Lack of Fit tests. Moreover, the majority of the design points in D-optimal minimal designs are on the boundary: vertices, edges, or faces of the design simplex. In this work, a new strategy for adding multiple interior points for symmetric mixture models is proposed. We compare the proposed designs with Cornell's (1986) two ten-point designs for the Lack of Fit test by simulations.

  17. Using genetic algorithms to determine near-optimal pricing, investment and operating strategies in the electric power industry

    NASA Astrophysics Data System (ADS)

    Wu, Dongjun

    Network industries have technologies characterized by a spatial hierarchy, the "network," with capital-intensive interconnections and time-dependent, capacity-limited flows of products and services through the network to customers. This dissertation studies service pricing, investment, and business operating strategies for the electric power network. First-best solutions for a variety of pricing and investment problems have been studied. Genetic algorithms (GA, methods based on the idea of natural evolution) have been evaluated as a primary means of solving complicated network problems, with respect to pricing as well as investment and other operating decisions. New constraint-handling techniques in GAs have been studied and tested. The actual application of such constraint-handling techniques in solving practical non-linear optimization problems has been tested on several complex network design problems, with encouraging initial results. Genetic algorithms provide solutions that are feasible and close to optimal when the optimal solution is known; in some instances, the near-optimal solutions for small problems obtained by the proposed GA approach can only be tested by pushing the limits of currently available non-linear optimization software. The performance is far better than that of several commercially available GA programs, which are generally inadequate for solving any of the problems studied in this dissertation, primarily because of their poor handling of constraints. Genetic algorithms, if carefully designed, seem very promising for solving difficult problems that are intractable by traditional analytic methods.
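The dissertation's constraint-handling techniques are not detailed in the abstract; the simplest baseline such techniques improve upon is a static penalty added to the objective for each violated constraint. A tiny real-coded GA with such a penalty, on a toy constrained problem; all names and parameter values are illustrative assumptions, not the dissertation's method.

```python
import random

def penalized(f, constraints, x, penalty=1e3):
    """Static-penalty fitness: add penalty * violation for each
    constraint of the form g(x) <= 0 that is violated."""
    return f(x) + penalty * sum(max(0.0, g(x)) for g in constraints)

def ga_minimize(f, constraints, lo, hi, dim, pop=40, gens=120, seed=2):
    """Tiny real-coded GA: tournament selection, blend crossover,
    gaussian mutation, elitism; minimizes the penalized objective."""
    rng = random.Random(seed)
    fit = lambda x: penalized(f, constraints, x)
    P = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=fit)
        nxt = P[:2]  # elitism: carry the two best forward
        while len(nxt) < pop:
            a, b = (min(rng.sample(P, 3), key=fit) for _ in range(2))
            child = [min(max(0.5 * (x + y) + rng.gauss(0, 0.1), lo), hi)
                     for x, y in zip(a, b)]
            nxt.append(child)
        P = nxt
    best = min(P, key=fit)
    return best, fit(best)

# minimize x^2 + y^2 subject to x + y >= 1 (i.e. 1 - x - y <= 0);
# the constrained optimum is x = y = 0.5 with objective 0.5
f = lambda x: x[0] ** 2 + x[1] ** 2
g = [lambda x: 1.0 - x[0] - x[1]]
best, val = ga_minimize(f, g, lo=-2.0, hi=2.0, dim=2)
```

Static penalties are known to be sensitive to the penalty weight, which is one reason more sophisticated constraint-handling schemes for GAs are an active research topic.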

  18. A Simultaneous Approach to Optimizing Treatment Assignments with Mastery Scores. Research Report 89-5.

    ERIC Educational Resources Information Center

    Vos, Hans J.

    An approach to simultaneous optimization of assignments of subjects to treatments followed by an end-of-mastery test is presented using the framework of Bayesian decision theory. Focus is on demonstrating how rules for the simultaneous optimization of sequences of decisions can be found. The main advantages of the simultaneous approach, compared…

  19. Testing Optimal Foraging Theory Using Bird Predation on Goldenrod Galls

    ERIC Educational Resources Information Center

    Yahnke, Christopher J.

    2006-01-01

    All animals must make choices regarding what foods to eat, where to eat, and how much time to spend feeding. Optimal foraging theory explains these behaviors in terms of costs and benefits. This laboratory exercise focuses on optimal foraging theory by investigating the winter feeding behavior of birds on the goldenrod gall fly by comparing…

  20. High Temperature Tribometer. Phase 1

    DTIC Science & Technology

    1989-06-01

    Figure 2.3.2: Setpoint and Gain Windows in FW.EXE. Figure 2.4.1: Data-Flow Diagram for Data-Acquisition Module. ... mounted in a friction force measuring device. Optimally, material testing results should not be test-machine sensitive, but due to equipment variables ... fixed. The friction force due to sliding should be continuously measured. This is optimally done in conjunction with the normal force measurement via

  1. Meeting the challenges with the Douglas Aircraft Company Aeroelastic Design Optimization Program (ADOP)

    NASA Technical Reports Server (NTRS)

    Rommel, Bruce A.

    1989-01-01

    An overview of the Aeroelastic Design Optimization Program (ADOP) at the Douglas Aircraft Company is given. A pilot test program involving the animation of mode shapes with solid rendering as well as wire frame displays, a complete aircraft model of a high-altitude hypersonic aircraft to test ADOP procedures, a flap model, and an aero-mesh modeler for doublet lattice aerodynamics are discussed.

  2. Multiobjective generalized extremal optimization algorithm for simulation of daylight illuminants

    NASA Astrophysics Data System (ADS)

    Kumar, Srividya Ravindra; Kurian, Ciji Pearl; Gomes-Borges, Marcos Eduardo

    2017-10-01

    Daylight illuminants are widely used as references for color quality testing and optical vision testing applications. Presently used daylight simulators make use of fluorescent bulbs that are not tunable and occupy more space inside the quality testing chambers. By designing a spectrally tunable LED light source with an optimal number of LEDs, cost, space, and energy can be saved. This paper describes an application of the generalized extremal optimization (GEO) algorithm for selection of the appropriate quantity and quality of LEDs that compose the light source. The multiobjective approach of this algorithm tries to get the best spectral simulation with minimum fitness error toward the target spectrum, correlated color temperature (CCT) the same as the target spectrum, high color rendering index (CRI), and luminous flux as required for testing applications. GEO is a global search algorithm based on phenomena of natural evolution and is especially designed to be used in complex optimization problems. Several simulations have been conducted to validate the performance of the algorithm. The methodology applied to model the LEDs, together with the theoretical basis for CCT and CRI calculation, is presented in this paper. A comparative result analysis of M-GEO evolutionary algorithm with the Levenberg-Marquardt conventional deterministic algorithm is also presented.

  3. Efficiency Improvements to the Displacement Based Multilevel Structural Optimization Algorithm

    NASA Technical Reports Server (NTRS)

    Plunkett, C. L.; Striz, A. G.; Sobieszczanski-Sobieski, J.

    2001-01-01

    Multilevel Structural Optimization (MSO) continues to be an area of research interest in engineering optimization. In the present project, the weight optimization of beams and trusses using Displacement based Multilevel Structural Optimization (DMSO), a member of the MSO set of methodologies, is investigated. In the DMSO approach, the optimization task is subdivided into a single system and multiple subsystems level optimizations. The system level optimization minimizes the load unbalance resulting from the use of displacement functions to approximate the structural displacements. The function coefficients are then the design variables. Alternately, the system level optimization can be solved using the displacements themselves as design variables, as was shown in previous research. Both approaches ensure that the calculated loads match the applied loads. In the subsystems level, the weight of the structure is minimized using the element dimensions as design variables. The approach is expected to be very efficient for large structures, since parallel computing can be utilized in the different levels of the problem. In this paper, the method is applied to a one-dimensional beam and a large three-dimensional truss. The beam was tested to study possible simplifications to the system level optimization. In previous research, polynomials were used to approximate the global nodal displacements. The number of coefficients of the polynomials equally matched the number of degrees of freedom of the problem. Here it was desired to see if it is possible to only match a subset of the degrees of freedom in the system level. This would lead to a simplification of the system level, with a resulting increase in overall efficiency. However, the methods tested for this type of system level simplification did not yield positive results. The large truss was utilized to test further improvements in the efficiency of DMSO. 
In previous work, parallel processing was applied to the subsystems level, where the derivative verification feature of the optimizer NPSOL had been utilized in the optimizations. This resulted in large runtimes. In this paper, the optimizations were repeated without using the derivative verification, and the results are compared to those from the previous work. Also, the optimizations were run both on a network of SUN workstations using the MPICH implementation of the Message Passing Interface (MPI) and on the faster Beowulf cluster at ICASE, NASA Langley Research Center, using the LAM implementation of MPI. The results on both systems were consistent and showed that it is not necessary to verify the derivatives and that this gives a large increase in efficiency of the DMSO algorithm.

  4. Time Scale Optimization and the Hunt for Astronomical Cycles in Deep Time Strata

    NASA Astrophysics Data System (ADS)

    Meyers, Stephen R.

    2016-04-01

    A valuable attribute of astrochronology is the direct link between chronometer and climate change, providing a remarkable opportunity to constrain the evolution of the surficial Earth System. Consequently, the hunt for astronomical cycles in strata has spurred the development of a rich conceptual framework for climatic/oceanographic change, and has allowed exploration of the geologic record with unprecedented temporal resolution. Accompanying these successes, however, has been a persistent skepticism about appropriate astrochronologic testing and circular reasoning: how does one reliably test for astronomical cycles in stratigraphic data, especially when time is poorly constrained? From this perspective, it would seem that the merits and promise of astrochronology (e.g., a geologic time scale measured in ≤400 kyr increments) also serves as its Achilles heel, if the confirmation of such short rhythms defies rigorous statistical testing. To address these statistical challenges in astrochronologic testing, a new approach has been developed that (1) explicitly evaluates time scale uncertainty, (2) is resilient to common problems associated with spectrum confidence level assessment and 'multiple testing', and (3) achieves high statistical power under a wide range of conditions (it can identify astronomical cycles when present in data). Designated TimeOpt (for "time scale optimization"; Meyers 2015), the method employs a probabilistic linear regression model framework to investigate amplitude modulation and frequency ratios (bundling) in stratigraphic data, while simultaneously determining the optimal time scale. This presentation will review the TimeOpt method, and demonstrate how the flexible statistical framework can be further extended to evaluate (and optimize upon) complex sedimentation rate models, enhancing the statistical power of the approach, and addressing the challenge of unsteady sedimentation. Meyers, S. R. 
(2015), The evaluation of eccentricity-related amplitude modulation and bundling in paleoclimate data: An inverse approach for astrochronologic testing and time scale optimization, Paleoceanography, 30, doi:10.1002/2015PA002850.
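
    The heart of the TimeOpt strategy, scoring candidate sedimentation rates by how strongly the target astronomical period emerges once depth is converted to time, can be sketched in toy form. This is an illustrative stand-in for the published method, not Meyers' actual regression model; the 405-kyr cycle, the depth series, and the candidate rates below are invented:

```python
import math

def fourier_power(series, times, period):
    """Squared amplitude of the sinusoid of a given period, via direct
    projection onto a sine/cosine pair (a toy stand-in for TimeOpt's
    regression step)."""
    s = sum(y * math.sin(2 * math.pi * t / period)
            for y, t in zip(series, times))
    c = sum(y * math.cos(2 * math.pi * t / period)
            for y, t in zip(series, times))
    return (s * s + c * c) / len(series) ** 2

def best_rate(series, depths, period, candidate_rates):
    """Pick the sedimentation rate whose depth-to-time conversion makes
    the target astronomical period strongest in the data."""
    return max(candidate_rates,
               key=lambda r: fourier_power(series,
                                           [d / r for d in depths], period))

# Synthetic series: a 405-kyr eccentricity cycle recorded at 2 cm/kyr.
depths = [i * 5.0 for i in range(800)]        # depth in cm
true_times = [d / 2.0 for d in depths]        # kyr at the true rate
series = [math.sin(2 * math.pi * t / 405.0) for t in true_times]

rate = best_rate(series, depths, 405.0, [1.0, 1.5, 2.0, 2.5, 3.0])
```

    With enough recorded cycles, a wrong rate stretches the data away from the target period and its projected power collapses, so the scan recovers the true rate.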

  5. Investigation of optimized graded concrete for Oklahoma.

    DOT National Transportation Integrated Search

    2013-07-01

    This report presents the results of several novel test methods to investigate concrete for slip formed paving. These tests include the Box Test, a novel test to evaluate the response of concrete to vibration, the AIMS2, an automated test for aggregat...

  6. Identification of vehicle suspension parameters by design optimization

    NASA Astrophysics Data System (ADS)

    Tey, J. Y.; Ramli, R.; Kheng, C. W.; Chong, S. Y.; Abidin, M. A. Z.

    2014-05-01

    The design of a vehicle suspension system through simulation requires accurate representation of the design parameters. These parameters are usually difficult to measure or sometimes unavailable. This article proposes an efficient approach to identify the unknown parameters through optimization based on experimental results, where the covariance matrix adaptation evolution strategy (CMA-ES) is utilized to improve the agreement between simulation and experimental results for the kinematic and compliance tests. This speeds up the design and development cycle by recovering all the unknown data with respect to a set of kinematic measurements through a single optimization process. A case study employing a McPherson strut suspension system is modelled in a multi-body dynamic system. Three kinematic and compliance tests are examined, namely, vertical parallel wheel travel, opposite wheel travel and single wheel travel. The problem is formulated as a multi-objective optimization problem with 40 objectives and 49 design parameters. A hierarchical clustering method based on global sensitivity analysis is used to reduce the number of objectives to 30 by grouping correlated objectives together. Then, a dynamic summation of rank value is used as a pseudo-objective function to reformulate the multi-objective optimization as a single-objective optimization problem. The optimized results show a significant improvement in the correlation between the simulated model and the experimental model. Once accurate representation of the vehicle suspension model is achieved, further analysis, such as ride and handling performances, can be implemented for further optimization.
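
    The identification loop described above can be sketched with a population-free evolution strategy standing in for the full CMA-ES. The (1+1)-ES with the 1/5th-success rule is a deliberate simplification, and the "strut parameters" and target values below are hypothetical:

```python
import random

def one_plus_one_es(loss, x0, sigma=2.0, iters=3000, seed=1):
    """(1+1)-evolution strategy with 1/5th-success-rule step adaptation:
    mutate, keep improvements, grow/shrink the step size by success rate.
    A minimal stand-in for the paper's CMA-ES."""
    rng = random.Random(seed)
    x, fx = list(x0), loss(x0)
    for _ in range(iters):
        cand = [xi + sigma * rng.gauss(0.0, 1.0) for xi in x]
        fc = loss(cand)
        if fc <= fx:
            x, fx = cand, fc
            sigma *= 1.22      # expand step size on success
        else:
            sigma *= 0.95      # contract on failure
    return x, fx

# Hypothetical identification target: a normalized spring rate and damping
# coefficient of a strut model, fitted to synthetic "measurements".
target = (25.0, 1.2)
def loss(p):
    return (p[0] - target[0]) ** 2 + (p[1] - target[1]) ** 2

best, err = one_plus_one_es(loss, [5.0, 0.1])
```

    In the real problem the loss would be the mismatch between simulated and measured kinematic/compliance curves rather than a quadratic bowl.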

  7. Illness representations as mediators of the relationship between dispositional optimism and depression in patients with chronic tinnitus: a cross-sectional study.

    PubMed

    Vollmann, Manja; Scharloo, Margreet; Langguth, Berthold; Kalkouskaya, Natallia; Salewski, Christel

    2013-01-01

    Both dispositional optimism and illness representations are related to psychological health in chronic patients. In a group of chronic tinnitus sufferers, the interplay between these two variables was examined. Specifically, it was tested to what extent the relationship between dispositional optimism and depression is mediated by more positive illness representations. The study had a cross-sectional design. One hundred and eighteen patients diagnosed with chronic tinnitus completed questionnaires assessing optimism (Life Orientation Test-Revised [LOT-R]), illness representations (Illness Perceptions Questionnaire-Revised [IPQ-R]) and depression (Hospital Anxiety and Depression Scale [HADS]). Correlation analysis showed that optimism was associated with more positive illness representations and lower levels of depression. Simple mediation analyses revealed that the relationship between optimism and depression was partially mediated by the illness representation dimensions consequences, treatment control, coherence, emotional representations and internal causes. A multiple mediation analysis indicated that the total mediation effect of illness representations is particularly due to the dimension consequences. Optimism influences depression in tinnitus patients both directly and indirectly. The indirect effect indicates that optimism is associated with more positive tinnitus-specific illness representations which, in turn, are related to less depression. These findings contribute to a better understanding of the interplay between generalised expectancies, illness-specific perceptions and psychological adjustment to medical conditions.

  8. An optimized proportional-derivative controller for the human upper extremity with gravity.

    PubMed

    Jagodnik, Kathleen M; Blana, Dimitra; van den Bogert, Antonie J; Kirsch, Robert F

    2015-10-15

    When Functional Electrical Stimulation (FES) is used to restore movement in subjects with spinal cord injury (SCI), muscle stimulation patterns should be selected to generate accurate and efficient movements. Ideally, the controller for such a neuroprosthesis will have the simplest architecture possible, to facilitate translation into a clinical setting. In this study, we used the simulated annealing algorithm to optimize two proportional-derivative (PD) feedback controller gain sets for a 3-dimensional arm model that includes musculoskeletal dynamics and has 5 degrees of freedom and 22 muscles, performing goal-oriented reaching movements. Controller gains were optimized by minimizing a weighted sum of position errors, orientation errors, and muscle activations. After optimization, gain performance was evaluated on the basis of accuracy and efficiency of reaching movements, along with three other benchmark gain sets not optimized for our system, on a large set of dynamic reaching movements for which the controllers had not been optimized, to test ability to generalize. Robustness in the presence of weakened muscles was also tested. The two optimized gain sets were found to have very similar performance to each other on all metrics, and to exhibit significantly better accuracy, compared with the three standard gain sets. All gain sets investigated used physiologically acceptable amounts of muscular activation. It was concluded that optimization can yield significant improvements in controller performance while still maintaining muscular efficiency, and that optimization should be considered as a strategy for future neuroprosthesis controller design. Published by Elsevier Ltd.
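
    A minimal sketch of the approach, simulated annealing over PD gains for a toy one-degree-of-freedom "arm" (a unit mass lifted against gravity), may help fix ideas. The plant, cost weights and annealing schedule are all assumptions, far simpler than the paper's 5-DOF, 22-muscle model:

```python
import math, random

def simulate(kp, kd, dt=0.01, steps=400):
    """PD-controlled unit mass lifted against gravity to height 1.0.
    Returns (cost, final_position); the cost mixes tracking error and
    control effort, mirroring the position-error + activation objective."""
    x, v, cost = 0.0, 0.0, 0.0
    for _ in range(steps):
        u = kp * (1.0 - x) - kd * v     # PD control law
        a = u - 9.81                    # gravity acts downward
        v += a * dt
        x += v * dt
        cost += ((1.0 - x) ** 2 + 1e-4 * u ** 2) * dt
    return cost, x

def anneal(t0=2.0, iters=2000, seed=3):
    """Simulated annealing over (kp, kd) with a linear cooling schedule."""
    rng = random.Random(seed)
    gains = (20.0, 5.0)
    f = simulate(*gains)[0]
    best, fbest = gains, f
    for i in range(iters):
        t = t0 * (1.0 - i / iters) + 1e-9
        cand = tuple(min(300.0, max(0.1, g + rng.gauss(0.0, 8.0)))
                     for g in gains)
        fc = simulate(*cand)[0]
        # accept improvements always, worse moves with Boltzmann probability
        if fc < f or rng.random() < math.exp((f - fc) / t):
            gains, f = cand, fc
            if f < fbest:
                best, fbest = gains, f
    return best, fbest

best, fbest = anneal()
```

    The weighted-sum objective here is the same shape as in the study (error plus effort); only the plant and the weights are toy choices.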

  9. Multidisciplinary design optimization of vehicle instrument panel based on multi-objective genetic algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Ping; Wu, Guangqiang

    2013-03-01

    A typical multidisciplinary design optimization (MDO) has gradually been proposed to balance the lightweight, noise, vibration and harshness (NVH) and safety performances of the instrument panel (IP) structure in automotive development. Nevertheless, the plastic constitutive relation of polypropylene (PP) under different strain rates has not been taken into consideration in current reliability-based and collaborative IP MDO design. In this paper, based on tensile tests under different strain rates, the constitutive relation of polypropylene material is studied. Impact simulation tests for the head and knee bolster are carried out to meet the regulations of FMVSS 201 and FMVSS 208, respectively. NVH analysis is performed to obtain the natural frequencies and corresponding mode shapes, while crashworthiness analysis is employed to examine the crash behavior of the IP structure. With consideration of lightweight, NVH, and head and knee bolster impact performance, design of experiments (DOE), response surface models (RSM), and collaborative optimization (CO) are applied to realize the deterministic and reliability-based optimizations, respectively. Furthermore, based on a multi-objective genetic algorithm (MOGA), the optimal Pareto sets are computed to solve the multi-objective optimization (MOO) problem. The proposed research ensures the smoothness of the Pareto set, enhances the ability of engineers to make a comprehensive decision about multiple objectives and choose the optimal design, and improves the quality and efficiency of MDO.

  10. Dispositional optimism and coping strategies in patients with a kidney transplant.

    PubMed

    Costa-Requena, Gemma; Cantarell-Aixendri, M Carmen; Parramon-Puig, Gemma; Serón-Micas, Daniel

    2014-01-01

    Dispositional optimism is a personal resource that determines the coping style and adaptive response to chronic diseases. The aim of this study was to assess the correlations between dispositional optimism and coping strategies in patients with recent kidney transplantation and to evaluate differences in the use of coping strategies in accordance with the level of dispositional optimism. Patients who were hospitalised in the nephrology department were selected consecutively after kidney transplantation was performed. The evaluation instruments were the Life Orientation Test-Revised and the Coping Strategies Inventory. The data were analysed with central tendency measures and correlation analyses, and means were compared using Student's t-test. 66 patients with a kidney transplant participated in the study. The coping styles that characterised patients with a recent kidney transplantation were Social withdrawal and Problem avoidance. Correlations between dispositional optimism and coping strategies were significant in a positive direction for Problem-solving (p<.05) and Cognitive restructuring (p<.01), and inversely for Self-criticism (p<.05). Differences in dispositional optimism created significant differences in the Self-criticism dimension (t=2.58; p<.01). Dispositional optimism scores produce differences in coping responses after kidney transplantation. Moreover, coping strategies may influence the patient's perception of emotional wellbeing after kidney transplantation.

  11. Inverse modeling of rainfall infiltration with a dual permeability approach using different matrix-fracture coupling variants.

    NASA Astrophysics Data System (ADS)

    Blöcher, Johanna; Kuraz, Michal

    2017-04-01

    In this contribution we propose implementations of the dual permeability model with different inter-domain exchange descriptions and metaheuristic optimization algorithms for parameter identification and mesh optimization. We compare variants of the coupling term with different numbers of parameters to test if a reduction of parameters is feasible. This can reduce parameter uncertainty in inverse modeling, but also allow for different conceptual models of the domain and matrix coupling. The different variants of the dual permeability model are implemented in the open-source objective library DRUtES written in FORTRAN 2003/2008 in 1D and 2D. For parameter identification we use adaptations of the particle swarm optimization (PSO) and Teaching-learning-based optimization (TLBO), which are population-based metaheuristics with different learning strategies. These are high-level stochastic-based search algorithms that don't require gradient information or a convex search space. Despite increasing computing power and parallel processing, an overly fine mesh is not feasible for parameter identification. This creates the need to find a mesh that optimizes both accuracy and simulation time. We use a bi-objective PSO algorithm to generate a Pareto front of optimal meshes to account for both objectives. The dual permeability model and the optimization algorithms were tested on virtual data and field TDR sensor readings. The TDR sensor readings showed a very steep increase during rapid rainfall events and a subsequent steep decrease. This was theorized to be an effect of artificial macroporous envelopes surrounding TDR sensors creating an anomalous region with distinct local soil hydraulic properties. One of our objectives is to test how well the dual permeability model can describe this infiltration behavior and what coupling term would be most suitable.
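
    A compact sketch of the canonical global-best PSO used for such inverse problems follows. The "forward model" is a made-up infiltration curve, not a dual-permeability solver, and the parameter names are hypothetical:

```python
import random

def pso(loss, bounds, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=7):
    """Canonical global-best particle swarm optimization.
    bounds: list of (lo, hi) per dimension."""
    rng = random.Random(seed)
    dim = len(bounds)
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                    # personal bests
    Pf = [loss(x) for x in X]
    g = min(range(n), key=lambda i: Pf[i])
    G, Gf = P[g][:], Pf[g]                   # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (G[d] - X[i][d]))
                X[i][d] += V[i][d]
            f = loss(X[i])
            if f < Pf[i]:
                P[i], Pf[i] = X[i][:], f
                if f < Gf:
                    G, Gf = X[i][:], f
    return G, Gf

# Toy inverse problem (hypothetical): recover a conductivity Ks and an
# exchange coefficient omega from "observed" cumulative infiltration.
def forward(Ks, omega, t):
    return Ks * t + omega * t ** 0.5         # stand-in, not Richards' eq.

obs = [(t, forward(2.0, 0.8, t)) for t in (0.5, 1.0, 2.0, 4.0)]
loss = lambda p: sum((forward(p[0], p[1], t) - y) ** 2 for t, y in obs)
best, err = pso(loss, [(0.0, 10.0), (0.0, 5.0)])
```

    TLBO differs mainly in how candidate moves are generated (teacher and learner phases instead of velocity updates); the gradient-free, population-based structure is the same.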

  12. Optimal cost design of water distribution networks using a decomposition approach

    NASA Astrophysics Data System (ADS)

    Lee, Ho Min; Yoo, Do Guen; Sadollah, Ali; Kim, Joong Hoon

    2016-12-01

    Water distribution network decomposition, which is an engineering approach, is adopted to increase the efficiency of obtaining the optimal cost design of a water distribution network using an optimization algorithm. This study applied the source tracing tool in EPANET, which is a hydraulic and water quality analysis model, to the decomposition of a network to improve the efficiency of the optimal design process. The proposed approach was tested by carrying out the optimal cost design of two water distribution networks, and the results were compared with other optimal cost designs derived from previously proposed optimization algorithms. The proposed decomposition approach using the source tracing technique enables the efficient decomposition of an actual large-scale network, and the results can be combined with the optimal cost design process using an optimization algorithm. This proves that the final design in this study is better than those obtained with other previously proposed optimization algorithms.

  13. Pareto-optimal multi-objective dimensionality reduction deep auto-encoder for mammography classification.

    PubMed

    Taghanaki, Saeid Asgari; Kawahara, Jeremy; Miles, Brandon; Hamarneh, Ghassan

    2017-07-01

    Feature reduction is an essential stage in computer aided breast cancer diagnosis systems. Multilayer neural networks can be trained to extract relevant features by encoding high-dimensional data into low-dimensional codes. Optimizing traditional auto-encoders works well only if the initial weights are close to a proper solution. They are also trained to only reduce the mean squared reconstruction error (MRE) between the encoder inputs and the decoder outputs, but do not address the classification error. The goal of the current work is to test the hypothesis that extending traditional auto-encoders (which only minimize reconstruction error) to multi-objective optimization for finding Pareto-optimal solutions provides more discriminative features that will improve classification performance when compared to single-objective and other multi-objective approaches (i.e. scalarized and sequential). In this paper, we introduce a novel multi-objective optimization of deep auto-encoder networks, in which the auto-encoder optimizes two objectives: MRE and mean classification error (MCE) for Pareto-optimal solutions, rather than just MRE. These two objectives are optimized simultaneously by a non-dominated sorting genetic algorithm. We tested our method on 949 X-ray mammograms categorized into 12 classes. The results show that the features identified by the proposed algorithm allow a classification accuracy of up to 98.45%, demonstrating favourable accuracy over the results of state-of-the-art methods reported in the literature. We conclude that adding the classification objective to the traditional auto-encoder objective and optimizing for finding Pareto-optimal solutions, using evolutionary multi-objective optimization, results in producing more discriminative features. Copyright © 2017 Elsevier B.V. All rights reserved.
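
    The non-dominated sorting at the core of the genetic algorithm reduces to a simple dominance test. A minimal sketch, with hypothetical (reconstruction-error, classification-error) pairs:

```python
def dominates(a, b):
    """a dominates b (minimization) if a is no worse in every objective
    and strictly better in at least one."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Return the non-dominated subset, the core step of NSGA-style
    sorting of (MRE, MCE) candidate networks."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (reconstruction-error, classification-error) pairs:
pts = [(0.10, 0.30), (0.20, 0.10), (0.15, 0.15), (0.25, 0.25), (0.12, 0.40)]
front = pareto_front(pts)   # → [(0.10, 0.30), (0.20, 0.10), (0.15, 0.15)]
```

    A scalarized approach would instead collapse the two errors into one weighted sum and return a single point rather than this trade-off set.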

  14. Determination of the optimal cutoff value for a serological assay: an example using the Johne's Absorbed EIA.

    PubMed Central

    Ridge, S E; Vizard, A L

    1993-01-01

    Traditionally, in order to improve diagnostic accuracy, existing tests have been replaced with newly developed diagnostic tests with superior sensitivity and specificity. However, it is possible to improve existing tests by altering the cutoff value chosen to distinguish infected individuals from uninfected individuals. This paper uses data obtained from an investigation of the operating characteristics of the Johne's Absorbed EIA to demonstrate a method of determining a preferred cutoff value from several potentially useful cutoff settings. A method of determining the financial gain from using the preferred rather than the current cutoff value and a decision analysis method to assist in determining the optimal cutoff value when critical population parameters are not known with certainty are demonstrated. The results of this study indicate that the currently recommended cutoff value for the Johne's Absorbed EIA is only close to optimal when the disease prevalence is very low and false-positive test results are deemed to be very costly. In other situations, there were considerable financial advantages to using cutoff values calculated to maximize the benefit of testing. It is probable that the current cutoff values for other diagnostic tests may not be the most appropriate for every testing situation. This paper offers methods for identifying the cutoff value that maximizes the benefit of medical and veterinary diagnostic tests. PMID:8501227
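
    The underlying calculation, choosing the cutoff that minimizes the expected cost of false negatives and false positives at a given prevalence, can be sketched directly. The operating points and costs below are hypothetical, not the Johne's Absorbed EIA data:

```python
def best_cutoff(cutoff_stats, prevalence, cost_fn, cost_fp):
    """cutoff_stats: {cutoff: (sensitivity, specificity)}.
    Returns the cutoff minimizing expected per-animal misclassification
    cost, plus the full cost table."""
    costs = {}
    for c, (se, sp) in cutoff_stats.items():
        fn = prevalence * (1 - se)          # infected animals missed
        fp = (1 - prevalence) * (1 - sp)    # healthy animals flagged
        costs[c] = fn * cost_fn + fp * cost_fp
    return min(costs, key=costs.get), costs

# Hypothetical ELISA operating points: sensitivity rises and specificity
# falls as the cutoff is lowered.
stats = {0.10: (0.90, 0.90), 0.20: (0.75, 0.97), 0.30: (0.55, 0.995)}

best_lo, _ = best_cutoff(stats, prevalence=0.01, cost_fn=1.0, cost_fp=5.0)
best_hi, _ = best_cutoff(stats, prevalence=0.30, cost_fn=5.0, cost_fp=1.0)
```

    Consistent with the paper's finding, a high cutoff wins only when prevalence is very low and false positives are costly; at higher prevalence with costly false negatives, a lower cutoff is preferred.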

  15. Performance seeking control program overview

    NASA Technical Reports Server (NTRS)

    Orme, John S.

    1995-01-01

    The Performance Seeking Control (PSC) program evolved from a series of integrated propulsion-flight control research programs flown at NASA Dryden Flight Research Center (DFRC) on an F-15. The first of these was the Digital Electronic Engine Control (DEEC) program and provided digital engine controls suitable for integration. The DEEC and digital electronic flight control system of the NASA F-15 were ideally suited for integrated controls research. The Advanced Engine Control System (ADECS) program proved that integrated engine and aircraft control could improve overall system performance. The objective of the PSC program was to advance the technology for a fully integrated propulsion flight control system. Whereas ADECS provided single variable control for an average engine, PSC controlled multiple propulsion system variables while adapting to the measured engine performance. PSC was developed as a model-based, adaptive control algorithm and included four optimization modes: minimum fuel flow at constant thrust, minimum turbine temperature at constant thrust, maximum thrust, and minimum thrust. Subsonic and supersonic flight testing were conducted at NASA Dryden covering the four PSC optimization modes and over the full throttle range. Flight testing of the PSC algorithm, conducted in a series of five flight test phases, has been concluded at NASA Dryden covering all four of the PSC optimization modes. Over a three year period and five flight test phases 72 research flights were conducted. The primary objective of flight testing was to exercise each PSC optimization mode and quantify the resulting performance improvements.

  16. Optimizing a Test Method to Evaluate Resistance of Pervious Concrete to Cycles of Freezing and Thawing in the Presence of Different Deicing Salts

    PubMed Central

    Tsang, Chehong; Shehata, Medhat H.; Lotfy, Abdurrahmaan

    2016-01-01

    The lack of a standard test method for evaluating the resistance of pervious concrete to cycles of freezing and thawing in the presence of deicing salts is the motive behind this study. Different sample size and geometry, cycle duration, and level of submersion in brine solutions were investigated to achieve an optimized test method. The optimized test method was able to produce different levels of damage when different types of deicing salts were used. The optimized duration of one cycle was found to be 24 h with twelve hours of freezing at −18 °C and twelve hours of thawing at +21 °C, with the bottom 10 mm of the sample submerged in the brine solution. Cylinder samples with a diameter of 100 mm and height of 150 mm were used and found to produce similar results to 150 mm-cubes. Based on the obtained results a mass loss of 3%–5% is proposed as a failure criterion of cylindrical samples. For the materials and within the cycles of freezing/thawing investigated here, the deicers that caused the most damage were NaCl, CaCl2 and urea, followed by MgCl2, potassium acetate, sodium acetate and calcium-magnesium acetate. More testing is needed to validate the effects of different deicers under long term exposures and different temperature ranges. PMID:28773998
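
    The proposed failure criterion is straightforward to encode; a small sketch using the conservative 3% bound (the sample masses are invented):

```python
def mass_loss_percent(m0, m):
    """Percent mass loss of a sample relative to its initial mass m0."""
    return 100.0 * (m0 - m) / m0

def fails(m0, m, threshold=3.0):
    """Proposed criterion: 3%-5% mass loss after the freeze-thaw cycles;
    the conservative 3% bound is used here."""
    return mass_loss_percent(m0, m) >= threshold

# A 2000 g cylinder losing 70 g (3.5%) fails; one losing 40 g (2.0%) passes.
```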

  17. Hull Form Design and Optimization Tool Development

    DTIC Science & Technology

    2012-07-01

    global minimum. The algorithm accomplishes this by using a method known as metaheuristics which allows the algorithm to examine a large area by...further development of these tools including the implementation and testing of a new optimization algorithm, the improvement of a rapid hull form...under the 2012 Naval Research Enterprise Intern Program. Subject terms: hydrodynamic, hull form, generation, optimization, algorithm

  18. Strong stabilization servo controller with optimization of performance criteria.

    PubMed

    Sarjaš, Andrej; Svečko, Rajko; Chowdhury, Amor

    2011-07-01

    Synthesis of a simple robust controller with a pole placement technique and an H∞ metric is the method used for control of a servo mechanism with BLDC and BDC electric motors. The method includes solving a polynomial equation on the basis of the chosen characteristic polynomial using the Manabe standard polynomial form and parametric solutions. Parametric solutions are introduced directly into the structure of the servo controller. On the basis of the chosen parametric solutions, the robustness of the closed-loop system is assessed through uncertainty models and evaluation of the norm ‖·‖∞. The design procedure and the optimization are performed with a genetic algorithm, differential evolution (DE). The DE optimization method determines a suboptimal solution throughout the optimization on the basis of a spectrally square polynomial and Šiljak's absolute stability test. The stability of the designed controller during the optimization is checked with Lipatov's stability condition. Both approaches, Šiljak's test and Lipatov's condition, check the robustness and stability characteristics on the basis of the polynomial's coefficients, and are very convenient for automated design of closed-loop control and for application in optimization algorithms such as DE. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
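
    A bare-bones sketch of the DE/rand/1/bin scheme used for such controller optimization follows; the bowl-shaped surrogate cost stands in for the real closed-loop criteria and stability checks:

```python
import random

def differential_evolution(loss, bounds, np_=20, F=0.8, CR=0.9,
                           gens=150, seed=11):
    """DE/rand/1/bin: mutate with scaled difference vectors, binomial
    crossover, greedy one-to-one selection."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(np_)]
    fit = [loss(x) for x in pop]
    for _ in range(gens):
        for i in range(np_):
            a, b, c = rng.sample([j for j in range(np_) if j != i], 3)
            jrand = rng.randrange(dim)   # force at least one mutated gene
            trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                     if (rng.random() < CR or d == jrand) else pop[i][d]
                     for d in range(dim)]
            trial = [min(hi, max(lo, v))
                     for v, (lo, hi) in zip(trial, bounds)]
            ft = loss(trial)
            if ft <= fit[i]:             # greedy replacement
                pop[i], fit[i] = trial, ft
    k = min(range(np_), key=lambda i: fit[i])
    return pop[k], fit[k]

# Toy controller-tuning surrogate (hypothetical): minimize a bowl-shaped
# cost over two controller coefficients. A real run would instead return
# a penalized cost whenever Lipatov's stability condition fails.
loss = lambda p: (p[0] - 1.5) ** 2 + (p[1] + 0.5) ** 2
best, err = differential_evolution(loss, [(-5.0, 5.0), (-5.0, 5.0)])
```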

  19. Quad-rotor flight path energy optimization

    NASA Astrophysics Data System (ADS)

    Kemper, Edward

    Quad-rotor unmanned aerial vehicles (UAVs) have been a popular area of research and development in the last decade, especially with the advent of affordable microcontrollers like the MSP 430 and the Raspberry Pi. Path-energy optimization is an area that is well developed for linear systems. In this thesis, the idea of path-energy optimization is extended to the nonlinear model of the quad-rotor UAV. The classical optimization technique is adapted to the nonlinear model derived for the problem at hand, yielding a set of partial differential equations and boundary value conditions to solve. Then, different techniques to implement energy optimization algorithms are tested using simulations in Python. First, a purely nonlinear approach is used. This method is shown to be computationally intensive, with no practical solution available in a reasonable amount of time. Second, heuristic techniques to minimize the energy of the flight path are tested, using Ziegler-Nichols' proportional integral derivative (PID) controller tuning technique. Finally, a brute-force look-up-table-based PID controller is used. Simulation results of the heuristic method show that both reliable control of the system and path-energy optimization are achieved in a reasonable amount of time.
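
    The Ziegler-Nichols step of the heuristic approach can be shown concretely. The classic closed-loop rule maps the ultimate gain Ku and oscillation period Tu to PID gains; the "some-overshoot" variant shown alongside is one common alternative table, and the Ku, Tu values are invented:

```python
def ziegler_nichols(ku, tu, rule="classic"):
    """Closed-loop Ziegler-Nichols tuning: from the ultimate gain Ku
    (where the loop oscillates steadily) and the oscillation period Tu,
    return (Kp, Ki, Kd) for a parallel-form PID."""
    if rule == "classic":
        kp = 0.6 * ku
        ti, td = tu / 2.0, tu / 8.0
    elif rule == "some-overshoot":
        kp = 0.33 * ku
        ti, td = tu / 2.0, tu / 3.0
    else:
        raise ValueError(rule)
    return kp, kp / ti, kp * td        # Ki = Kp/Ti, Kd = Kp*Td

kp, ki, kd = ziegler_nichols(ku=10.0, tu=2.0)   # → (6.0, 6.0, 1.5)
```

    In the thesis workflow these gains would seed the simulation loop, with Ku and Tu found experimentally by raising the proportional gain until sustained oscillation.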

  20. Effective Condition for Whole Testis Cryopreservation of Endangered Miho Spine Loach (Cobitis choii) Through the Optimization of Mud Loach (Misgurnus mizolepis) Whole Testis Cryopreservation Condition.

    PubMed

    Kim, J J; Nam, Y K; Bang, I C; Gong, S P

      BACKGROUND: Miho spine loach (Cobitis choii) is an endangered Korean endemic fish. Whole testis cryopreservation is a good way to preserve the species, but optimizing the freezing condition requires the sacrifice of a large number of fish. Considering this limitation, a surrogate fish species was used for protocol development. This study aimed to establish an effective condition for Miho spine loach whole testis cryopreservation by optimizing the conditions for whole testis cryopreservation in an allied species, the mud loach (Misgurnus mizolepis). The condition for whole testis cryopreservation was optimized in mud loach first, and the optimal condition was then applied to Miho spine loach testes. The optimal condition for mud loach testis cryopreservation consists of a freezing medium containing 1.3 M dimethyl sulfoxide, 6% fetal bovine serum and 0.3 M trehalose, a cooling rate of −1 °C/min and a thawing temperature of 26 °C, which also permits effective cryopreservation of Miho spine loach testes. An effective cryopreservation condition for whole testis of the endangered Miho spine loach has thus been established by using mud loach as a surrogate fish.

  1. Optimizing α for better statistical decisions: a case study involving the pace-of-life syndrome hypothesis: optimal α levels set to minimize Type I and II errors frequently result in different conclusions from those using α = 0.05.

    PubMed

    Mudge, Joseph F; Penny, Faith M; Houlahan, Jeff E

    2012-12-01

    Setting optimal significance levels that minimize Type I and Type II errors allows for more transparent and well-considered statistical decision making compared to the traditional α = 0.05 significance level. We use the optimal α approach to re-assess conclusions reached by three recently published tests of the pace-of-life syndrome hypothesis, which attempts to unify occurrences of different physiological, behavioral, and life history characteristics under one theory, over different scales of biological organization. While some of the conclusions reached using optimal α were consistent to those previously reported using the traditional α = 0.05 threshold, opposing conclusions were also frequently reached. The optimal α approach reduced probabilities of Type I and Type II errors, and ensured statistical significance was associated with biological relevance. Biologists should seriously consider their choice of α when conducting null hypothesis significance tests, as there are serious disadvantages with consistent reliance on the traditional but arbitrary α = 0.05 significance level. Copyright © 2012 WILEY Periodicals, Inc.
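
    For a one-sided z-test the optimal-α idea can be sketched numerically: scan candidate significance levels and keep the one minimizing the weighted sum of error probabilities. The effect size and sample size below are illustrative, and this is a simplified stand-in rather than the authors' exact procedure:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_ppf(p):
    """Inverse standard normal CDF by bisection (adequate for a sketch)."""
    lo, hi = -10.0, 10.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def optimal_alpha(effect, n, w1=1.0, w2=1.0, grid=2000):
    """One-sided z-test with standardized effect size `effect` and n
    samples: return the alpha minimizing w1*P(Type I) + w2*P(Type II)."""
    shift = effect * math.sqrt(n)            # noncentrality of z under H1
    best_alpha, best_cost = None, float("inf")
    for i in range(1, grid):
        alpha = i / grid
        z_crit = norm_ppf(1.0 - alpha)
        beta = norm_cdf(z_crit - shift)      # miss probability under H1
        cost = w1 * alpha + w2 * beta
        if cost < best_cost:
            best_alpha, best_cost = alpha, cost
    return best_alpha

# With a modest effect (d = 0.5) and n = 25, the balanced optimum lands
# near alpha ≈ 0.11, well above the conventional 0.05.
a = optimal_alpha(0.5, 25)
```

    The weights w1 and w2 are where the relative seriousness of Type I versus Type II errors enters, which is the paper's central point about moving beyond a fixed 0.05.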

  2. A Longitudinal Examination of Hope and Optimism and Their Role in Type 1 Diabetes in Youths.

    PubMed

    Van Allen, Jason; Steele, Ric G; Nelson, Michael B; Peugh, James; Egan, Anna; Clements, Mark; Patton, Susana R

    2016-08-01

    To test the longitudinal associations between hope and optimism and health outcomes (i.e., HbA1c and self-monitored blood glucose [SMBG]) among youths with Type 1 diabetes mellitus (T1DM) over a 6-month period. A total of 110 participants (aged 10-16 years) completed study measures at Time 1, and 81 completed measures at Time 2. Analyses examined hope and optimism as predictors of change in health outcomes, and examined SMBG as a mediator of the relationship between hope and optimism, and HbA1c. Change in hope, but not optimism, was associated with change in SMBG and HbA1c. Change in SMBG mediated the relationship between change in hope and HbA1c, but not between optimism and HbA1c. It may be beneficial to assess hope in pediatric T1DM patients to identify youths who may be at risk for poor diabetes management, and to test the benefit of hope-based intervention efforts in clinical studies. © The Author 2015. Published by Oxford University Press on behalf of the Society of Pediatric Psychology. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  3. An Empirical Comparison of Seven Iterative and Evolutionary Function Optimization Heuristics

    NASA Technical Reports Server (NTRS)

    Baluja, Shumeet

    1995-01-01

    This report is a repository of the results obtained from a large scale empirical comparison of seven iterative and evolution-based optimization heuristics. Twenty-seven static optimization problems, spanning six sets of problem classes which are commonly explored in genetic algorithm literature, are examined. The problem sets include job-shop scheduling, traveling salesman, knapsack, binpacking, neural network weight optimization, and standard numerical optimization. The search spaces in these problems range in size from 2^368 to 2^2040. The results indicate that using genetic algorithms for the optimization of static functions does not yield a benefit, in terms of the final answer obtained, over simpler optimization heuristics. Descriptions of the algorithms tested and the encodings of the problems are described in detail for reproducibility.
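
    As a flavor of the "simpler optimization heuristics" such comparisons include, a stochastic hillclimber on the standard OneMax bitstring problem (not one of the report's 27 problems) fits in a few lines:

```python
import random

def hillclimb_onemax(n=64, iters=5000, seed=2):
    """Stochastic hillclimbing on OneMax (maximize the number of 1 bits),
    one of the simple baselines typically pitted against genetic
    algorithms: flip a random bit, keep the flip only if it helps."""
    rng = random.Random(seed)
    x = [rng.randrange(2) for _ in range(n)]
    score = sum(x)
    for _ in range(iters):
        i = rng.randrange(n)
        if x[i] == 0:            # on OneMax, flipping a 0 always improves
            x[i], score = 1, score + 1
        # flips that would lower the score are simply rejected
    return score
```

    With iters well above n·ln(n) the climber reaches the optimum n with overwhelming probability, which is the kind of baseline performance a GA must beat to justify its extra machinery.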

  4. SU-E-T-446: Group-Sparsity Based Angle Generation Method for Beam Angle Optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gao, H

    2015-06-15

    Purpose: This work is to develop an effective algorithm for beam angle optimization (BAO), with emphasis on enabling further improvement over existing treatment-dependent templates based on clinical knowledge and experience. Methods: The proposed BAO algorithm uses a priori beam angle templates as the initial guess and iteratively generates angular updates for this initial set (the angle generation method), improving dose conformality as measured quantitatively by the objective function. During each iteration, "the test angle" is selected from the initial set, and group-sparsity based fluence map optimization is used to identify "the candidate angle" for updating it: all angles in the initial set except "the test angle" ("the fixed set") are set free, i.e., carry no group-sparsity penalty, while the remaining angles, including "the test angle", form "the working set". "The candidate angle" is then chosen as the angle in "the working set" with locally maximal group sparsity and the smallest objective function value; it replaces "the test angle" if "the fixed set" plus "the candidate angle" yields a smaller objective function value under standard fluence map optimization (without group-sparsity regularization). The other angles in the initial set are selected in turn as "the test angle", and this chain of updates is iterated until a full loop identifies no new angular update. Results: Tests on the MGH public prostate dataset demonstrated the effectiveness of the proposed BAO algorithm; for example, the optimized angular set was better than the MGH template. Conclusion: A new BAO algorithm based on the angle generation method via group sparsity is proposed, with improved dose conformality relative to the given template.
Hao Gao was partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000) and the Shanghai Pujiang Talent Program (#14PJ1404500)

  5. Parallelization of Program to Optimize Simulated Trajectories (POST3D)

    NASA Technical Reports Server (NTRS)

    Hammond, Dana P.; Korte, John J. (Technical Monitor)

    2001-01-01

    This paper describes the parallelization of the Program to Optimize Simulated Trajectories (POST3D). POST3D uses a gradient-based optimization algorithm that reaches an optimum design point by moving from one design point to the next. The gradient calculations required to complete the optimization process dominate the computational time and have been parallelized using a Single Program Multiple Data (SPMD) approach on a distributed-memory NUMA (non-uniform memory access) architecture. The Origin2000 was used for the tests presented.

  6. Pitch Guidance Optimization for the Orion Abort Flight Tests

    NASA Technical Reports Server (NTRS)

    Stillwater, Ryan Allanque

    2010-01-01

    The National Aeronautics and Space Administration created the Constellation program to develop the next generation of manned space vehicles and launch vehicles. The Orion abort system is initiated in the event of an unsafe condition during launch. The system has a controller gains schedule that can be tuned to reduce the attitude errors between the simulated Orion abort trajectories and the guidance trajectory. A program was created that uses the method of steepest descent to tune the pitch gains schedule by an automated procedure. The gains schedule optimization was applied to three potential abort scenarios; each scenario tested using the optimized gains schedule resulted in reduced attitude errors when compared to the Orion production gains schedule.
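
    The method of steepest descent used to tune the gains schedule follows the standard update x ← x − α∇f(x). A minimal generic sketch (the quadratic objective, fixed step size, and variable names below are illustrative placeholders, not the Orion tooling):

```python
def steepest_descent(f, grad, x0, step=0.1, iters=200, tol=1e-12):
    """Minimize f by repeatedly stepping against its gradient."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        if sum(gi * gi for gi in g) < tol:  # stop once the gradient is ~0
            break
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Illustrative "attitude error" objective with minimum at (3, -1)
f = lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2
grad = lambda x: [2 * (x[0] - 3), 2 * (x[1] + 1)]
best = steepest_descent(f, grad, [0.0, 0.0])
```

An automated tuning procedure of the kind the abstract describes would wrap such a loop around a simulation that maps gains to attitude error.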

  7. Detection of Mycoplasma hyopneumoniae by polymerase chain reaction in swine presenting respiratory problems

    PubMed Central

    Yamaguti, M.; Muller, E.E.; Piffer, A.I.; Kich, J.D.; Klein, C.S.; Kuchiishi, S.S.

    2008-01-01

    Since Mycoplasma hyopneumoniae isolation in appropriate media is a difficult task and impractical for daily routine diagnostics, Nested-PCR (N-PCR) techniques are currently used to improve the direct diagnostic sensitivity of Swine Enzootic Pneumonia. In a first experiment, this paper describes an N-PCR technique optimization based on three variables: different sampling sites, sample transport media, and DNA extraction methods, using eight pigs. Based on the optimization results, a second experiment was conducted to test validity using 40 animals. In conclusion, the results of the N-PCR optimization and validation allow us to recommend this test as a routine monitoring diagnostic method for Mycoplasma hyopneumoniae infection in swine herds. PMID:24031248

  8. Evaluation of the selection methods used in the exIWO algorithm based on the optimization of multidimensional functions

    NASA Astrophysics Data System (ADS)

    Kostrzewa, Daniel; Josiński, Henryk

    2016-06-01

    The expanded Invasive Weed Optimization algorithm (exIWO) is an optimization metaheuristic modelled on the original IWO, which is inspired by the dynamic growth of a weed colony. The authors of the present paper have modified the exIWO algorithm by introducing a set of both deterministic and non-deterministic strategies for the selection of individuals. The goal of the project was to evaluate the modified exIWO by testing its usefulness for the optimization of multidimensional numerical functions. The optimized functions - Griewank, Rastrigin, and Rosenbrock - are frequently used as benchmarks because of their characteristics.
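
    For reference, the three benchmarks named above have standard closed forms, each with global minimum 0 (at the origin for Rastrigin and Griewank, at (1, …, 1) for Rosenbrock). A sketch of the usual definitions:

```python
import numpy as np

def rastrigin(x):
    """10n + sum(x_i^2 - 10 cos(2 pi x_i)); highly multimodal."""
    x = np.asarray(x, dtype=float)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def griewank(x):
    """1 + sum(x_i^2)/4000 - prod(cos(x_i / sqrt(i))); many shallow minima."""
    x = np.asarray(x, dtype=float)
    i = np.arange(1, x.size + 1)
    return 1 + np.sum(x**2) / 4000 - np.prod(np.cos(x / np.sqrt(i)))

def rosenbrock(x):
    """sum(100 (x_{i+1} - x_i^2)^2 + (1 - x_i)^2); curved narrow valley."""
    x = np.asarray(x, dtype=float)
    return np.sum(100 * (x[1:] - x[:-1]**2)**2 + (1 - x[:-1])**2)
```

These implementations are useful for sanity-checking any optimizer discussed on this page against known optima.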

  9. Gravity inversion of a fault by Particle swarm optimization (PSO).

    PubMed

    Toushmalani, Reza

    2013-01-01

    Particle swarm optimization (PSO) is a heuristic global optimization method based on swarm intelligence, originating from research on bird flocking and fish schooling behavior. In this paper we introduce and apply this method to a gravity inverse problem: determining the shape of a fault whose gravity anomaly is known. Application of the proposed algorithm to this problem has proven its capability to deal with difficult optimization problems. The technique proved to work efficiently when tested on a number of models.
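
    A minimal global-best PSO of the kind the abstract describes can be sketched as follows; the sphere test function, swarm size, and inertia/acceleration coefficients are generic textbook choices, not the paper's fault-inversion setup:

```python
import numpy as np

def pso_minimize(f, dim, bounds, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize f over a box with a basic global-best PSO."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros((n_particles, dim))              # velocities
    pbest = x.copy()                              # personal bests
    pbest_val = np.apply_along_axis(f, 1, x)
    gbest = pbest[np.argmin(pbest_val)].copy()    # global best
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # velocity update: inertia + cognitive pull + social pull
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(f, 1, x)
        better = val < pbest_val
        pbest[better] = x[better]
        pbest_val[better] = val[better]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, float(pbest_val.min())

# Example: recover the minimum of the 2-D sphere function
best_x, best_v = pso_minimize(lambda z: float((z ** 2).sum()), 2, (-10.0, 10.0))
```

In an inverse problem such as the fault-shape estimation above, f would instead be the misfit between the observed and modeled gravity anomaly.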

  10. A new improved artificial bee colony algorithm for ship hull form optimization

    NASA Astrophysics Data System (ADS)

    Huang, Fuxin; Wang, Lijue; Yang, Chi

    2016-04-01

    The artificial bee colony (ABC) algorithm is a relatively new swarm intelligence-based optimization algorithm. Its simplicity of implementation, relatively few parameter settings and promising optimization capability make it widely used in different fields. However, it has problems of slow convergence due to its solution search equation. Here, a new solution search equation based on a combination of the elite solution pool and the block perturbation scheme is proposed to improve the performance of the algorithm. In addition, two different solution search equations are used by employed bees and onlooker bees to balance the exploration and exploitation of the algorithm. The developed algorithm is validated by a set of well-known numerical benchmark functions. It is then applied to optimize two ship hull forms with minimum resistance. The tested results show that the proposed new improved ABC algorithm can outperform the ABC algorithm in most of the tested problems.
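
    For orientation, the baseline ABC that the paper improves upon generates neighbor solutions via v_ij = x_ij + φ(x_ij − x_kj) with random φ ∈ [−1, 1]; the elite-solution-pool and block-perturbation refinements are the paper's contribution and are not reproduced in this textbook sketch (all parameters illustrative):

```python
import numpy as np

def abc_minimize(f, dim, bounds, n_food=20, iters=200, limit=30, seed=0):
    """Minimize f with a textbook artificial bee colony (ABC)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_food, dim))   # food sources (solutions)
    val = np.apply_along_axis(f, 1, x)
    trials = np.zeros(n_food, dtype=int)     # stagnation counters

    def try_neighbor(i):
        # v_ij = x_ij + phi * (x_ij - x_kj) on one random dimension j
        k = rng.choice([j for j in range(n_food) if j != i])
        j = rng.integers(dim)
        cand = x[i].copy()
        cand[j] += rng.uniform(-1, 1) * (x[i, j] - x[k, j])
        cand = np.clip(cand, lo, hi)
        cv = f(cand)
        if cv < val[i]:                      # greedy selection
            x[i], val[i], trials[i] = cand, cv, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):              # employed bee phase
            try_neighbor(i)
        fit = 1.0 / (1.0 + val - val.min())  # fitness for roulette wheel
        p = fit / fit.sum()
        for _ in range(n_food):              # onlooker bee phase
            try_neighbor(int(rng.choice(n_food, p=p)))
        worn = trials >= limit               # scout phase: abandon stale sources
        if worn.any():
            x[worn] = rng.uniform(lo, hi, (int(worn.sum()), dim))
            val[worn] = np.apply_along_axis(f, 1, x[worn])
            trials[worn] = 0
    i_best = int(np.argmin(val))
    return x[i_best], float(val[i_best])
```

The slow convergence the abstract mentions stems from the single-dimension random perturbation above; the paper's elite pool biases that perturbation toward good solutions.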

  11. Design optimization of a high specific speed Francis turbine runner

    NASA Astrophysics Data System (ADS)

    Enomoto, Y.; Kurosawa, S.; Kawajiri, H.

    2012-11-01

    The Francis turbine is used in many hydroelectric power stations. This paper presents the development of hydraulic performance in a high specific speed Francis turbine runner. In order to achieve improvements in turbine efficiency throughout a wide operating range, a new runner design method, which combines the latest Computational Fluid Dynamics (CFD) and a multi-objective optimization method with an existing design system, was applied in this study. The validity of the new design system was evaluated by model performance tests, which confirmed that the optimized runner presented higher efficiency than the originally designed runner. Besides runner optimization, the instability vibration which occurred at high part-load operating conditions was investigated by model tests and gas-liquid two-phase flow analysis. These confirmed that the instability vibration was caused by an oval-cross-section whirl arising from recirculation flow near the runner cone wall.

  12. Frequency optimization in the eddy current test for high purity niobium

    NASA Astrophysics Data System (ADS)

    Joung, Mijoung; Jung, Yoochul; Kim, Hyungjin

    2017-01-01

    The eddy current test (ECT) is frequently used as a non-destructive method to check for defects in the high purity niobium (RRR300, Residual Resistivity Ratio) of a superconducting radio frequency (SRF) cavity. Determining an optimal frequency corresponding to the specific material properties and probe specification is a very important step. ECT experiments for high purity Nb were performed to determine the optimal frequency using a standard sample of high purity Nb with artificial defects. The target depth was chosen according to the treatment steps that the niobium receives as the SRF cavity material. The results were analysed in terms of selectivity, i.e., how specifically the response depends on the size of the defects. According to the results, the optimal frequency was determined to be 200 kHz, and a few features of the ECT for high purity Nb were observed.

  13. Texture and haptic cues in slant discrimination: reliability-based cue weighting without statistically optimal cue combination

    NASA Astrophysics Data System (ADS)

    Rosas, Pedro; Wagemans, Johan; Ernst, Marc O.; Wichmann, Felix A.

    2005-05-01

    A number of models of depth-cue combination suggest that the final depth percept results from a weighted average of independent depth estimates based on the different cues available. The weight of each cue in such an average is thought to depend on the reliability of each cue. In principle, such a depth estimation could be statistically optimal in the sense of producing the minimum-variance unbiased estimator that can be constructed from the available information. Here we test such models by using visual and haptic depth information. Different texture types produce differences in slant-discrimination performance, thus providing a means for testing a reliability-sensitive cue-combination model with texture as one of the cues to slant. Our results show that the weights for the cues were generally sensitive to their reliability but fell short of statistically optimal combination - we find reliability-based reweighting but not statistically optimal cue combination.

  14. Development of thermal control methods for specialized components and scientific instruments at very low temperatures (follow-on)

    NASA Technical Reports Server (NTRS)

    Wright, J. P.; Wilson, D. E.

    1976-01-01

    Many payloads currently proposed to be flown by the space shuttle system require long-duration cooling in the 3 to 200 K temperature range. Common requirements also exist for certain DOD payloads. Parametric design and optimization studies are reported for multistage and diode heat pipe radiator systems designed to operate in this temperature range. Also optimized are ground test systems for two long-life passive thermal control concepts operating under specified space environmental conditions. The ground test systems evaluated are ultimately intended to evolve into flight test qualification prototypes for early shuttle flights.

  15. Composite panel development at JPL

    NASA Technical Reports Server (NTRS)

    Mcelroy, Paul; Helms, Rich

    1988-01-01

    Parametric computer studies can be used in a cost-effective manner to determine optimized composite mirror panel designs. An InterDisciplinary computer Model (IDM) was created to aid in the development of high precision reflector panels for LDR. The material properties, thermal responses, structural geometries, and radio/optical precision are synergistically analyzed for specific panel designs. Promising panel designs are fabricated and tested so that comparison with panel test results can be used to verify performance prediction models and accommodate design refinement. The iterative approach of computer design and model refinement with performance testing and materials optimization has shown good results for LDR panels.

  16. SFDT-1 Camera Pointing and Sun-Exposure Analysis and Flight Performance

    NASA Technical Reports Server (NTRS)

    White, Joseph; Dutta, Soumyo; Striepe, Scott

    2015-01-01

    The Supersonic Flight Dynamics Test (SFDT) vehicle was developed to advance and test technologies of NASA's Low Density Supersonic Decelerator (LDSD) Technology Demonstration Mission. The first flight test (SFDT-1) occurred on June 28, 2014. To maximize the usefulness of the camera data, analysis was performed to optimize parachute visibility in the camera field of view during deployment and inflation, and to determine the probability of sun-exposure issues with the cameras given the vehicle heading and launch time. This paper documents the analysis, its results, and a comparison with SFDT-1 flight video.

  17. A Matrix-Free Algorithm for Multidisciplinary Design Optimization

    NASA Astrophysics Data System (ADS)

    Lambe, Andrew Borean

    Multidisciplinary design optimization (MDO) is an approach to engineering design that exploits the coupling between components or knowledge disciplines in a complex system to improve the final product. In aircraft design, MDO methods can be used to simultaneously design the outer shape of the aircraft and the internal structure, taking into account the complex interaction between the aerodynamic forces and the structural flexibility. Efficient strategies are needed to solve such design optimization problems and guarantee convergence to an optimal design. This work begins with a comprehensive review of MDO problem formulations and solution algorithms. First, a fundamental MDO problem formulation is defined from which other formulations may be obtained through simple transformations. Using these fundamental problem formulations, decomposition methods from the literature are reviewed and classified. All MDO methods are presented in a unified mathematical notation to facilitate greater understanding. In addition, a novel set of diagrams, called extended design structure matrices, are used to simultaneously visualize both data communication and process flow between the many software components of each method. For aerostructural design optimization, modern decomposition-based MDO methods cannot efficiently handle the tight coupling between the aerodynamic and structural states. This fact motivates the exploration of methods that can reduce the computational cost. A particular structure in the direct and adjoint methods for gradient computation motivates the idea of a matrix-free optimization method. A simple matrix-free optimizer is developed based on the augmented Lagrangian algorithm. This new matrix-free optimizer is tested on two structural optimization problems and one aerostructural optimization problem. 
The results indicate that the matrix-free optimizer is able to efficiently solve structural and multidisciplinary design problems with thousands of variables and constraints. On the aerostructural test problem formulated with thousands of constraints, the matrix-free optimizer is estimated to reduce the total computational time by up to 90% compared to conventional optimizers.

  19. Biodegradation of kerosene: Study of growth optimization and metabolic fate of P. janthinellum SDX7

    PubMed Central

    Khan, Shamiyan R.; Nirmal, J.I. Kumar; Kumar, Rita N.; Patel, Jignasha G.

    2015-01-01

    Penicillium janthinellum SDX7 was isolated from aged petroleum hydrocarbon-affected soil at the site of Anand, Gujarat, India, and was tested at different pH values, temperatures, agitation rates and kerosene concentrations for optimal growth. The isolate was capable of degrading up to 95%, 63% and 58% of 1%, 3% and 5% kerosene, respectively, after a period of 16 days, at optimal growth conditions of pH 6.0, 30 °C and 180 rpm agitation. The GC/MS chromatograms revealed that the n-alkane fractions are easily degraded; however, the rate might be lower for branched alkanes, n-alkylaromatics, cyclic alkanes and polynuclear aromatics. The test doses caused a concentration-dependent depletion of carbohydrates of P. janthinellum SDX7 by 3% to 80%, proteins by 4% to 81% and amino acids by 8% to 95% up to 16 days of treatment. The optimal concentration of 3% kerosene resulted in the least reduction of the metabolites of P. janthinellum, such as carbohydrates, proteins and amino acids, with optimal growth compared to the 5% and 1% (v/v) kerosene doses on the 12th and 16th day of exposure. Phenols were found to be elevated by 43% to 66% at lower and higher concentrations during the experimental period. Fungal isolate P. janthinellum SDX7 was also tested for growth on various xenobiotic compounds. PMID:26273254

  20. Biodegradation of kerosene: Study of growth optimization and metabolic fate of P. janthinellum SDX7.

    PubMed

    Khan, Shamiyan R; Nirmal, J I Kumar; Kumar, Rita N; Patel, Jignasha G

    2015-06-01

    Penicillium janthinellum SDX7 was isolated from aged petroleum hydrocarbon-affected soil at the site of Anand, Gujarat, India, and was tested at different pH values, temperatures, agitation rates and kerosene concentrations for optimal growth. The isolate was capable of degrading up to 95%, 63% and 58% of 1%, 3% and 5% kerosene, respectively, after a period of 16 days, at optimal growth conditions of pH 6.0, 30 °C and 180 rpm agitation. The GC/MS chromatograms revealed that the n-alkane fractions are easily degraded; however, the rate might be lower for branched alkanes, n-alkylaromatics, cyclic alkanes and polynuclear aromatics. The test doses caused a concentration-dependent depletion of carbohydrates of P. janthinellum SDX7 by 3% to 80%, proteins by 4% to 81% and amino acids by 8% to 95% up to 16 days of treatment. The optimal concentration of 3% kerosene resulted in the least reduction of the metabolites of P. janthinellum, such as carbohydrates, proteins and amino acids, with optimal growth compared to the 5% and 1% (v/v) kerosene doses on the 12th and 16th day of exposure. Phenols were found to be elevated by 43% to 66% at lower and higher concentrations during the experimental period. Fungal isolate P. janthinellum SDX7 was also tested for growth on various xenobiotic compounds.

  1. Optimization of helicopter airframe structures for vibration reduction considerations, formulations and applications

    NASA Technical Reports Server (NTRS)

    Murthy, T. Sreekanta

    1988-01-01

    Several key issues involved in the application of formal optimization techniques to helicopter airframe structures for vibration reduction are addressed. Considerations important in the optimization of real airframe structures are discussed, including those necessary to establish a relevant set of design variables, constraints, and objectives appropriate to the conceptual, preliminary, and detailed design phases and to the ground and flight test phases of airframe design. A methodology is suggested for optimization of airframes in various phases of design. Optimization formulations unique to helicopter airframes are described, and expressions for vibration-related functions are derived. Using a recently developed computer code, the optimization of a Bell AH-1G helicopter airframe is demonstrated.

  2. A Comprehensive Review of Swarm Optimization Algorithms

    PubMed Central

    2015-01-01

    Many swarm optimization algorithms have been introduced since the early 60s, from Evolutionary Programming to the most recent, Grey Wolf Optimization. All of these algorithms have demonstrated their potential to solve many optimization problems. This paper provides an in-depth survey of well-known optimization algorithms. Selected algorithms are briefly explained and compared with each other comprehensively through experiments conducted using thirty well-known benchmark functions. Their advantages and disadvantages are also discussed. A number of statistical tests are then carried out to determine the significant performances. The results indicate the overall advantage of Differential Evolution (DE), closely followed by Particle Swarm Optimization (PSO), compared with the other considered approaches. PMID:25992655

  3. Average optimal DPOAE primary tone levels in normal-hearing adults.

    PubMed

    Marcrum, Steven C; Kummer, Peter; Kreitmayer, Christoph; Steffens, Thomas

    2016-01-01

    Despite great progress towards optimizing DPOAE primary tone characteristics, factors such as stimulus and intra-subject emission variability have not been addressed. The purpose of this study was to identify optimal primary tone level relationships when these sources of variability were acknowledged, and to identify any influences of test frequency. Following coupler-based measurements assessing primary tone level stability, two experiments were conducted. In experiment 1, DPOAE test-retest reliability without probe replacement was measured for f2 = 1-6 kHz with L1 = L2 = 65 dB SPL. In experiment 2, optimal L1-L2 relationships were identified for f2 = 1-6 kHz. For 20 ≤ L2 ≤ 75 dB SPL, L1 was varied 15 dB SPL above and below the recommendation of L1 = 0.4 L2 + 39 [dB SPL]. Eleven normal-hearing adults participated in experiment 1. Thirty normal-hearing adults participated in experiment 2. Stimulus variability did not exceed 0.1 dB SPL. DPOAE reliability testing revealed an across-frequency mean standard error of measurement of 0.52 dB SPL. The average optimal L1-L2 relationship was described by L1 = 0.49 L2 + 41 [dB SPL]. A significant effect of frequency was identified for 6 kHz. Including relevant sources of variability improves internal validity of a primary tone level optimization formula.
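
    The two level relationships quoted above are one-line computations; a small sketch comparing the study's average optimal rule with the earlier recommendation it varied around (coefficients taken verbatim from the abstract):

```python
def l1_optimal(l2):
    """Average optimal primary tone level L1 (dB SPL) for a given L2,
    per the regression reported in this study: L1 = 0.49 L2 + 41."""
    return 0.49 * l2 + 41.0

def l1_recommended(l2):
    """Earlier recommendation the study varied around: L1 = 0.4 L2 + 39."""
    return 0.4 * l2 + 39.0

# e.g. for L2 = 65 dB SPL: l1_optimal(65) = 72.85, l1_recommended(65) = 65.0
```

Note the study varied L1 by ±15 dB SPL around the earlier recommendation for 20 ≤ L2 ≤ 75 dB SPL, so both formulas apply only within that range.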

  4. Stimulation of abdominal and upper thoracic muscles with surface electrodes for respiration and cough: Acute studies in adult canines.

    PubMed

    Walter, James S; Posluszny, Joseph; Dieter, Raymond; Dieter, Robert S; Sayers, Scott; Iamsakul, Kiratipath; Staunton, Christine; Thomas, Donald; Rabbat, Mark; Singh, Sanjay

    2018-05-01

    To optimize maximal respiratory responses with surface stimulation over abdominal and upper thorax muscles using a 12-Channel Neuroprosthetic Platform. Following instrumentation, six anesthetized adult canines were hyperventilated sufficiently to produce respiratory apnea. Six abdominal tests optimized electrode arrangements and stimulation parameters using bipolar sets of 4.5 cm square electrodes. Tests in the upper thorax optimized electrode locations, and forelimb movement was limited to slight-to-moderate. During combined muscle stimulation tests, upper thoracic stimulation was followed immediately by abdominal stimulation. Finally, a model of glottal closure for cough was conducted with the goal of increased peak expiratory flow. Optimized stimulation of abdominal muscles included three sets of bilateral surface electrodes located 4.5 cm dorsal to the lateral line and from the 8th intercostal space to caudal to the 13th rib, 80 or 100 mA current, and 50 Hz stimulation frequency. The maximal expired volume was 343 ± 23 ml (n=3). Optimized upper thorax stimulation included a single bilateral set of electrodes located over the 2nd interspace, 60 to 80 mA, and 50 Hz. The maximal inspired volume was 304 ± 54 ml (n=4). Sequential stimulation of the two muscle groups increased the volume to 600 ± 152 ml (n=2), and the glottal closure maneuver increased the flow. Studies in an adult canine model identified optimal surface stimulation methods for upper thorax and abdominal muscles to induce sufficient volumes for ventilation and cough. Further study with this neuroprosthetic platform is warranted.

  5. Testing the Birth Unit Design Spatial Evaluation Tool (BUDSET) in Australia: a pilot study.

    PubMed

    Foureur, Maralyn J; Leap, Nicky; Davis, Deborah L; Forbes, Ian F; Homer, Caroline E S

    2011-01-01

    To pilot test the Birth Unit Design Spatial Evaluation Tool (BUDSET) in an Australian maternity care setting to determine whether such an instrument can measure the optimality of different birth settings. Optimally designed spaces to give birth are likely to influence a woman's ability to experience physiologically normal labor and birth. This is important in the current industrialized environment, where increased caesarean section rates are causing concerns. The measurement of an optimal birth space is currently impossible, because there are limited tools available. A quantitative study was undertaken to pilot test the discriminant ability of the BUDSET in eight maternity units in New South Wales, Australia. Five auditors trained in the use of the BUDSET assessed the birth units using the BUDSET, which is based on 18 design principles and is divided into four domains (Fear Cascade, Facility, Aesthetics, and Support) with three to eight assessable items in each. Data were independently collected in eight birth units. Values for each of the domains were aggregated to provide an overall Optimality Score for each birth unit. A range of Optimality Scores was derived for each of the birth units (from 51 to 77 out of a possible 100 points). The BUDSET identified units with low-scoring domains. Essentially these were older units and conventional labor ward settings. The BUDSET provides a way to assess the optimality of birth units and determine which domain areas may need improvement. There is potential for improvements to existing birth spaces, and considerable improvement can be made with simple low-cost modifications. Further research is needed to validate the tool.

  6. Forecasting of dissolved oxygen in the Guanting reservoir using an optimized NGBM (1,1) model.

    PubMed

    An, Yan; Zou, Zhihong; Zhao, Yanfei

    2015-03-01

    An optimized nonlinear grey Bernoulli model was proposed, using a particle swarm optimization algorithm to solve the parameter optimization problem. In addition, each item in the first-order accumulated generating sequence was set in turn as the initial condition to determine which alternative would yield the highest forecasting accuracy. To test the forecasting performance, the optimized models with different initial conditions were then used to simulate dissolved oxygen concentrations at the Guanting reservoir inlet and outlet (China). The empirical results show that the optimized model can remarkably improve forecasting accuracy, and that the particle swarm optimization technique is a good tool for solving parameter optimization problems. What's more, an optimized model with an initial condition that performs well in in-sample simulation may not do as well in out-of-sample forecasting. Copyright © 2015. Published by Elsevier B.V.

  7. All-in-one model for designing optimal water distribution pipe networks

    NASA Astrophysics Data System (ADS)

    Aklog, Dagnachew; Hosoi, Yoshihiko

    2017-05-01

    This paper discusses the development of an easy-to-use, all-in-one model for designing optimal water distribution networks. The model combines different optimization techniques into a single package in which a user can easily choose which optimizer to use and compare the results of different optimizers to gain confidence in the performance of the models. At present, three optimization techniques are included in the model: linear programming (LP), genetic algorithms (GA) and a heuristic one-by-one reduction method (OBORM) that was previously developed by the authors. The optimizers were tested on a number of benchmark problems and performed very well in terms of finding optimal or near-optimal solutions with reasonable computational effort. The results indicate that the model effectively addresses the issues of complexity and limited performance trust associated with previous models and can thus be used for practical purposes.

  8. (Too) optimistic about optimism: the belief that optimism improves performance.

    PubMed

    Tenney, Elizabeth R; Logg, Jennifer M; Moore, Don A

    2015-03-01

    A series of experiments investigated why people value optimism and whether they are right to do so. In Experiments 1A and 1B, participants prescribed more optimism for someone implementing decisions than for someone deliberating, indicating that people prescribe optimism selectively, when it can affect performance. Furthermore, participants believed optimism improved outcomes when a person's actions had considerable, rather than little, influence over the outcome (Experiment 2). Experiments 3 and 4 tested the accuracy of this belief; optimism improved persistence, but it did not improve performance as much as participants expected. Experiments 5A and 5B found that participants overestimated the relationship between optimism and performance even when their focus was not on optimism exclusively. In summary, people prescribe optimism when they believe it has the opportunity to improve the chance of success - unfortunately, people may be overly optimistic about just how much optimism can do. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  9. SU-E-T-367: Optimization of DLG Using TG-119 Test Cases and a Weighted Mean Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sintay, B; Vanderstraeten, C; Terrell, J

    2014-06-01

    Purpose: Optimization of the dosimetric leaf gap (DLG) is an important step in commissioning the Eclipse treatment planning system for sliding window intensity-modulated radiation therapy (SW-IMRT) and RapidArc. Often the values needed for optimal dose delivery differ markedly from those measured at commissioning. We present a method to optimize this value using the AAPM TG-119 test cases. Methods: For SW-IMRT and RapidArc, TG-119 based test plans were created using a water-equivalent phantom. Dose distributions measured on film and ion chamber (IC) readings taken in low-gradient regions within the targets were analyzed separately. Since DLG is a single value per energy, SW-IMRT and RapidArc must be considered simultaneously. Plans were recalculated using a linear sweep from 0.02 cm (the minimum DLG) to 0.3 cm. The calculated point doses were compared to the measured doses for each plan, and based on these comparisons an optimal DLG value was computed for each plan. TG-119 cases are designed to push the system in various ways; thus, a weighted mean of the DLG was computed in which the relative importance of each type of plan was given a score from 0.0 to 1.0. Finally, SW-IMRT and RapidArc are assigned an overall weight based on clinical utilization. Our routine patient-QA (PQA) process was performed as independent validation. Results: For a Varian TrueBeam, the optimized DLG varied with σ = 0.044 cm for SW-IMRT and σ = 0.035 cm for RapidArc. The difference between the weighted mean SW-IMRT and RapidArc values was 0.038 cm. We predicted utilization of 25% SW-IMRT and 75% RapidArc. The resulting DLG was ~1 mm different from that found by commissioning and produced an average error of <1% for SW-IMRT and RapidArc PQA test cases separately. Conclusion: The weighted mean method presented is a useful tool for determining an optimal DLG value for commissioning Eclipse.
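    The weighted-mean step described above is simple arithmetic; a minimal sketch follows, where the per-plan DLG values, importance weights, and utilization split are illustrative stand-ins, not the study's data:

    ```python
    def weighted_mean_dlg(plans):
        """plans: list of (optimal_dlg_cm, importance_weight) pairs."""
        total_weight = sum(w for _, w in plans)
        return sum(d * w for d, w in plans) / total_weight

    # Hypothetical per-plan optimal DLGs (cm) with importance scores 0.0-1.0.
    sw_imrt = weighted_mean_dlg([(0.15, 1.0), (0.20, 0.8), (0.11, 0.5)])
    rapidarc = weighted_mean_dlg([(0.13, 1.0), (0.17, 0.6)])

    # Blend the two modalities by predicted clinical utilization (25% / 75%).
    overall_dlg = 0.25 * sw_imrt + 0.75 * rapidarc
    ```

    The final blend is itself just another weighted mean, so a single DLG per energy can serve both delivery techniques.
    
    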

  10. An efficient and accurate solution methodology for bilevel multi-objective programming problems using a hybrid evolutionary-local-search algorithm.

    PubMed

    Deb, Kalyanmoy; Sinha, Ankur

    2010-01-01

    Bilevel optimization problems involve two optimization tasks (upper and lower level), in which every feasible upper level solution must correspond to an optimal solution to a lower level optimization problem. These problems commonly appear in many practical problem solving tasks including optimal control, process optimization, game-playing strategy developments, transportation problems, and others. However, they are commonly converted into a single level optimization problem by using an approximate solution procedure to replace the lower level optimization task. Although there exist a number of theoretical, numerical, and evolutionary optimization studies involving single-objective bilevel programming problems, not many studies look at the context of multiple conflicting objectives in each level of a bilevel programming problem. In this paper, we address certain intricate issues related to solving multi-objective bilevel programming problems, present challenging test problems, and propose a viable and hybrid evolutionary-cum-local-search based algorithm as a solution methodology. The hybrid approach performs better than a number of existing methodologies and scales well up to 40-variable difficult test problems used in this study. The population sizing and termination criteria are made self-adaptive, so that no additional parameters need to be supplied by the user. The study indicates a clear niche of evolutionary algorithms in solving such difficult problems of practical importance compared to their usual solution by a computationally expensive nested procedure. The study opens up many issues related to multi-objective bilevel programming and hopefully this study will motivate EMO and other researchers to pay more attention to this important and difficult problem solving activity.
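    The nested structure described above can be illustrated with a toy problem (entirely hypothetical, not from the paper): every upper-level candidate x requires its lower-level problem to be solved to optimality before the upper objective can even be evaluated, which is what makes the naive nested procedure so expensive.

    ```python
    def lower_level_optimum(x):
        # Lower level: minimize (y - x)^2 over y. Here the optimum y* = x is
        # known in closed form; in general this is a full optimization run.
        return x

    def upper_objective(x):
        y = lower_level_optimum(x)    # must solve the lower level first
        return x ** 2 + (y - 1) ** 2  # upper level evaluated at (x, y*(x))

    # Brute-force the upper level over a grid (a stand-in for the
    # evolutionary search over upper-level variables).
    best_x = min((i / 100 for i in range(-200, 201)), key=upper_objective)
    ```

    With y* = x, the upper objective reduces to x² + (x − 1)², minimized at x = 0.5; replacing the closed-form lower level with an iterative solver recreates the cost the hybrid algorithm is designed to avoid.
    
    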

  11. Implementation of design of experiments approach for the micronization of a drug with a high brittle-ductile transition particle diameter.

    PubMed

    Yazdi, Ashkan K; Smyth, Hugh D C

    2017-03-01

    To optimize air-jet milling conditions of ibuprofen (IBU) using the design of experiments (DoE) method, and to test the generalizability of the optimized conditions for processing another non-steroidal anti-inflammatory drug (NSAID). Bulk IBU was micronized using an Aljet mill according to a circumscribed central composite (CCC) design with grinding and pushing nozzle pressures (GrindP, PushP) varying from 20 to 110 psi. Output variables included yield and the particle diameters at the 50th and 90th percentiles (D50, D90). Following data analysis, the optimized conditions were identified and tested to produce IBU particles with a minimum size and an acceptable yield. Finally, indomethacin (IND) was milled using both the optimized conditions and the control. The CCC design yielded eight successful IBU milling runs out of the ten total; the remainder failed because of powder "blowback" from the feed hopper. DoE analysis identified optimal GrindP and PushP values of 75 and 65 psi. In subsequent validation experiments using the optimized conditions, the experimental D50 and D90 values (1.9 and 3.6 μm) corresponded closely with the values predicted by the DoE model. Additionally, the optimized conditions were superior to the control conditions for the micronization of IND, producing smaller D50 and D90 values (1.2 and 2.7 μm vs. 1.8 and 4.4 μm). The single-step air-jet milling optimization of IBU using the DoE approach thus elucidated optimal milling conditions that transferred directly to the micronization of IND.

  12. The 2013 Frank Stinchfield Award: Diagnosis of infection in the early postoperative period after total hip arthroplasty.

    PubMed

    Yi, Paul H; Cross, Michael B; Moric, Mario; Sporer, Scott M; Berger, Richard A; Della Valle, Craig J

    2014-02-01

    Diagnosis of periprosthetic joint infection (PJI) can be difficult in the early postoperative period after total hip arthroplasty (THA) because normal cues from the physical examination often are unreliable, and serological markers commonly used for diagnosis are elevated from the recent surgery. The purposes of this study were to determine the optimal cutoff values for erythrocyte sedimentation rate (ESR), C-reactive protein (CRP), synovial fluid white blood cell (WBC) count, and differential for diagnosing PJI in the early postoperative period after primary THA. We reviewed 6033 consecutive primary THAs and identified 73 patients (1.2%) who underwent reoperation for any reason within the first 6 weeks postoperatively. Thirty-six of these patients were infected according to modified Musculoskeletal Infection Society criteria. Mean values for the diagnostic tests were compared between groups, and receiver operating characteristic curves were generated along with the area under the curve (AUC) to determine test performance and optimal cutoff values for diagnosing infection. The best test for the diagnosis of PJI was the synovial fluid WBC count (AUC = 98%; optimal cutoff value 12,800 cells/μL) followed by the CRP (AUC = 93%; optimal cutoff value 93 mg/L), and synovial fluid differential (AUC = 91%; optimal cutoff value 89% PMN). The mean ESR (infected = 69 mm/hr, not infected = 46 mm/hr), CRP (infected = 192 mg/L, not infected = 30 mg/L), synovial fluid WBC count (infected = 84,954 cells/μL, not infected = 2391 cells/μL), and differential (infected = 91% polymorphonuclear cells [PMN], not infected = 63% PMN) all were significantly higher in the infected group. Optimal cutoff values for the diagnosis of PJI in the acute postoperative period were higher than those traditionally used for the diagnosis of chronic PJI. The serum CRP is an excellent screening test, whereas the synovial fluid WBC count is more specific.
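    Optimal cutoffs of the kind reported above are commonly chosen by maximizing Youden's J (sensitivity + specificity − 1) along the ROC curve; a minimal sketch with invented data follows (the study does not state which criterion it used, so this is one plausible method, not the authors' code):

    ```python
    def optimal_cutoff(values, infected):
        """Pick the cutoff maximizing Youden's J = sensitivity + specificity - 1."""
        n_pos = sum(infected)
        n_neg = len(infected) - n_pos
        best_j, best_cut = -1.0, None
        for cut in sorted(set(values)):
            tp = sum(1 for v, i in zip(values, infected) if i and v >= cut)
            tn = sum(1 for v, i in zip(values, infected) if not i and v < cut)
            j = tp / n_pos + tn / n_neg - 1   # sensitivity + specificity - 1
            if j > best_j:
                best_j, best_cut = j, cut
        return best_cut

    # Hypothetical synovial WBC counts (cells/uL) and infection labels.
    wbc = [900, 2400, 3100, 15000, 52000, 84000]
    labels = [False, False, False, True, True, True]
    ```

    With perfectly separated data like this toy example, the chosen cutoff is the smallest value in the infected group.
    
    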

  13. Development of optimized, graded-permeability axial groove heat pipes

    NASA Technical Reports Server (NTRS)

    Kapolnek, Michael R.; Holmes, H. Rolland

    1988-01-01

    Heat pipe performance can usually be improved by uniformly varying or grading wick permeability from end to end. A unique and cost effective method for grading the permeability of an axial groove heat pipe is described - selective chemical etching of the pipe casing. This method was developed and demonstrated on a proof-of-concept test article. The process improved the test article's performance by 50 percent. Further improvement is possible through the use of optimally etched grooves.

  14. Multi-strategy coevolving aging particle optimization.

    PubMed

    Iacca, Giovanni; Caraffini, Fabio; Neri, Ferrante

    2014-02-01

    We propose Multi-Strategy Coevolving Aging Particles (MS-CAP), a novel population-based algorithm for black-box optimization. In a memetic fashion, MS-CAP combines two components with complementary algorithm logics. In the first stage, each particle is perturbed independently along each dimension with a progressively shrinking (decaying) radius, and attracted towards the current best solution with an increasing force. In the second phase, the particles are mutated and recombined according to a multi-strategy approach in the fashion of the ensemble of mutation strategies in Differential Evolution. The proposed algorithm is tested, at different dimensionalities, on two complete black-box optimization benchmarks proposed at the Congress on Evolutionary Computation 2010 and 2013. To demonstrate the applicability of the approach, we also use MS-CAP to train a feedforward neural network modeling the kinematics of an 8-link robot manipulator. The numerical results show that MS-CAP, for the setting considered in this study, tends to outperform the state-of-the-art optimization algorithms on a large set of problems, thus resulting in a robust and versatile optimizer.
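    The first-stage mechanics (per-dimension perturbation with a decaying radius plus a growing attraction toward the current best) can be sketched as follows on a sphere function; this is an assumption-based illustration of the described idea, not the authors' implementation, and the decay rate and pull schedule are invented:

    ```python
    import random

    def cap_first_stage(f, particles, lo=-5.0, hi=5.0, iters=200):
        best = min(particles, key=f)[:]
        radius = (hi - lo) / 2
        for t in range(iters):
            pull = t / iters              # attraction force increases over time
            radius *= 0.97                # perturbation radius decays
            for p in particles:
                for d in range(len(p)):
                    trial = p[:]
                    trial[d] += random.uniform(-radius, radius)  # local move
                    trial[d] += pull * (best[d] - trial[d])      # pull to best
                    trial[d] = max(lo, min(hi, trial[d]))        # keep in bounds
                    if f(trial) < f(p):                          # greedy accept
                        p[:] = trial
                if f(p) < f(best):
                    best = p[:]
        return best

    sphere = lambda x: sum(v * v for v in x)
    random.seed(1)
    swarm = [[random.uniform(-5, 5) for _ in range(2)] for _ in range(10)]
    result = cap_first_stage(sphere, swarm)
    ```

    Because trial moves are only accepted when they improve, the best objective value is non-increasing over the run.
    
    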

  15. Many-to-Many Multicast Routing Schemes under a Fixed Topology

    PubMed Central

    Ding, Wei; Wang, Hongfa; Wei, Xuerui

    2013-01-01

    Many-to-many multicast routing can be extensively applied in computer or communication networks supporting various continuous multimedia applications. The paper focuses on the case where all users share a common communication channel while each user is both a sender and a receiver of messages in multicasting, as well as an end user. In this case, the multicast tree appears as a terminal Steiner tree (TeST). The problem of finding a TeST with a quality-of-service (QoS) optimization is frequently NP-hard. However, finding a many-to-many multicast tree with QoS optimization becomes tractable under a fixed topology. In this paper, we are concerned with three QoS optimization objectives for the multicast tree: minimum cost, minimum diameter, and maximum reliability. Each of the three optimization problems comes in two versions, centralized and decentralized. This paper uses dynamic programming to devise exact algorithms for the centralized and decentralized versions of each optimization problem. PMID:23589706

  16. Dendritic and Axonal Wiring Optimization of Cortical GABAergic Interneurons.

    PubMed

    Anton-Sanchez, Laura; Bielza, Concha; Benavides-Piccione, Ruth; DeFelipe, Javier; Larrañaga, Pedro

    2016-10-01

    The way in which a neuronal tree expands plays an important role in its functional and computational characteristics. We aimed to study the existence of an optimal neuronal design for different types of cortical GABAergic neurons. To do this, we hypothesized that both the axonal and dendritic trees of individual neurons optimize brain connectivity in terms of wiring length. We took the branching points of real three-dimensional neuronal reconstructions of the axonal and dendritic trees of different types of cortical interneurons and searched for the minimal wiring arborization structure that respects the branching points. We compared the minimal wiring arborization with real axonal and dendritic trees. We tested this optimization problem using a new approach based on graph theory and evolutionary computation techniques. We concluded that neuronal wiring is near-optimal in most of the tested neurons, although the wiring length of dendritic trees is generally nearer to the optimum. Therefore, wiring economy is related to the way in which neuronal arborizations grow irrespective of the marked differences in the morphology of the examined interneurons.

  17. An External Archive-Guided Multiobjective Particle Swarm Optimization Algorithm.

    PubMed

    Zhu, Qingling; Lin, Qiuzhen; Chen, Weineng; Wong, Ka-Chun; Coello Coello, Carlos A; Li, Jianqiang; Chen, Jianyong; Zhang, Jun

    2017-09-01

    The selection of swarm leaders (i.e., the personal best and global best) is important in the design of a multiobjective particle swarm optimization (MOPSO) algorithm. Such leaders are expected to effectively guide the swarm to approach the true Pareto optimal front. In this paper, we present a novel external archive-guided MOPSO algorithm (AgMOPSO), where the leaders for velocity update are all selected from the external archive. In our algorithm, multiobjective optimization problems (MOPs) are transformed into a set of subproblems using a decomposition approach, and then each particle is assigned accordingly to optimize each subproblem. A novel archive-guided velocity update method is designed to guide the swarm for exploration, and the external archive is also evolved using an immune-based evolutionary strategy. These proposed approaches speed up the convergence of AgMOPSO. The experimental results fully demonstrate the superiority of our proposed AgMOPSO in solving most of the test problems adopted, in terms of two commonly used performance measures. Moreover, the effectiveness of our proposed archive-guided velocity update method and immune-based evolutionary strategy is also experimentally validated on more than 30 test MOPs.

  18. A parameters optimization method for planar joint clearance model and its application for dynamics simulation of reciprocating compressor

    NASA Astrophysics Data System (ADS)

    Hai-yang, Zhao; Min-qiang, Xu; Jin-dong, Wang; Yong-bo, Li

    2015-05-01

    In order to improve the accuracy of dynamic response simulation for mechanisms with joint clearance, a parameter optimization method for the planar joint clearance contact force model is presented in this paper, and the optimized parameters are applied to the dynamic response simulation of a mechanism with an oversized joint clearance fault. By studying the effect of increased clearance on the parameters of the joint clearance contact force model, a relation between model parameters at different clearances was derived. The dynamic equation of a two-stage reciprocating compressor with four joint clearances was then developed using the Lagrange method, and a multi-body dynamic model built in ADAMS software was used to solve this equation. To obtain a simulated dynamic response much closer to that of experimental tests, the parameters of the joint clearance model, instead of the designed values, were optimized by a genetic algorithm approach. Finally, the optimized parameters were applied to simulate the dynamic response of the model with an oversized joint clearance fault according to the derived parameter relation. The dynamic response of the experimental test verified the effectiveness of this application.

  19. Optimal design of a shear magnetorheological damper for turning vibration suppression

    NASA Astrophysics Data System (ADS)

    Zhou, Y.; Zhang, Y. L.

    2013-09-01

    The smart material known as magnetorheological (MR) fluid is utilized to control turning vibration. Based on the structure of a common lathe, the CA6140, a shear MR damper is conceived by designing its structure and magnetic circuit. The vibration suppression effect of the damper is verified through dynamic analysis and simulation. Further, the magnetic circuit of the damper is optimized with the ANSYS parametric design language (APDL). In the optimization, the area of the magnetic circuit and the damping force are considered. After optimization, the damper's structure and its electrical energy efficiency are improved. Additionally, a comparative study of the damping forces obtained from the initial and optimal designs is conducted. A prototype of the developed MR damper is fabricated, and magnetic tests are performed to measure the magnetic flux intensities and the residual magnetism in the four damping gaps. The testing results are then compared with the simulated results. Finally, the vibration suppression experimental system is set up and cylindrical turning experiments are performed to investigate the working performance of the MR damper.

  20. Extensions of D-optimal Minimal Designs for Symmetric Mixture Models

    PubMed Central

    Raghavarao, Damaraju; Chervoneva, Inna

    2017-01-01

    The purpose of mixture experiments is to explore the optimum blends of mixture components, which will provide desirable response characteristics in finished products. D-optimal minimal designs have been considered for a variety of mixture models, including Scheffé's linear, quadratic, and cubic models. Usually, these D-optimal designs are minimally supported since they have just as many design points as the number of parameters. Thus, they lack the degrees of freedom to perform Lack of Fit tests. Also, the majority of the design points in D-optimal minimal designs are on the boundary: vertices, edges, or faces of the design simplex. In this paper, extensions of the D-optimal minimal designs are developed for a general mixture model to allow additional interior points in the design space, enabling prediction of the entire response surface. Also, a new strategy for adding multiple interior points for symmetric mixture models is proposed. We compare the proposed designs with Cornell's (1986) two ten-point designs for the Lack of Fit test by simulations. PMID:29081574
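    The D-criterion behind such designs ranks a candidate design by det(X'X) of its model matrix. A small sketch for a three-component Scheffé linear model (the design points are textbook examples, not the paper's designs) shows how adding the interior centroid point increases the determinant:

    ```python
    def det3(m):
        """Determinant of a 3x3 matrix by cofactor expansion."""
        (a, b, c), (d, e, f), (g, h, i) = m
        return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

    def d_criterion(design):
        # X'X for the Scheffé linear model: one column per mixture component.
        xtx = [[sum(r[i] * r[j] for r in design) for j in range(3)]
               for i in range(3)]
        return det3(xtx)

    vertices = [(1, 0, 0), (0, 1, 0), (0, 0, 1)]   # minimally supported design
    augmented = vertices + [(1/3, 1/3, 1/3)]       # plus the interior centroid
    ```

    For the vertex design X'X is the identity (determinant 1), while appending the centroid raises the determinant to 4/3, illustrating why interior points can be added without sacrificing D-efficiency.
    
    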

  1. Benchmarking optimization software with COPS.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dolan, E.D.; More, J.J.

    2001-01-08

    The COPS test set provides a modest selection of difficult nonlinearly constrained optimization problems from applications in optimal design, fluid dynamics, parameter estimation, and optimal control. In this report we describe version 2.0 of the COPS problems. The formulation and discretization of the original problems have been streamlined and improved, and new problems have been added. The presentation of COPS follows the original report, but the description of the problems has been streamlined. For each problem we discuss its formulation, with structural data summarized in Table 0.1. The aim of presenting these data is to provide an approximate idea of the size and sparsity of each problem. We also include the results of computational experiments with the LANCELOT, LOQO, MINOS, and SNOPT solvers. These computational experiments differ from the original results in that we have deleted problems that were considered to be too easy. Moreover, in the current version of the computational experiments, each problem is tested with four variations. An important difference between this report and the original is that the tables presenting the computational experiments are generated automatically from the testing script; this is explained in more detail in the report.

  2. Optimization of the Caco-2 permeability assay to screen drug compounds for intestinal absorption and efflux.

    PubMed

    Press, Barry

    2011-01-01

    In vitro permeability assays are a valuable tool for scientists during lead compound optimization. As a majority of discovery projects are focused on the development of orally bioavailable drugs, correlation of in vitro permeability data to in vivo absorption results is critical for understanding the structural-physicochemical relationship (SPR) of drugs exhibiting low levels of absorption. For more than a decade, the Caco-2 screening assay has remained a popular in vitro system to test compounds for both intestinal permeability and efflux liability. Despite advances in artificial membrane technology and in silico modeling systems, drug compounds still benefit from testing in cell-based epithelial monolayer assays for lead optimization. This chapter provides technical information for performing and optimizing the Caco-2 assay. In addition, techniques are discussed for dealing with some of the most pressing issues surrounding in vitro permeability assays (i.e., low aqueous solubility of test compounds and low postassay recovery). Insights are offered to help researchers avoid common pitfalls in the interpretation of in vitro permeability data, which can otherwise lead to misleading correlations with in vivo data.
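    The standard readout of such a monolayer assay is the apparent permeability coefficient, Papp = (dQ/dt) / (A · C0), and the efflux ratio compares the two transport directions. A minimal helper sketch, with invented example numbers (the chapter's own protocol values are not reproduced here):

    ```python
    def papp(dq_dt_ug_per_s, area_cm2, c0_ug_per_ml):
        """Apparent permeability in cm/s (1 mL == 1 cm^3, so units work out)."""
        return dq_dt_ug_per_s / (area_cm2 * c0_ug_per_ml)

    def efflux_ratio(papp_b_to_a, papp_a_to_b):
        """Ratios well above 1 suggest active efflux (e.g., P-gp substrates)."""
        return papp_b_to_a / papp_a_to_b

    # Hypothetical fluxes for a 1.12 cm^2 Transwell insert, 10 ug/mL dose.
    a_to_b = papp(2.0e-4, 1.12, 10.0)   # apical-to-basolateral
    b_to_a = papp(8.0e-4, 1.12, 10.0)   # basolateral-to-apical
    ratio = efflux_ratio(b_to_a, a_to_b)
    ```

    Low postassay recovery, mentioned above, biases dQ/dt downward, which is why recovery checks matter before interpreting Papp.
    
    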

  3. Does unbelted safety requirement affect protection for belted occupants?

    PubMed

    Hu, Jingwen; Klinich, Kathleen D; Manary, Miriam A; Flannagan, Carol A C; Narayanaswamy, Prabha; Reed, Matthew P; Andreen, Margaret; Neal, Mark; Lin, Chin-Hsu

    2017-05-29

    Federal regulations in the United States require vehicles to meet occupant performance requirements with unbelted test dummies. Removing the test requirements with unbelted occupants might encourage the deployment of seat belt interlocks and allow restraint optimization to focus on belted occupants. The objective of this study is to compare the performance of restraint systems optimized for belted-only occupants with those optimized for both belted and unbelted occupants using computer simulations and field crash data analyses. In this study, 2 validated finite element (FE) vehicle/occupant models (a midsize sedan and a midsize SUV) were selected. Restraint design optimizations under standardized crash conditions (U.S.-NCAP and FMVSS 208) with and without unbelted requirements were conducted using Hybrid III (HIII) small female and midsize male anthropomorphic test devices (ATDs) in both vehicles on both driver and right front passenger positions. A total of 10 to 12 design parameters were varied in each optimization using a combination of response surface method (RSM) and genetic algorithm. To evaluate the field performance of restraints optimized with and without unbelted requirements, 55 frontal crash conditions covering a greater variety of crash types than those in the standardized crashes were selected. A total of 1,760 FE simulations were conducted for the field performance evaluation. Frontal crashes in the NASS-CDS database from 2002 to 2012 were used to develop injury risk curves and to provide the baseline performance of current restraint system and estimate the injury risk change by removing the unbelted requirement. Unbelted requirements do not affect the optimal seat belt and airbag design parameters in 3 out of 4 vehicle/occupant position conditions, except for the SUV passenger side. Overall, compared to the optimal designs with unbelted requirements, optimal designs without unbelted requirements generated the same or lower total injury risks for belted occupants depending on statistical methods used for the analysis, but they could also increase the total injury risks for unbelted occupants. This study demonstrated potential for reducing injury risks to belted occupants if the unbelted requirements are eliminated. Further investigations are necessary to confirm these findings.

  4. Optimism, Cynical Hostility, Falls, and Fractures: The Women's Health Initiative Observational Study (WHI-OS).

    PubMed

    Cauley, Jane A; Smagula, Stephen F; Hovey, Kathleen M; Wactawski-Wende, Jean; Andrews, Christopher A; Crandall, Carolyn J; LeBoff, Meryl S; Li, Wenjun; Coday, Mace; Sattari, Maryam; Tindle, Hilary A

    2017-02-01

    Traits of optimism and cynical hostility are features of personality that could influence the risk of falls and fractures by influencing risk-taking behaviors, health behaviors, or inflammation. To test the hypothesis that personality influences falls and fracture risk, we studied 87,342 women enrolled in WHI-OS. Optimism was assessed with the Life Orientation Test-Revised and cynical hostility with the cynicism subscale of the Cook-Medley questionnaire. Higher scores indicate greater optimism and hostility. Optimism and hostility were correlated at r = -0.31, p < 0.001. Annual self-report of falling ≥2 times in the past year was modeled using repeated measures logistic regression. Cox proportional hazards models were used for the fracture outcomes. We examined the risk of falls and fractures across the quartiles (Q) of optimism and hostility with tests for trends; Q1 formed the referent group. The average follow-up for fractures was 11.4 years and for falls was 7.6 years. In multivariable (MV)-adjusted models, women with the highest optimism scores (Q4) were 11% less likely to report ≥2 falls in the past year (odds ratio [OR] = 0.89; 95% confidence intervals [CI] 0.85-0.90). Women in Q4 for hostility had a 12% higher risk of ≥2 falls (OR = 1.12; 95% CI 1.07-1.17). Higher optimism scores were also associated with a 10% lower risk of fractures, but this association was attenuated in MV models. Women with the greatest hostility (Q4) had a modest increased risk of any fracture (MV-adjusted hazard ratio = 1.05; 95% CI 1.01-1.09), but there was no association with specific fracture sites. In conclusion, optimism was independently associated with a decreased risk of ≥2 falls, and hostility with an increased risk of ≥2 falls, independent of traditional risk factors. The magnitude of the association was similar to aging 5 years. Whether interventions aimed at attitudes could reduce fall risks remains to be determined. © 2016 American Society for Bone and Mineral Research.

  5. Optimization of storage tank locations in an urban stormwater drainage system using a two-stage approach.

    PubMed

    Wang, Mingming; Sun, Yuanxiang; Sweetapple, Chris

    2017-12-15

    Storage is important for flood mitigation and non-point source pollution control. However, seeking a cost-effective design scheme for storage tanks is very complex. This paper presents a two-stage optimization framework to find an optimal scheme for storage tanks using the Storm Water Management Model (SWMM). The objectives are to minimize flooding, total suspended solids (TSS) load, and storage cost. The framework includes two modules: (i) the analytical module, which evaluates and ranks the flooding nodes with the analytic hierarchy process (AHP) using two indicators (flood depth and flood duration), and then obtains a preliminary scheme by calculating two efficiency indicators (flood reduction efficiency and TSS reduction efficiency); (ii) the iteration module, which obtains an optimal scheme using a generalized pattern search (GPS) method starting from the preliminary scheme generated by the analytical module. The proposed approach was applied to a catchment in CZ city, China, to test its capability in choosing design alternatives. Different rainfall scenarios were considered to test its robustness. The results demonstrate that the optimization framework is feasible, and that starting from the preliminary scheme makes the optimization fast. The optimized scheme outperforms the preliminary scheme in reducing runoff and pollutant loads for a given storage cost. The multi-objective optimization framework presented in this paper may be useful in finding the best scheme of storage tanks or low impact development (LID) controls. Copyright © 2017 Elsevier Ltd. All rights reserved.
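    The generalized pattern search used in the iteration module polls a fixed set of directions around the incumbent solution and shrinks the mesh when no poll point improves. A minimal stand-alone sketch on an analytic objective (the actual objective couples SWMM flooding, TSS load, and cost, which is not reproduced here):

    ```python
    def pattern_search(f, x, step=1.0, tol=1e-6):
        """Coordinate-direction generalized pattern search (minimization)."""
        fx = f(x)
        while step > tol:
            improved = False
            for d in range(len(x)):
                for s in (step, -step):
                    trial = x[:]
                    trial[d] += s
                    ft = f(trial)
                    if ft < fx:               # accept an improving poll point
                        x, fx, improved = trial, ft, True
            if not improved:
                step /= 2                     # shrink the mesh and re-poll
        return x, fx

    # Toy stand-in objective: distance to a known optimum at (3, -1).
    obj = lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2
    best, best_val = pattern_search(obj, [0.0, 0.0])
    ```

    Because GPS needs only function values, each evaluation can be a full SWMM run, which is what makes a good preliminary starting scheme so valuable.
    
    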

  6. Development of a codon optimization strategy using the efor RED reporter gene as a test case

    NASA Astrophysics Data System (ADS)

    Yip, Chee-Hoo; Yarkoni, Orr; Ajioka, James; Wan, Kiew-Lian; Nathan, Sheila

    2018-04-01

    Synthetic biology is a platform that enables high-level synthesis of useful products such as pharmaceutically relevant drugs, bioplastics and green fuels from synthetic DNA constructs. Large-scale expression of these products can be achieved in an industrially compliant host such as Escherichia coli. To maximise the production of recombinant proteins in a heterologous host, the genes of interest are usually codon optimized based on the codon usage of the host. However, the bioinformatics freeware available for standard codon optimization might not be ideal for determining the best sequence for the synthesis of synthetic DNA. Synthesis of incorrect sequences can prove to be a costly error; to avoid this, a codon optimization strategy was developed based on E. coli codon usage, using the efor RED reporter gene as a test case. This strategy replaces codons encoding serine, leucine, proline and threonine with the most frequently used codons in E. coli. Furthermore, codons encoding valine and glycine are substituted with the second most highly used codons in E. coli. Both the optimized and original efor RED genes were ligated to the pJS209 plasmid backbone using Gibson Assembly, and the recombinant DNAs were transformed into the E. coli E. cloni 10G strain. The fluorescence intensity per cell density of the optimized sequence was improved by 20% compared to the original sequence. Hence, the developed codon optimization strategy is proposed for designing an optimal sequence for heterologous protein production in E. coli.
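    The replacement rule described above is essentially a table lookup over codons. A minimal sketch with an abbreviated codon table follows; a real implementation would cover all 64 codons, and the preferred-codon choices here are assumptions about E. coli usage, not values taken from the paper:

    ```python
    CODON_TO_AA = {  # subset of the standard genetic code
        "TCT": "S", "TCC": "S", "TCA": "S", "TCG": "S", "AGT": "S", "AGC": "S",
        "CTT": "L", "CTC": "L", "CTA": "L", "CTG": "L", "TTA": "L", "TTG": "L",
        "CCT": "P", "CCC": "P", "CCA": "P", "CCG": "P",
        "ACT": "T", "ACC": "T", "ACA": "T", "ACG": "T",
    }
    # Assumed high-usage E. coli codons for the four targeted amino acids.
    PREFERRED = {"S": "AGC", "L": "CTG", "P": "CCG", "T": "ACC"}

    def optimize_codons(seq):
        """Replace Ser/Leu/Pro/Thr codons with a preferred synonym; keep the rest."""
        out = []
        for i in range(0, len(seq) - len(seq) % 3, 3):
            codon = seq[i:i + 3]
            aa = CODON_TO_AA.get(codon)
            out.append(PREFERRED.get(aa, codon))
        return "".join(out)
    ```

    Because every substitution is synonymous, the encoded protein is unchanged; only the predicted translation efficiency in the host differs.
    
    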

  7. Optimization of capillary zone electrophoresis for charge heterogeneity testing of biopharmaceuticals using enhanced method development principles.

    PubMed

    Moritz, Bernd; Locatelli, Valentina; Niess, Michele; Bathke, Andrea; Kiessig, Steffen; Entler, Barbara; Finkler, Christof; Wegele, Harald; Stracke, Jan

    2017-12-01

    CZE is a well-established technique for charge heterogeneity testing of biopharmaceuticals. It is based on differences between the ratios of net charge and hydrodynamic radius. In an extensive intercompany study, it was recently shown that CZE is very robust and can easily be implemented in labs that have not performed it before. However, individual characteristics of some examined proteins resulted in suboptimal resolution. Therefore, enhanced method development principles were applied here to investigate possibilities for further method optimization. For this purpose, a large number of method parameters was evaluated with the aim of improving CZE separation. For the relevant parameters, design of experiments (DoE) models were generated and optimized in several ways for different sets of responses, such as resolution, peak width and number of peaks. Although the DoE optimization was product-specific, the resulting combination of optimized parameters yielded significantly improved separation for 13 out of 16 different antibodies and other molecule formats. These results clearly demonstrate the generic applicability of the optimized CZE method. Adaptation to individual molecular properties may sometimes still be required to achieve optimal separation, but the set screws discussed in this study [mainly pH, identity of the polymer additive (HPC versus HPMC) and the concentrations of additives like acetonitrile, butanolamine and TETA] are expected to significantly reduce the effort for such specific optimization. © 2017 The Authors. Electrophoresis published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Convergent evolution of vascular optimization in kelp (Laminariales).

    PubMed

    Drobnitch, Sarah Tepler; Jensen, Kaare H; Prentice, Paige; Pittermann, Jarmila

    2015-10-07

    Terrestrial plants and mammals, although separated by a great evolutionary distance, have each arrived at a highly conserved body plan in which universal allometric scaling relationships govern the anatomy of vascular networks and key functional metabolic traits. The universality of allometric scaling suggests that these phyla have each evolved an 'optimal' transport strategy that has been overwhelmingly adopted by extant species. To truly evaluate the dominance and universality of vascular optimization, however, it is critical to examine other, lesser-known, vascularized phyla. The brown algae (Phaeophyceae) are one such group--as distantly related to plants as mammals, they have convergently evolved a plant-like body plan and a specialized phloem-like transport network. To evaluate possible scaling and optimization in the kelp vascular system, we developed a model of optimized transport anatomy and tested it with measurements of the giant kelp, Macrocystis pyrifera, which is among the largest and most successful of macroalgae. We also evaluated three classical allometric relationships pertaining to plant vascular tissues with a diverse sampling of kelp species. Macrocystis pyrifera displays strong scaling relationships between all tested vascular parameters and agrees with our model; other species within the Laminariales display weak or inconsistent vascular allometries. The lack of universal scaling in the kelps and the presence of optimized transport anatomy in M. pyrifera raise important questions about the evolution of optimization and the possible competitive advantage conferred by optimized vascular systems to multicellular phyla. © 2015 The Author(s).

  9. Energy Optimization for a Weak Hybrid Power System of an Automobile Exhaust Thermoelectric Generator

    NASA Astrophysics Data System (ADS)

    Fang, Wei; Quan, Shuhai; Xie, Changjun; Tang, Xinfeng; Ran, Bin; Jiao, Yatian

    2017-11-01

    An integrated starter generator (ISG)-type hybrid electric vehicle (HEV) scheme is proposed based on the automobile exhaust thermoelectric generator (AETEG). An eddy current dynamometer is used to simulate the vehicle's dynamic cycle, and a weak ISG hybrid bench test system is constructed to test the 48 V output from the power supply system, which is based on engine exhaust heat power generation. The thermoelectric generation system must ultimately be tested integrated into the ISG weak hybrid power system. The test process is divided into two steps: comprehensive simulation and vehicle-based testing. The system's dynamic process is simulated for both conventional and thermoelectric power, and the dynamic running process comprises four stages: starting, acceleration, cruising, and braking. The fuel quantity and battery pack energy, which serve as target vehicle energy functions for comparison with conventional systems, are combined into a single energy target function, and the battery pack's output current is used as the control variable in the thermoelectric hybrid energy optimization model. The optimal battery pack output current function is then solved over the system's dynamic operating process as part of the hybrid thermoelectric power generation system. In the experiments, the system bench is tested using conventional power and hybrid thermoelectric power for the four dynamic operation stages. The optimal battery pack current curve is calculated by functional analysis. In the vehicle, a power control unit controls the battery pack's output current to minimize energy consumption. Data analysis shows that the fuel economy of the hybrid power system under European Driving Cycle conditions is improved by 14.7% compared with conventional systems.

  10. Design, Optimization, and Evaluation of Integrally-Stiffened Al-2139 Panel with Curved Stiffeners

    NASA Technical Reports Server (NTRS)

    Havens, David; Shiyekar, Sandeep; Norris, Ashley; Bird, R. Keith; Kapania, Rakesh K.; Olliffe, Robert

    2011-01-01

    A curvilinear stiffened panel was designed, manufactured, and tested in the Combined Load Test Fixture at NASA Langley Research Center. The panel is representative of a large wing engine pylon rib and was optimized for minimum mass subjected to three combined load cases. The optimization included constraints on web buckling, material yielding, crippling or local stiffener failure, and damage tolerance using a new analysis tool named EBF3PanelOpt. Testing was performed for the critical combined compression-shear loading configuration. The panel was loaded beyond initial buckling, and strains and out-of-plane displacements were extracted from a total of 20 strain gages and 6 linear variable displacement transducers. The VIC-3D system was utilized to obtain full field displacements/strains in the stiffened side of the panel. The experimental data were compared with the strains and out-of-plane deflections from a high fidelity nonlinear finite element analysis. The experimental data were also compared with linear elastic finite element results of the panel/test-fixture assembly. Overall, the panel buckled very near to the predicted load in the web regions.

  11. The Long Exercise Test in Periodic Paralysis: A Bayesian Analysis.

    PubMed

    Simmons, Daniel B; Lanning, Julie; Cleland, James C; Puwanant, Araya; Twydell, Paul T; Griggs, Robert C; Tawil, Rabi; Logigian, Eric L

    2018-05-12

    The long exercise test (LET) is used to assess the diagnosis of periodic paralysis (PP), but LET methodology and normal "cut-off" values vary. To determine optimal LET methodology and cut-offs, we reviewed LET data (abductor digiti minimi [ADM] motor response amplitude and area) from 55 PP patients (32 genetically definite) and 125 controls. Receiver operating characteristic (ROC) curves were constructed and the area under the curve (AUC) calculated to compare (1) peak-to-nadir versus baseline-to-nadir methodologies, and (2) amplitude versus area decrements. Using Bayesian principles, optimal "cut-off" decrements that achieved 95% post-test probability of PP were calculated for various pre-test probabilities (PreTPs). AUC was highest for the peak-to-nadir methodology and equal for amplitude and area decrements. For PreTP ≤50%, optimal decrement cut-offs (peak-to-nadir) were >40% (amplitude) or >50% (area). For confirmation of PP, our data endorse the diagnostic utility of the peak-to-nadir LET methodology using 40% amplitude or 50% area decrement cut-offs for PreTPs ≤50%. This article is protected by copyright. All rights reserved. © 2018 Wiley Periodicals, Inc.
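
The peak-to-nadir decrement and the Bayesian conversion from pre-test to post-test probability used above have simple closed forms. A minimal sketch (not the authors' code; the sensitivity and specificity values in the example are hypothetical illustrations, not figures from the study):

```python
# Illustrative sketch: peak-to-nadir decrement and Bayesian
# post-test probability for a positive long exercise test (LET).

def decrement(peak, nadir):
    """Percent decrement from peak to nadir motor response amplitude/area."""
    return 100.0 * (peak - nadir) / peak

def post_test_probability(pre_test_p, sensitivity, specificity):
    """Bayes' rule via odds: post-test probability after a positive test."""
    pre_odds = pre_test_p / (1.0 - pre_test_p)
    lr_positive = sensitivity / (1.0 - specificity)  # positive likelihood ratio
    post_odds = pre_odds * lr_positive
    return post_odds / (1.0 + post_odds)

# A 10 mV peak falling to 5.5 mV is a 45% decrement, exceeding the
# >40% amplitude cut-off.
print(decrement(10.0, 5.5))  # 45.0
# Hypothetical operating point: sensitivity 0.70, specificity 0.98,
# pre-test probability 0.5.
print(round(post_test_probability(0.5, 0.70, 0.98), 3))  # 0.972
```

This illustrates why a high-specificity cut-off can push a 50% pre-test probability above the 95% post-test threshold used in the study.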

  12. RJMCMC based Text Placement to Optimize Label Placement and Quantity

    NASA Astrophysics Data System (ADS)

    Touya, Guillaume; Chassin, Thibaud

    2018-05-01

    Label placement is a tedious task in map design, and its automation has long been a goal for researchers in cartography, but also in computational geometry. Methods that search for an optimal or nearly optimal solution satisfying a set of constraints, such as avoiding label overlap, have been proposed in the literature. Most of these methods focus on finding the optimal positions for a given set of labels, but rarely allow the removal of labels as part of the optimization. This paper proposes to apply an optimization technique called Reversible-Jump Markov Chain Monte Carlo, which makes it easy to model the removal or addition of labels during the optimization iterations. The method, still preliminary, is tested on a real dataset, and the first results are encouraging.

  13. Particle swarm optimization: an alternative in marine propeller optimization?

    NASA Astrophysics Data System (ADS)

    Vesting, F.; Bensow, R. E.

    2018-01-01

    This article deals with improving and evaluating the performance of two evolutionary algorithm approaches for automated engineering design optimization. Here a marine propeller design with constraints on cavitation nuisance is the intended application. For this purpose, the particle swarm optimization (PSO) algorithm is adapted for multi-objective optimization and constraint handling for use in propeller design. Three PSO algorithms are developed and tested for the optimization of four commercial propeller designs for different ship types. The results are evaluated by interrogating the generation medians and the Pareto front development. The same propellers are also optimized using the well-established NSGA-II genetic algorithm to provide benchmark results. The authors' PSO algorithms deliver results comparable to NSGA-II, but converge earlier and enhance the solution in terms of constraint violations.

  14. In-flight performance optimization for rotorcraft with redundant controls

    NASA Astrophysics Data System (ADS)

    Ozdemir, Gurbuz Taha

    A conventional helicopter has performance limits at high speed because of main rotor limitations, such as compressibility effects on the advancing side and stall on the retreating side. Auxiliary lift and thrust components have been suggested to improve helicopter performance substantially by reducing the loading on the main rotor; such a configuration is called a compound rotorcraft. Rotor speed can also be varied to improve helicopter performance. In addition to improved performance, a compound configuration and variable RPM can provide a much larger degree of control redundancy, giving the opportunity to further enhance performance and handling qualities. A flight control system is designed to perform in-flight optimization of redundant control effectors on a compound rotorcraft in order to minimize power required and extend range. This "Fly to Optimal" (FTO) control law is tested in simulation using the GENHEL model. Models of the UH-60, a compound version of the UH-60A with lifting wing and vectored thrust ducted propeller (VTDP), and a generic compound version of the UH-60A with lifting wing and propeller were developed and tested in simulation. A model-following dynamic inversion controller is implemented for inner-loop control of roll, pitch, yaw, heave, and rotor RPM. An outer-loop controller regulates airspeed and flight path during optimization. A Golden Section search method was used to find the optimal rotor RPM on a conventional helicopter, where the single redundant control effector is rotor RPM. The FTO builds on the Adaptive Performance Optimization (APO) method of Gilyard, which performs low-frequency sweeps on a redundant control for a fixed-wing aircraft. A method based on APO was used to optimize trim on a compound rotorcraft with several redundant control effectors. The controller can be used to optimize rotor RPM and compound control effectors through flight test or simulation in order to establish a schedule. The method has been expanded to search a two-dimensional control space. Simulation results demonstrate the ability to maximize range by optimizing stabilator deflection and an airspeed set point. Another set of results minimizes power required in high-speed flight by optimizing collective pitch and stabilator deflection. Results show that the control laws effectively hold the flight condition while the FTO method is effective at improving performance. Optimizations showed that issues can arise when the control laws regulating altitude push the collective control toward its limits, so the control law was modified to regulate airspeed and altitude using propeller pitch and angle of attack while the collective is held fixed or used as an optimization variable. A dynamic trim limit avoidance algorithm is applied to avoid control saturation in other axes during optimization maneuvers. Range and power optimization FTO simulations are compared with comprehensive sweeps of trim solutions, and FTO optimization is shown to be effective and reliable in reaching an optimum when optimizing up to two redundant controls. Use of redundant controls is shown to be beneficial for improving performance. The search method takes almost 25 minutes of simulated flight to complete. The optimization maneuver itself can sometimes drive the power required to high values, so a power limit is imposed to restrict the search and avoid conditions where power is more than 5% higher than that of the initial trim state. With this modification, the time the optimization maneuver takes to complete is reduced to 21 minutes without any significant change in the optimal power value.
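
The Golden Section search mentioned above is a standard bracketing method for minimizing a one-dimensional unimodal function. A minimal sketch, with a hypothetical power-versus-RPM curve standing in for the GENHEL rotorcraft model:

```python
# Golden-section search sketch for finding a power-optimal rotor RPM.
# The quadratic power model below is a hypothetical stand-in, not the
# GENHEL model used in the abstract.
import math

def golden_section_min(f, lo, hi, tol=1e-5):
    """Locate the minimizer of a unimodal function f on [lo, hi]."""
    invphi = (math.sqrt(5.0) - 1.0) / 2.0  # 1/phi, about 0.618
    a, b = lo, hi
    c = b - invphi * (b - a)  # interior probe points
    d = a + invphi * (b - a)
    while abs(b - a) > tol:
        if f(c) < f(d):
            # Minimum lies in [a, d]; reuse c as the new d-side probe.
            b, d = d, c
            c = b - invphi * (b - a)
        else:
            # Minimum lies in [c, b].
            a, c = c, d
            d = a + invphi * (b - a)
    return 0.5 * (a + b)

# Hypothetical power-required curve with its minimum at 95% rotor RPM.
power = lambda rpm: 1200.0 + 0.8 * (rpm - 95.0) ** 2
print(round(golden_section_min(power, 80.0, 110.0), 2))  # 95.0
```

Each iteration shrinks the bracket by the golden ratio while reusing one prior function evaluation, which is why the method suits expensive trim evaluations like those in the FTO search.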

  15. Optimizing Multiple-Choice Tests as Learning Events

    ERIC Educational Resources Information Center

    Little, Jeri Lynn

    2011-01-01

    Although generally used for assessment, tests can also serve as tools for learning--but different test formats may not be equally beneficial. Specifically, research has shown multiple-choice tests to be less effective than cued-recall tests in improving the later retention of the tested information (e.g., see meta-analysis by Hamaker, 1986),…

  16. Optimization in Ecology

    ERIC Educational Resources Information Center

    Cody, Martin L.

    1974-01-01

    Discusses the optimality of natural selection, ways of testing for optimum solutions to problems of time- or energy-allocation in nature, optimum patterns in spatial distribution and diet breadth, and how best to travel over a feeding area so that food intake is maximized. (JR)

  17. Design Optimization and Analysis of a Composite Honeycomb Intertank

    NASA Technical Reports Server (NTRS)

    Finckenor, Jeffrey; Spurrier, Mike

    1998-01-01

    Intertanks, the structure between tanks of launch vehicles, are prime candidates for weight reduction of rockets. This paper discusses the optimization and detailed analysis of a 96 in (2.44 m) diameter, 77 in (1.85 m) tall intertank. The structure has composite face sheets and an aluminum honeycomb core. The ends taper to a thick built-up laminate for a double-lap bolted shear joint. It is made in 8 full-length panels joined with bonded double-lap joints. The nominal load is 4000 lb/in (7 x 10(exp 5) N/m). Optimization is by a Genetic Algorithm and minimizes weight by varying core thickness, the number and orientation of acreage and buildup plies, and the size, number, and spacing of bolts. A variety of cases were run with populations up to 2000 and chromosomes as long as 150 bits. Constraints were buckling, face stresses (normal, shear, wrinkling, and dimpling), bolt stress, and bolt hole stresses (bearing, net tension, wedge splitting, shear out, and tension/shear out). Analysis is by a combination of theoretical solutions and empirical data. After optimization, a series of coupon tests were performed in conjunction with a rigorous analysis involving a variety of finite element models. The analysis and tests resulted in several small changes to the optimized design. The intertank has undergone a 250,000 lb (1.1 x 10(exp 6) N) limit load test and been mated with a composite liquid hydrogen tank. The tank/intertank unit is being installed in a test stand where it will see 200 thermal/load cycles. Afterwards the intertank will be demated and loaded in compression to failure.

  18. Optimization of a simplified automobile finite element model using time varying injury metrics.

    PubMed

    Gaewsky, James P; Danelson, Kerry A; Weaver, Caitlin M; Stitzel, Joel D

    2014-01-01

    In 2011, frontal crashes resulted in 55% of passenger car injuries, with 10,277 fatalities and 866,000 injuries in the United States. To better understand frontal crash injury mechanisms, human body finite element models (FEMs) can be used to reconstruct Crash Injury Research and Engineering Network (CIREN) cases. A limitation of this method is the paucity of vehicle FEMs; therefore, we developed a functionally equivalent simplified vehicle model. The New Car Assessment Program (NCAP) data for our selected vehicle were from a frontal collision with a Hybrid III (H3) Anthropomorphic Test Device (ATD) occupant. From NCAP test reports, the vehicle geometry was created and the H3 ATD was positioned. The material and component properties optimized using a variation study process were: steering column shear bolt fracture force and stroke resistance, seatbelt pretensioner force, frontal and knee bolster airbag stiffness, and belt friction through the D-ring. These parameters were varied using three successive Latin Hypercube Designs of Experiments with 130-200 simulations each. The H3 injury response was compared to the reported NCAP frontal test results for the head, chest, and pelvis accelerations and the seat belt and femur forces. The phase, magnitude, and comprehensive error factors from a Sprague and Geers analysis were calculated for each injury metric and then combined to determine the simulations that best matched the crash test. Sprague and Geers analyses typically yield error factors ranging from 0 to 1, with lower scores indicating a better match. The total body injury response error factor for the most optimized simulation from each round of the variation study decreased from 0.466 to 0.395 to 0.360. This procedure for optimizing vehicle FEMs is a valuable tool for future CIREN case reconstructions in a variety of vehicles.
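
The Sprague and Geers error factors referenced above have standard definitions: a magnitude factor, a phase factor, and their root-sum-square combination. A sketch under the assumption of uniformly sampled, equal-length signals (illustrative only, not the authors' implementation):

```python
# Sprague & Geers magnitude (M), phase (P), and comprehensive (C)
# error factors between a benchmark signal (e.g., crash test) and a
# candidate signal (e.g., simulation), sampled on the same time grid.
import math

def sprague_geers(benchmark, candidate):
    n = len(benchmark)
    mm = sum(x * x for x in benchmark) / n   # benchmark auto-term
    cc = sum(x * x for x in candidate) / n   # candidate auto-term
    mc = sum(x * y for x, y in zip(benchmark, candidate)) / n  # cross-term
    magnitude = math.sqrt(cc / mm) - 1.0
    # Clamp the cosine argument to guard against round-off just above 1.
    ratio = max(-1.0, min(1.0, mc / math.sqrt(mm * cc)))
    phase = math.acos(ratio) / math.pi
    comprehensive = math.sqrt(magnitude ** 2 + phase ** 2)
    return magnitude, phase, comprehensive

# Identical signals give zero error in all three factors.
t = [i * 0.01 for i in range(100)]
sig = [math.sin(2.0 * math.pi * x) for x in t]
m, p, c = sprague_geers(sig, sig)
print(round(m, 6), round(p, 6), round(c, 6))  # 0.0 0.0 0.0
```

The magnitude factor responds only to amplitude scaling and the phase factor only to time shifts, which is why the study combines both into a single comprehensive score per injury metric.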

  19. Modified Shuffled Frog Leaping Optimization Algorithm Based Distributed Generation Rescheduling for Loss Minimization

    NASA Astrophysics Data System (ADS)

    Arya, L. D.; Koshti, Atul

    2018-05-01

    This paper investigates the optimization of Distributed Generation (DG) capacity at locations selected using an incremental voltage sensitivity criterion for a sub-transmission network. The Modified Shuffled Frog Leaping Algorithm (MSFLA) has been used to optimize the DG capacity. An induction generator model of DG (wind-based generating units) has been considered for the study. The standard IEEE 30-bus test system has been used. The obtained results are also validated against the shuffled frog leaping algorithm and a modified version of bare-bones particle swarm optimization (BBExp). MSFLA was found to be more efficient than the other two algorithms for the real power loss minimization problem.

  20. Optimal Congestion Management in Electricity Market Using Particle Swarm Optimization with Time Varying Acceleration Coefficients

    NASA Astrophysics Data System (ADS)

    Boonyaritdachochai, Panida; Boonchuay, Chanwit; Ongsakul, Weerakorn

    2010-06-01

    This paper proposes an optimal power redispatching approach for congestion management in deregulated electricity market. Generator sensitivity is considered to indicate the redispatched generators. It can reduce the number of participating generators. The power adjustment cost and total redispatched power are minimized by particle swarm optimization with time varying acceleration coefficients (PSO-TVAC). The IEEE 30-bus and IEEE 118-bus systems are used to illustrate the proposed approach. Test results show that the proposed optimization scheme provides the lowest adjustment cost and redispatched power compared to the other schemes. The proposed approach is useful for the system operator to manage the transmission congestion.
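
The time-varying acceleration coefficients that give PSO-TVAC its name are conventionally ramped linearly over the iterations. A sketch using the boundary values common in the PSO-TVAC literature (the paper's exact settings are not stated in the abstract, so these numbers are assumptions):

```python
# Sketch of the TVAC schedule that distinguishes PSO-TVAC from plain
# PSO: the cognitive coefficient c1 decays while the social
# coefficient c2 grows, shifting the swarm from exploration to
# convergence. Boundary values 2.5/0.5 are the common literature choice.

def tvac_coefficients(it, max_it, c1_i=2.5, c1_f=0.5, c2_i=0.5, c2_f=2.5):
    """Linearly ramp the cognitive (c1) and social (c2) coefficients."""
    frac = it / max_it
    c1 = (c1_f - c1_i) * frac + c1_i  # large early: explore personal bests
    c2 = (c2_f - c2_i) * frac + c2_i  # large late: converge on global best
    return c1, c2

print(tvac_coefficients(0, 100))    # (2.5, 0.5)
print(tvac_coefficients(100, 100))  # (0.5, 2.5)
```

These coefficients then multiply the personal-best and global-best terms in the usual PSO velocity update.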

  1. Multiobjective Aerodynamic Shape Optimization Using Pareto Differential Evolution and Generalized Response Surface Metamodels

    NASA Technical Reports Server (NTRS)

    Madavan, Nateri K.

    2004-01-01

    Differential Evolution (DE) is a simple, fast, and robust evolutionary algorithm that has proven effective in determining the global optimum for several difficult single-objective optimization problems. The DE algorithm has recently been extended to multiobjective optimization problems by using a Pareto-based approach. In this paper, a Pareto DE algorithm is applied to multiobjective aerodynamic shape optimization problems that are characterized by computationally expensive objective function evaluations. To reduce the computational expense, the algorithm is coupled with generalized response surface metamodels based on artificial neural networks. Results are presented for some test optimization problems from the literature to demonstrate the capabilities of the method.

  2. Optimizing Sensor and Actuator Arrays for ASAC Noise Control

    NASA Technical Reports Server (NTRS)

    Palumbo, Dan; Cabell, Ran

    2000-01-01

    This paper summarizes the development of an approach to optimizing the locations for arrays of sensors and actuators in active noise control systems. A type of directed combinatorial search, called Tabu Search, is used to select an optimal configuration from a much larger set of candidate locations. The benefit of using an optimized set is demonstrated. The importance of limiting actuator forces to realistic levels when evaluating the cost function is discussed. Results of flight testing an optimized system are presented. Although the technique has been applied primarily to Active Structural Acoustic Control systems, it can be adapted for use in other active noise control implementations.

  3. Optimizing DER Participation in Inertial and Primary-Frequency Response

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dall-Anese, Emiliano; Zhao, Changhong; Guggilam, Swaroop

    This paper develops an approach to enable the optimal participation of distributed energy resources (DERs) in inertial and primary-frequency response alongside conventional synchronous generators. Leveraging a reduced-order model description of frequency dynamics, DERs' synthetic inertias and droop coefficients are designed to meet time-domain performance objectives of frequency overshoot and steady-state regulation. Furthermore, an optimization-based method centered around classical economic dispatch is developed to ensure that DERs share the power injections for inertial- and primary-frequency response in proportion to their power ratings. Simulations for a modified New England test-case system composed of ten synchronous generators and six instances of the IEEE 37-node test feeder with frequency-responsive DERs validate the design strategy.

  4. Comparing Evolutionary Programs and Evolutionary Pattern Search Algorithms: A Drug Docking Application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, W.E.

    1999-02-10

    Evolutionary programs (EPs) and evolutionary pattern search algorithms (EPSAs) are two general classes of evolutionary methods for optimizing on continuous domains. The relative performance of these methods has been evaluated on standard global optimization test functions, and these results suggest that EPSAs converge to near-optimal solutions more robustly than EPs. In this paper we evaluate the relative performance of EPSAs and EPs on a real-world application: flexible ligand binding in the Autodock docking software. We compare the performance of these methods on a suite of docking test problems. Our results confirm that EPSAs and EPs have comparable performance, and they suggest that EPSAs may be more robust on larger, more complex problems.

  5. Multilevel geometry optimization

    NASA Astrophysics Data System (ADS)

    Rodgers, Jocelyn M.; Fast, Patton L.; Truhlar, Donald G.

    2000-02-01

    Geometry optimization has been carried out for three test molecules using six multilevel electronic structure methods, in particular Gaussian-2, Gaussian-3, multicoefficient G2, multicoefficient G3, and two multicoefficient correlation methods based on correlation-consistent basis sets. In the Gaussian-2 and Gaussian-3 methods, various levels are added and subtracted with unit coefficients, whereas the multicoefficient Gaussian-x methods involve noninteger parameters as coefficients. The multilevel optimizations drop the average error in the geometry (averaged over the 18 cases) by a factor of about two when compared to the single most expensive component of a given multilevel calculation, and in all 18 cases the accuracy of the atomization energy for the three test molecules improves, with an average improvement of 16.7 kcal/mol.

  6. Short-term cascaded hydroelectric system scheduling based on chaotic particle swarm optimization using improved logistic map

    NASA Astrophysics Data System (ADS)

    He, Yaoyao; Yang, Shanlin; Xu, Qifa

    2013-07-01

    In order to solve the model of short-term cascaded hydroelectric system scheduling, a novel chaotic particle swarm optimization (CPSO) algorithm using an improved logistic map is introduced, which uses the water discharge as the decision variable combined with a death penalty function. Following the principle of maximum power generation, the proposed approach makes use of the ergodicity, symmetry, and stochastic properties of the improved logistic chaotic map to enhance the performance of the particle swarm optimization (PSO) algorithm. The new hybrid method has been examined and tested on two test functions and a practical cascaded hydroelectric system. The experimental results show the effectiveness and robustness of the proposed CPSO algorithm in comparison with other traditional algorithms.
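
The chaotic ingredient of a CPSO can be sketched with the classic logistic map x_{k+1} = mu * x_k * (1 - x_k), which is fully chaotic on (0, 1) for mu = 4; the paper's "improved" variant differs in details the abstract does not give. Here the map is applied to particle initialization, with the water-discharge range chosen purely for illustration:

```python
# Sketch: logistic-map chaotic initialization of PSO particles, in the
# spirit of the CPSO above. The classic map with mu = 4 is used; the
# paper's "improved" logistic map is not specified in the abstract.

def logistic_sequence(x0, n, mu=4.0):
    """Generate n chaotic values in (0, 1) from seed x0 in (0, 1)."""
    seq, x = [], x0
    for _ in range(n):
        x = mu * x * (1.0 - x)
        seq.append(x)
    return seq

def chaotic_init(n_particles, lo, hi, x0=0.37):
    """Map chaotic values onto the decision interval [lo, hi]."""
    return [lo + (hi - lo) * x for x in logistic_sequence(x0, n_particles)]

# Hypothetical water-discharge decision variables over 0-500 m^3/s.
positions = chaotic_init(5, 0.0, 500.0)
print(all(0.0 <= p <= 500.0 for p in positions))  # True
```

Chaotic initialization spreads particles more ergodically over the search interval than a plain uniform draw can guarantee for small swarms, which is the property the abstract appeals to.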

  7. Simultaneous optimization method for absorption spectroscopy postprocessing.

    PubMed

    Simms, Jean M; An, Xinliang; Brittelle, Mack S; Ramesh, Varun; Ghandhi, Jaal B; Sanders, Scott T

    2015-05-10

    A simultaneous optimization method is proposed for absorption spectroscopy postprocessing. This method is particularly useful for thermometry measurements based on congested spectra, as commonly encountered in combustion applications of H2O absorption spectroscopy. A comparison test demonstrated that the simultaneous optimization method had greater accuracy, greater precision, and was more user-independent than the common step-wise postprocessing method previously used by the authors. The simultaneous optimization method was also used to process experimental data from an environmental chamber and a constant volume combustion chamber, producing results with errors on the order of only 1%.

  8. Structural damage identification using an enhanced thermal exchange optimization algorithm

    NASA Astrophysics Data System (ADS)

    Kaveh, A.; Dadras, A.

    2018-03-01

    The recently developed thermal exchange optimization (TEO) algorithm is enhanced and applied to a damage detection problem. An offline parameter tuning approach is utilized to set the internal parameters of the TEO, resulting in the enhanced thermal exchange optimization (ETEO) algorithm. The damage detection problem is defined as an inverse problem, and ETEO is applied to a wide range of structures. Several scenarios with noisy and noise-free modal data are tested, and the locations and extents of damage are identified with good accuracy.

  9. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
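
The D-optimal criterion used above for test planning selects the design points that maximize the determinant of the information matrix X'X of the model matrix. A toy sketch for a two-parameter linear model (a hypothetical illustration, not the tool used in the paper), where the optimum predictably places runs at the interval endpoints:

```python
# Sketch of D-optimal design selection for the straight-line model
# y = b0 + b1*x: choose the subset of candidate settings that
# maximizes det(X'X), where each row of X is [1, x].
import itertools

def det2(m):
    """Determinant of a 2x2 matrix."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def xtx(xs):
    """Information matrix X'X for the model y = b0 + b1*x."""
    n, sx, sxx = len(xs), sum(xs), sum(x * x for x in xs)
    return [[n, sx], [sx, sxx]]

def d_optimal(candidates, n_runs):
    """Exhaustively pick the n_runs settings maximizing det(X'X)."""
    return max(itertools.combinations(candidates, n_runs),
               key=lambda xs: det2(xtx(xs)))

# With two runs, the D-optimal design sits at the factor extremes.
print(d_optimal([-1.0, -0.5, 0.0, 0.5, 1.0], 2))  # (-1.0, 1.0)
```

This is the distributed-testing intuition in miniature: information per run is maximized by spreading points across the factor range rather than clustering them.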

  10. Energy saving by using asymmetric aftbodies for merchant ships-design methodology, numerical simulation and validation

    NASA Astrophysics Data System (ADS)

    Dang, Jie; Chen, Hao

    2016-12-01

    The methodology and procedures are discussed for designing merchant ships to achieve fully integrated and optimized hull-propulsion systems by using asymmetric aftbodies. Computational fluid dynamics (CFD) has been used to evaluate the powering performance through massive calculations with automatic deformation algorithms for the hull forms and the propeller blades. Comparative model tests of the designs against optimized symmetric hull forms have been carried out to verify the efficiency gain. More than 6% improvement in the propulsive efficiency of an oil tanker has been measured during the model tests. Dedicated sea trials show good agreement with the performance predicted from the test results.

  11. Development of fire shutters based on numerical optimizations

    NASA Astrophysics Data System (ADS)

    Novak, Ondrej; Kulhavy, Petr; Martinec, Tomas; Petru, Michal; Srb, Pavel

    2018-06-01

    This article deals with a prototype concept, real experiment, and numerical simulation of a layered industrial fire shutter based on new insulating composite materials. The real fire shutter was developed and optimized in the laboratory and subsequently tested in a certified test room. A simulation of the whole concept was carried out as a non-premixed combustion process in the commercial finite-volume software PyroSim. The combustion model, based on a stoichiometrically defined mixture of gas and the tested layered samples, showed good agreement with the experimental results, i.e., the internal temperature distribution and the heat release rate through the sample.

  12. Extended Pulse-Powered Humidity-Freeze Cycling for Testing Module-Level Power Electronics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hacke, Peter L; Rodriguez, Miguel; Kempe, Michael D

    An EMI suppression capacitor (polypropylene film type) failed by 'popcorning' due to vapor outgassing during pulse-powered humidity-freeze cycles. No shorts or shunts could be detected, despite mildly corroded metallization visible in the failed capacitor. Humidity-freeze cycling is optimized to break into moisture barriers. However, further studies will be required on additional module-level power electronics (MLPE) devices to optimize the stress testing for condensation to precipitate any weakness to short circuiting and other humidity/bias failure modes.

  13. Modeling of Revitalization of Atmospheric Water

    NASA Technical Reports Server (NTRS)

    Coker, Robert; Knox, Jim

    2014-01-01

    The Atmosphere Revitalization Recovery and Environmental Monitoring (ARREM) project was initiated in September of 2011 as part of the Advanced Exploration Systems (AES) program. Under the ARREM project, testing of sub-scale and full-scale systems has been combined with multiphysics computer simulations for evaluation and optimization of subsystem approaches. In particular, this paper describes the testing and modeling of the water desiccant subsystem of the carbon dioxide removal assembly (CDRA). The goal is a full system predictive model of CDRA to guide system optimization and development.

  14. Optimal Appearance Model for Visual Tracking

    PubMed Central

    Wang, Yuru; Jiang, Longkui; Liu, Qiaoyuan; Yin, Minghao

    2016-01-01

    Many studies argue that integrating multiple cues in an adaptive way increases tracking performance. However, what is the definition of adaptiveness and how to realize it remains an open issue. On the premise that the model with optimal discriminative ability is also optimal for tracking the target, this work realizes adaptiveness and robustness through the optimization of multi-cue integration models. Specifically, based on prior knowledge and current observation, a set of discrete samples are generated to approximate the foreground and background distribution. With the goal of optimizing the classification margin, an objective function is defined, and the appearance model is optimized by introducing optimization algorithms. The proposed optimized appearance model framework is embedded into a particle filter for a field test, and it is demonstrated to be robust against various kinds of complex tracking conditions. This model is general and can be easily extended to other parameterized multi-cue models. PMID:26789639

  15. An extension of the directed search domain algorithm to bilevel optimization

    NASA Astrophysics Data System (ADS)

    Wang, Kaiqiang; Utyuzhnikov, Sergey V.

    2017-08-01

    A method is developed for generating a well-distributed Pareto set for the upper level in bilevel multiobjective optimization. The approach is based on the Directed Search Domain (DSD) algorithm, which is a classical approach for generation of a quasi-evenly distributed Pareto set in multiobjective optimization. The approach contains a double-layer optimizer designed in a specific way under the framework of the DSD method. The double-layer optimizer is based on bilevel single-objective optimization and aims to find a unique optimal Pareto solution rather than generate the whole Pareto frontier on the lower level in order to improve the optimization efficiency. The proposed bilevel DSD approach is verified on several test cases, and a relevant comparison against another classical approach is made. It is shown that the approach can generate a quasi-evenly distributed Pareto set for the upper level with relatively low time consumption.

  16. Group Counseling Optimization: A Novel Approach

    NASA Astrophysics Data System (ADS)

    Eita, M. A.; Fahmy, M. M.

    A new population-based search algorithm, which we call Group Counseling Optimizer (GCO), is presented. It mimics the group counseling behavior of humans in solving their problems. The algorithm is tested using seven known benchmark functions: Sphere, Rosenbrock, Griewank, Rastrigin, Ackley, Weierstrass, and Schwefel functions. A comparison is made with the recently published comprehensive learning particle swarm optimizer (CLPSO). The results demonstrate the efficiency and robustness of the proposed algorithm.
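
    The benchmark functions named above have standard closed forms. A minimal sketch of such a test suite (function names and dimensions are ours; Weierstrass and Schwefel are omitted for brevity), which is the kind of battery an optimizer like GCO or CLPSO is scored on:

    ```python
    import math

    def sphere(x):
        # Unimodal; global minimum f(0, ..., 0) = 0.
        return sum(v * v for v in x)

    def rosenbrock(x):
        # Narrow curved valley; global minimum f(1, ..., 1) = 0.
        return sum(100.0 * (x[i + 1] - x[i] ** 2) ** 2 + (1.0 - x[i]) ** 2
                   for i in range(len(x) - 1))

    def rastrigin(x):
        # Highly multimodal; global minimum f(0, ..., 0) = 0.
        return 10.0 * len(x) + sum(v * v - 10.0 * math.cos(2.0 * math.pi * v)
                                   for v in x)

    def ackley(x):
        # Multimodal with a nearly flat outer region; f(0, ..., 0) = 0.
        n = len(x)
        s1 = sum(v * v for v in x) / n
        s2 = sum(math.cos(2.0 * math.pi * v) for v in x) / n
        return -20.0 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20.0 + math.e
    ```

    An optimizer is typically run repeatedly on each function and compared on the best objective value reached per evaluation budget.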

  17. Aero/structural tailoring of engine blades (AERO/STAEBL)

    NASA Technical Reports Server (NTRS)

    Brown, K. W.

    1988-01-01

    This report describes the Aero/Structural Tailoring of Engine Blades (AERO/STAEBL) program, which is a computer code used to perform engine fan and compressor blade aero/structural numerical optimizations. These optimizations seek a blade design of minimum operating cost that satisfies realistic blade design constraints. This report documents the overall program (i.e., input, optimization procedures, approximate analyses) and also provides a detailed description of the validation test cases.

  18. Flight Test of an Adaptive Configuration Optimization System for Transport Aircraft

    NASA Technical Reports Server (NTRS)

    Gilyard, Glenn B.; Georgie, Jennifer; Barnicki, Joseph S.

    1999-01-01

    A NASA Dryden Flight Research Center program explores the practical application of real-time adaptive configuration optimization for enhanced transport performance on an L-1011 aircraft. This approach is based on calculation of incremental drag from forced-response, symmetric, outboard aileron maneuvers. In real-time operation, the symmetric outboard aileron deflection is directly optimized, and the horizontal stabilator and angle of attack are indirectly optimized. A flight experiment has been conducted from an onboard research engineering test station, and flight research results are presented herein. The optimization system has demonstrated the capability of determining the minimum drag configuration of the aircraft in real time. The drag-minimization algorithm is capable of identifying drag to approximately a one-drag-count level. Optimizing the symmetric outboard aileron position realizes a drag reduction of 2-3 drag counts (approximately 1 percent). Algorithm analysis of maneuvers indicates that two-sided raised-cosine maneuvers improve definition of the symmetric outboard aileron drag effect, thereby improving analysis results and consistency. Ramp maneuvers provide a more even distribution of data collection as a function of excitation deflection than raised-cosine maneuvers provide. A commercial operational system would require airdata calculations and normal output of current inertial navigation systems; engine pressure ratio measurements would be optional.

  19. Taguchi experimental design to determine the taste quality characteristic of candied carrot

    NASA Astrophysics Data System (ADS)

    Ekawati, Y.; Hapsari, A. A.

    2018-03-01

    Robust parameter design is used to design a product that is robust to noise factors, so that the product’s performance fits the target and delivers better quality. In the process of designing and developing the innovative product of candied carrot, robust parameter design is carried out using the Taguchi Method. The method is used to determine an optimal quality design, based on the process and the composition of product ingredients that are in accordance with consumer needs and requirements. According to the identification of consumer needs in previous research, the quality dimensions that need to be assessed are the taste and texture of the product; the assessment in this research is limited to the taste dimension. Organoleptic testing is used for this assessment, specifically hedonic testing, which rates products according to consumer preferences. The data processing uses mean and signal-to-noise ratio calculations and optimal level setting to determine the optimal process and composition of product ingredients. The optimal settings are analyzed using confirmation experiments to prove that the proposed product matches consumer needs and requirements. The result of this research is the identification of the factors that affect product taste and the optimal product quality according to the Taguchi Method.
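
    The abstract does not state which Taguchi signal-to-noise form was used; for hedonic taste scores where higher is better, the larger-the-better form is conventional. A minimal sketch under that assumption (function names are ours):

    ```python
    import math

    def sn_larger_the_better(scores):
        # Taguchi larger-the-better S/N ratio in dB:
        #   S/N = -10 * log10( (1/n) * sum(1 / y_i^2) )
        # Higher S/N means higher and more consistent scores.
        n = len(scores)
        return -10.0 * math.log10(sum(1.0 / (y * y) for y in scores) / n)

    def mean(scores):
        # Mean response, used alongside S/N to pick optimal factor levels.
        return sum(scores) / len(scores)
    ```

    For each factor-level combination in the orthogonal array, the mean and S/N of the panel's hedonic scores are computed, and the level with the highest S/N is selected for the confirmation experiment.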

  20. Theoretical model for design and analysis of protectional eyewear.

    PubMed

    Zelzer, B; Speck, A; Langenbucher, A; Eppig, T

    2013-05-01

    Protectional eyewear has to fulfill both mechanical and optical stress tests. To pass those optical tests, the surfaces of safety spectacles have to be optimized to minimize optical aberrations. Starting with the surface data of three measured safety spectacles, a theoretical spectacle model (four spherical surfaces) is recalculated first and then optimized while keeping the front surface unchanged. In addition to spherical power, astigmatic power, and prism imbalance, we used the wavefront error (five different viewing directions) to simulate the optical performance and to optimize the safety spectacle geometries. All surfaces were spherical (maximum global deviation 'peak-to-valley' between the measured surface and the best-fit sphere: 0.132 mm). Except for the spherical power of the model Axcont (-0.07 m(-1)), all simulated optical performance before optimization was better than the limits defined by standards. The optimization reduced the wavefront error by 1% to 0.150 λ (Windor/Infield), by 63% to 0.194 λ (Axcont/Bolle) and by 55% to 0.199 λ (2720/3M) without dropping below the measured thickness. The simulated optical performance of spectacle designs could be improved by smart optimization. A good optical design counteracts degradation by parameter variation throughout the manufacturing process. Copyright © 2013. Published by Elsevier GmbH.

  1. Least-squares/parabolized Navier-Stokes procedure for optimizing hypersonic wind tunnel nozzles

    NASA Technical Reports Server (NTRS)

    Korte, John J.; Kumar, Ajay; Singh, D. J.; Grossman, B.

    1991-01-01

    A new procedure is demonstrated for optimizing hypersonic wind-tunnel-nozzle contours. The procedure couples a CFD computer code to an optimization algorithm, and is applied to both conical and contoured hypersonic nozzles for the purpose of determining an optimal set of parameters to describe the surface geometry. A design-objective function is specified based on the deviation from the desired test-section flow-field conditions. The objective function is minimized by optimizing the parameters used to describe the nozzle contour based on the solution to a nonlinear least-squares problem. The effect of changes in the nozzle wall parameters is evaluated by computing the nozzle flow using the parabolized Navier-Stokes equations. The advantage of the new procedure is that it directly takes into account the displacement effect of the boundary layer on the wall contour. The new procedure provides a method for optimizing hypersonic nozzles of high Mach numbers which have been designed by classical procedures, but are shown to produce poor flow quality due to the large boundary layers present in the test section. The procedure is demonstrated by finding the optimum design parameters for a Mach 10 conical nozzle and a Mach 6 and a Mach 15 contoured nozzle.

  2. Optimality models in the age of experimental evolution and genomics.

    PubMed

    Bull, J J; Wang, I-N

    2010-09-01

    Optimality models have been used to predict evolution of many properties of organisms. They typically neglect genetic details, whether by necessity or design. This omission is a common source of criticism, and although this limitation of optimality is widely acknowledged, it has mostly been defended rather than evaluated for its impact. Experimental adaptation of model organisms provides a new arena for testing optimality models and for simultaneously integrating genetics. First, an experimental context with a well-researched organism allows dissection of the evolutionary process to identify causes of model failure--whether the model is wrong about genetics or selection. Second, optimality models provide a meaningful context for the process and mechanics of evolution, and thus may be used to elicit realistic genetic bases of adaptation--an especially useful augmentation to well-researched genetic systems. A few studies of microbes have begun to pioneer this new direction. Incompatibility between the assumed and actual genetics has been demonstrated to be the cause of model failure in some cases. More interestingly, evolution at the phenotypic level has sometimes matched prediction even though the adaptive mutations defy mechanisms established by decades of classic genetic studies. Integration of experimental evolutionary tests with genetics heralds a new wave for optimality models and their extensions that does not merely emphasize the forces driving evolution.

  3. Are Optimism and Cynical Hostility Associated with Smoking Cessation in Older Women?

    PubMed

    Progovac, Ana M; Chang, Yue-Fang; Chang, Chung-Chou H; Matthews, Karen A; Donohue, Julie M; Scheier, Michael F; Habermann, Elizabeth B; Kuller, Lewis H; Goveas, Joseph S; Chapman, Benjamin P; Duberstein, Paul R; Messina, Catherine R; Weaver, Kathryn E; Saquib, Nazmus; Wallace, Robert B; Kaplan, Robert C; Calhoun, Darren; Smith, J Carson; Tindle, Hilary A

    2017-08-01

    Optimism and cynical hostility independently predict morbidity and mortality in Women's Health Initiative (WHI) participants and are associated with current smoking. However, their association with smoking cessation in older women is unknown. The purpose of this study is to test whether optimism (positive future expectations) or cynical hostility (mistrust of others) predicts smoking cessation in older women. Self-reported smoking status was assessed at years 1, 3, and 6 after study entry for WHI baseline smokers who were not missing optimism or cynical hostility scores (n = 10,242). Questionnaires at study entry assessed optimism (Life Orientation Test-Revised) and cynical hostility (Cook-Medley, cynical hostility subscale). Generalized linear mixed models adjusted for sociodemographics, lifestyle factors, and medical and psychosocial characteristics including depressive symptoms. After full covariate adjustment, optimism was not related to smoking cessation. Each 1-point increase in baseline cynical hostility score was associated with 5% lower odds of cessation over 6 years (OR = 0.95, CI = 0.92-0.98, p = 0.0017). In aging postmenopausal women, greater cynical hostility predicts lower smoking cessation over time. Future studies should examine whether individuals with this trait may benefit from more intensive cessation resources or whether attempting to mitigate cynical hostility itself may aid smoking cessation.

  4. An empirical model for optimal highway durability in cold regions.

    DOT National Transportation Integrated Search

    2016-03-10

    We develop an empirical tool to estimate optimal highway durability in cold regions. To test the model, we assemble a data set : containing all highway construction and maintenance projects in Arizona and Washington State from 1990 to 2014. The data ...

  5. Technical report on prototype intelligent network flow optimization (INFLO) dynamic speed harmonization and queue warning.

    DOT National Transportation Integrated Search

    2015-06-01

    This Technical Report on Prototype Intelligent Network Flow Optimization (INFLO) Dynamic Speed Harmonization and Queue Warning is the final report for the project. It describes the prototyping, acceptance testing and small-scale demonstration of the ...

  6. Concept development and needs identification for intelligent network flow optimization (INFLO) : test readiness assessment.

    DOT National Transportation Integrated Search

    2012-11-01

    The purpose of this project is to develop for the Intelligent Network Flow Optimization (INFLO), which is one collection (or bundle) of high-priority transformative applications identified by the United States Department of Transportation (USDOT) Mob...

  7. Malaria diagnosis and treatment under the strategy of the integrated management of childhood illness (IMCI): relevance of laboratory support from the rapid immunochromatographic tests of ICT Malaria P.f/P.v and OptiMal.

    PubMed

    Tarimo, D S; Minjas, J N; Bygbjerg, I C

    2001-07-01

    The algorithm developed for the integrated management of childhood illness (IMCI) provides guidelines for the treatment of paediatric malaria. In areas where malaria is endemic, for example, the IMCI strategy may indicate that children who present with fever, a recent history of fever and/or pallor should receive antimalarial chemotherapy. In many holo-endemic areas, it is unclear whether laboratory tests to confirm that such signs are the result of malaria would be very relevant or useful. Children from a holo-endemic region of Tanzania were therefore checked for malarial parasites by microscopy and by using two rapid immunochromatographic tests (RIT) for the diagnosis of malaria (ICT Malaria P.f/P.v and OptiMal). At the time they were tested, each of these children had been targeted for antimalarial treatment (following the IMCI strategy) because of fever and/or pallor. Only 70% of the 395 children classified to receive antimalarial drugs by the IMCI algorithm had malarial parasitaemias (68.4% had Plasmodium falciparum trophozoites, 1.3% only P. falciparum gametocytes, 0.3% P. ovale and 0.3% P. malariae). As indicators of P. falciparum trophozoites in the peripheral blood, fever had a sensitivity of 93.0% and a specificity of 15.5% whereas pallor had a sensitivity of 72.2% and a specificity of 50.8%. The RIT both had very high corresponding sensitivities (of 100.0% for the ICT and 94.0% for OptiMal) but the specificity of the ICT (74.0%) was significantly lower than that for OptiMal (100.0%). Fever and pallor were significantly associated with the P. falciparum asexual parasitaemias that equalled or exceeded the threshold intensity (2000/microl) that has the optimum sensitivity and specificity for the definition of a malarial episode. Diagnostic likelihood ratios (DLR) showed that a positive result in the OptiMal test (DLR = infinity) was a better indication of malaria than a positive result in the ICT (DLR = 3.85).
In fact, OptiMal had diagnostic reliability (0.93) which approached that of an ideal test and, since it only detects live parasites, OptiMal is superior to the ICT in monitoring therapeutic responses. Although the RIT may seem attractive for use in primary health facilities because relatively inexperienced staff can perform them, the high cost of these tests is prohibitive. In holo-endemic areas, use of RIT or microscopical examination of bloodsmears may only be relevant when malaria needs to be excluded as a cause of illness (e.g. prior to treatment with toxic or expensive drugs, or during malaria epidemics). Wherever the effective drugs for the first-line treatment of malaria are cheap (e.g. chloroquine and Fansidar), treatment based on clinical diagnosis alone should prove cost-saving in health facilities without microscopy.
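
    The diagnostic likelihood ratios quoted above follow directly from the reported sensitivities and specificities via DLR+ = sensitivity / (1 - specificity). A minimal check using the abstract's figures (the function name is ours):

    ```python
    def diagnostic_likelihood_ratio(sensitivity, specificity):
        # DLR+ = sensitivity / (1 - specificity);
        # infinite when specificity is 1.0 (no false positives).
        if specificity >= 1.0:
            return float("inf")
        return sensitivity / (1.0 - specificity)

    # Figures reported in the abstract:
    ict = diagnostic_likelihood_ratio(1.00, 0.74)      # ICT: ~3.85
    optimal = diagnostic_likelihood_ratio(0.94, 1.00)  # OptiMal: infinity
    ```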

  8. MUSIC electromagnetic imaging with enhanced resolution for small inclusions

    NASA Astrophysics Data System (ADS)

    Chen, Xudong; Zhong, Yu

    2009-01-01

    This paper investigates the influence of the test dipole on the resolution of the multiple signal classification (MUSIC) imaging method applied to the electromagnetic inverse scattering problem of determining the locations of a collection of small objects embedded in a known background medium. Based on the analysis of the induced electric dipoles in eigenstates, an algorithm is proposed to determine the test dipole that generates a pseudo-spectrum with enhanced resolution. The amplitudes in three directions of the optimal test dipole are not necessarily in phase, i.e., the optimal test dipole may not correspond to a physical direction in the real three-dimensional space. In addition, the proposed test-dipole-searching algorithm is able to deal with some special scenarios, due to the shapes and materials of objects, to which the standard MUSIC does not apply.

  9. A new MUSIC electromagnetic imaging method with enhanced resolution for small inclusions

    NASA Astrophysics Data System (ADS)

    Zhong, Yu; Chen, Xudong

    2008-11-01

    This paper investigates the influence of the test dipole on the resolution of the multiple signal classification (MUSIC) imaging method applied to the electromagnetic inverse scattering problem of determining the locations of a collection of small objects embedded in a known background medium. Based on the analysis of the induced electric dipoles in eigenstates, an algorithm is proposed to determine the test dipole that generates a pseudo-spectrum with enhanced resolution. The amplitudes in three directions of the optimal test dipole are not necessarily in phase, i.e., the optimal test dipole may not correspond to a physical direction in the real three-dimensional space. In addition, the proposed test-dipole-searching algorithm is able to deal with some special scenarios, due to the shapes and materials of objects, to which the standard MUSIC does not apply.

  10. Regenerative Life Support Systems Test Bed performance - Lettuce crop characterization

    NASA Technical Reports Server (NTRS)

    Barta, Daniel J.; Edeen, Marybeth A.; Eckhardt, Bradley D.

    1992-01-01

    System performance in terms of human life support requirements was evaluated for two crops of lettuce (Lactuca sativa cv. Waldmann's Green) grown in the Regenerative Life Support Systems Test Bed. Each crop, grown in separate pots under identical environmental and cultural conditions, was irrigated with half-strength Hoagland's nutrient solution, with the frequency of irrigation being increased as the crop aged over the 30-day crop tests. Averaging over both crop tests, the test bed met the requirements of 2.1 person-days of oxygen production, 2.4 person-days of CO2 removal, and 129 person-days of potential potable water production. Gains in the mass of water and O2 produced and CO2 removed could be achieved by optimizing environmental conditions to increase plant growth rate and by optimizing cultural management methods.

  11. Some Problems of Computer-Aided Testing and "Interview-Like Tests"

    ERIC Educational Resources Information Center

    Smoline, D.V.

    2008-01-01

    Computer-based testing is an effective teacher's tool, intended to optimize course goals and assessment techniques in a comparatively short time. However, this is accomplished only if we deal with high-quality tests. It is strange, but despite the 100-year history of Testing Theory (see, Anastasi, A., Urbina, S. (1997). Psychological testing.…

  12. A systematic approach to designing statistically powerful heteroscedastic 2 × 2 factorial studies while minimizing financial costs.

    PubMed

    Jan, Show-Li; Shieh, Gwowen

    2016-08-31

    The 2 × 2 factorial design is widely used for assessing the existence of interaction and the extent of generalizability of two factors where each factor has only two levels. Accordingly, research problems associated with the main effects and interaction effects can be analyzed with the selected linear contrasts. To correct for the potential heterogeneity of variance structure, the Welch-Satterthwaite test is commonly used as an alternative to the t test for detecting the substantive significance of a linear combination of mean effects. This study concerns the optimal allocation of group sizes for the Welch-Satterthwaite test in order to minimize the total cost while maintaining adequate power. The existing method suggests that the optimal ratio of sample sizes is proportional to the ratio of the population standard deviations divided by the square root of the ratio of the unit sampling costs. Instead, a systematic approach using optimization techniques and screening search is presented to find the optimal solution. Numerical assessments revealed that the current allocation scheme generally does not give the optimal solution. Alternatively, the suggested approaches to power and sample size calculations give accurate and superior results under various treatment and cost configurations. The proposed approach improves upon the current method in both its methodological soundness and overall performance. Supplementary algorithms are also developed to aid the usefulness and implementation of the recommended technique in planning 2 × 2 factorial designs.
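
    The existing allocation rule the abstract criticizes, together with the Welch-Satterthwaite degrees of freedom it feeds into, can be sketched directly. This is the textbook rule, not the authors' refined optimization-and-screening procedure, and the function names are ours:

    ```python
    import math

    def existing_allocation_ratio(sd1, sd2, cost1, cost2):
        # Conventional rule: n1/n2 is proportional to
        # (sigma1 / sigma2) / sqrt(cost1 / cost2).
        return (sd1 / sd2) / math.sqrt(cost1 / cost2)

    def welch_satterthwaite_df(sd1, n1, sd2, n2):
        # Approximate degrees of freedom for the Welch-Satterthwaite test
        # under heteroscedasticity.
        v1, v2 = sd1 ** 2 / n1, sd2 ** 2 / n2
        return (v1 + v2) ** 2 / (v1 ** 2 / (n1 - 1) + v2 ** 2 / (n2 - 1))
    ```

    The paper's point is that searching over integer group sizes (power computed at the resulting df) generally beats this closed-form ratio on total cost.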

  13. Analysis of temporal gene expression profiles: clustering by simulated annealing and determining the optimal number of clusters.

    PubMed

    Lukashin, A V; Fuchs, R

    2001-05-01

    Cluster analysis of genome-wide expression data from DNA microarray hybridization studies has proved to be a useful tool for identifying biologically relevant groupings of genes and samples. In the present paper, we focus on several important issues related to clustering algorithms that have not yet been fully studied. We describe a simple and robust algorithm for the clustering of temporal gene expression profiles that is based on the simulated annealing procedure. In general, this algorithm is guaranteed to eventually find the globally optimal distribution of genes over clusters. We introduce an iterative scheme that serves to evaluate quantitatively the optimal number of clusters for each specific data set. The scheme is based on standard approaches used in regular statistical tests. The basic idea is to organize the search of the optimal number of clusters simultaneously with the optimization of the distribution of genes over clusters. The efficiency of the proposed algorithm has been evaluated by means of a reverse engineering experiment, that is, a situation in which the correct distribution of genes over clusters is known a priori. The employment of this statistically rigorous test has shown that our algorithm places greater than 90% of genes into correct clusters. Finally, the algorithm has been tested on real gene expression data (expression changes during yeast cell cycle) for which the fundamental patterns of gene expression and the assignment of genes to clusters are well understood from numerous previous studies.
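
    The core idea of annealed clustering can be sketched as a single-reassignment annealer over a squared-distance objective. This is a generic illustration, not the authors' exact algorithm or their cluster-number selection scheme; all names, the objective, and the cooling schedule are our assumptions:

    ```python
    import math
    import random

    def sa_cluster(profiles, k, steps=20000, t0=1.0, cooling=0.9995, seed=0):
        """Assign expression profiles to k clusters by simulated annealing,
        minimizing total squared distance to cluster centroids."""
        rng = random.Random(seed)
        assign = [rng.randrange(k) for _ in profiles]

        def cost(a):
            # Sum of squared distances of members to their cluster centroid.
            total = 0.0
            for c in range(k):
                members = [p for p, ci in zip(profiles, a) if ci == c]
                if not members:
                    continue
                dim = len(members[0])
                centroid = [sum(m[d] for m in members) / len(members)
                            for d in range(dim)]
                total += sum(sum((m[d] - centroid[d]) ** 2 for d in range(dim))
                             for m in members)
            return total

        cur, t = cost(assign), t0
        for _ in range(steps):
            i = rng.randrange(len(profiles))
            old = assign[i]
            assign[i] = rng.randrange(k)          # propose one reassignment
            new = cost(assign)
            if new <= cur or rng.random() < math.exp((cur - new) / t):
                cur = new                          # accept (always if better)
            else:
                assign[i] = old                    # reject, restore
            t *= cooling                           # geometric cooling
        return assign, cur
    ```

    Slow cooling is what gives annealing its (asymptotic) guarantee of escaping the local optima that greedy reassignment gets trapped in.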

  14. Shape Optimization for Additive Manufacturing of Removable Partial Dentures - A New Paradigm for Prosthetic CAD/CAM

    PubMed Central

    2015-01-01

    With an ever-growing aging population and demand for denture treatments, pressure-induced mucosa lesions and residual ridge resorption remain major sources of clinical complications. Conventional denture design and fabrication are labor- and experience-intensive, urgently necessitating an automated procedure. This study aims to develop a fully automatic procedure enabling shape optimization and additive manufacturing of removable partial dentures (RPD), to maximize the uniformity of contact pressure distribution on the mucosa, thereby reducing associated clinical complications. A 3D heterogeneous finite element (FE) model was constructed from CT scan, and the critical tissue of mucosa was modeled as a hyperelastic material from in vivo clinical data. A contact shape optimization algorithm was developed based on the bi-directional evolutionary structural optimization (BESO) technique. Both initial and optimized dentures were prototyped by 3D printing technology and evaluated with in vitro tests. Through the optimization, the peak contact pressure was reduced by 70%, and the uniformity was improved by 63%. In vitro tests verified the effectiveness of this procedure, and the hydrostatic pressure induced in the mucosa is well below clinical pressure-pain thresholds (PPT), potentially lessening the risk of residual ridge resorption. This proposed computational optimization and additive fabrication procedure provides a novel method for fast denture design and adjustment at low cost, with quantitative guidelines and computer aided design and manufacturing (CAD/CAM) for a specific patient. The integration of digitalized modeling, computational optimization, and free-form fabrication enables more efficient clinical adaptation. The customized optimal denture design is expected to minimize pain/discomfort and potentially reduce long-term residual ridge resorption. PMID:26161878

  15. Multi-Constraint Multi-Variable Optimization of Source-Driven Nuclear Systems

    NASA Astrophysics Data System (ADS)

    Watkins, Edward Francis

    1995-01-01

    A novel approach to the search for optimal designs of source-driven nuclear systems is investigated. Such systems include radiation shields, fusion reactor blankets and various neutron spectrum-shaping assemblies. The novel approach involves the replacement of the steepest-descents optimization algorithm incorporated in the code SWAN by a significantly more general and efficient sequential quadratic programming optimization algorithm provided by the code NPSOL. The resulting SWAN/NPSOL code system can be applied to more general, multi-variable, multi-constraint shield optimization problems. The constraints it accounts for may include simple bounds on variables, linear constraints, and smooth nonlinear constraints. It may also be applied to unconstrained, bound-constrained and linearly constrained optimization. The shield optimization capabilities of the SWAN/NPSOL code system are tested and verified in a variety of optimization problems: dose minimization at constant cost, cost minimization at constant dose, and multiple-nonlinear-constraint optimization. The replacement of the optimization part of SWAN with NPSOL is found to be feasible and substantially expands the complexity of optimization problems that can be handled efficiently.

  16. Automated Test-Form Generation

    ERIC Educational Resources Information Center

    van der Linden, Wim J.; Diao, Qi

    2011-01-01

    In automated test assembly (ATA), the methodology of mixed-integer programming is used to select test items from an item bank to meet the specifications for a desired test form and optimize its measurement accuracy. The same methodology can be used to automate the formatting of the set of selected items into the actual test form. Three different…

  17. Multi-Objective Parallel Test-Sheet Composition Using Enhanced Particle Swarm Optimization

    ERIC Educational Resources Information Center

    Ho, Tsu-Feng; Yin, Peng-Yeng; Hwang, Gwo-Jen; Shyu, Shyong Jian; Yean, Ya-Nan

    2009-01-01

    For large-scale tests, such as certification tests or entrance examinations, the composed test sheets must meet multiple assessment criteria. Furthermore, to fairly compare the knowledge levels of the persons who receive tests at different times owing to the insufficiency of available examination halls or the occurrence of certain unexpected…

  18. Practical Framework for an Electron Beam Induced Current Technique Based on a Numerical Optimization Approach

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Hideshi; Soeda, Takeshi

    2015-03-01

    A practical framework for an electron beam induced current (EBIC) technique has been established for conductive materials based on a numerical optimization approach. Although the conventional EBIC technique is useful for evaluating the distributions of dopants or crystal defects in semiconductor transistors, issues with the reproducibility and quantitative capability of its measurements persist. For instance, it is difficult to acquire high-quality EBIC images throughout continuous tests due to variation in operator skill or test environment. Recently, through evaluation of EBIC equipment performance and numerical optimization of the equipment settings, consistent acquisition of high-contrast images has become possible, improving reproducibility as well as yield regardless of operator skill or test environment. The technique proposed herein is even more sensitive and quantitative than scanning probe microscopy, an imaging technique that can possibly damage the sample. The new technique is expected to benefit the electrical evaluation of fragile or soft materials along with LSI materials.

  19. Optimization of PZT ceramic IDT sensors for health monitoring of structures.

    PubMed

    Takpara, Rafatou; Duquennoy, Marc; Ouaftouh, Mohammadi; Courtois, Christian; Jenot, Frédéric; Rguiti, Mohamed

    2017-08-01

    Surface acoustic waves (SAW) are particularly suited to effectively monitoring and characterizing structural surfaces (condition of the surface, coating, thin layer, micro-cracks…) as their energy is localized on the surface, within approximately one wavelength. Conventionally, in non-destructive testing, wedge sensors are used to generate guided waves, but they are best suited to flat surfaces and are sized for a given type of material (angle of refraction). Additionally, these sensors are quite expensive, so it is difficult to leave them permanently on the structure for health monitoring. In this study we therefore consider another type of ultrasonic sensor able to generate SAW: interdigital transducer (IDT) sensors. This paper focuses on the optimization of IDT sensors for non-destructive structural testing using PZT ceramics. The challenge was to optimize the dimensional parameters of the IDT sensors in order to generate surface waves efficiently. Acoustic tests then confirmed these parameters. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Optimization of a matched-filter receiver for frequency hopping code acquisition in jamming

    NASA Astrophysics Data System (ADS)

    Pawlowski, P. R.; Polydoros, A.

    A matched-filter receiver for frequency hopping (FH) code acquisition is optimized when either partial-band tone jamming or partial-band Gaussian noise jamming is present. The receiver is matched to a segment of the FH code sequence, sums hard per-channel decisions to form a test, and uses multiple tests to verify acquisition. The length of the matched filter and the number of verification tests are fixed. Optimization then consists of choosing thresholds to maximize performance based upon the receiver's degree of knowledge about the jammer ('side-information'). Four levels of side-information are considered, ranging from none to complete. The latter level results in a constant-false-alarm-rate (CFAR) design. At each level, performance sensitivity to threshold choice is analyzed. Robust thresholds are chosen to maximize performance as the jammer varies its power distribution, resulting in simple design rules which aid threshold selection. Performance results, which show that optimum distributions for the jammer power over the total FH bandwidth exist, are presented.
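
    Because the receiver sums hard per-channel decisions and compares the sum to a threshold, its false-alarm behavior is governed by a binomial tail. A minimal sketch of CFAR-style threshold selection (function names and the search are our illustration, not the paper's exact design rules):

    ```python
    from math import comb

    def tail_prob(n, p, thresh):
        # P(at least `thresh` of n independent per-channel hits), each hit
        # occurring with probability p -- the binomial tail that governs a
        # sum-and-threshold test.
        return sum(comb(n, k) * p ** k * (1 - p) ** (n - k)
                   for k in range(thresh, n + 1))

    def cfar_threshold(n, p_fa_target, p_channel):
        # Smallest threshold whose false-alarm probability meets the target.
        for t in range(n + 1):
            if tail_prob(n, p_channel, t) <= p_fa_target:
                return t
        return n + 1
    ```

    With side-information about the jammer, the per-channel false-hit probability `p_channel` is known and the threshold can be set exactly; without it, a robust threshold must cover the worst-case jammer power distribution.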

  1. Item response theory analysis of the life orientation test-revised: age and gender differential item functioning analyses.

    PubMed

    Steca, Patrizia; Monzani, Dario; Greco, Andrea; Chiesi, Francesca; Primi, Caterina

    2015-06-01

    This study is aimed at testing the measurement properties of the Life Orientation Test-Revised (LOT-R) for the assessment of dispositional optimism by employing item response theory (IRT) analyses. The LOT-R was administered to a large sample of 2,862 Italian adults. First, confirmatory factor analyses demonstrated the theoretical conceptualization of the construct measured by the LOT-R as a single bipolar dimension. Subsequently, IRT analyses for polytomous, ordered response category data were applied to investigate the items' properties. The equivalence of the items across gender and age was assessed by analyzing differential item functioning. Discrimination and severity parameters indicated that all items were able to distinguish people with different levels of optimism and adequately covered the spectrum of the latent trait. Additionally, the LOT-R appears to be gender invariant and, with minor exceptions, age invariant. Results provided evidence that the LOT-R is a reliable and valid measure of dispositional optimism. © The Author(s) 2014.

  2. A method for the dynamic management of genetic variability in dairy cattle

    PubMed Central

    Colleau, Jean-Jacques; Moureaux, Sophie; Briend, Michèle; Bechu, Jérôme

    2004-01-01

    According to the general approach developed in this paper, dynamic management of genetic variability in selected populations of dairy cattle is carried out for three simultaneous purposes: procreation of young bulls to be further progeny-tested, use of service bulls already selected and approval of recently progeny-tested bulls for use. At each step, the objective is to minimize the average pairwise relationship coefficient in the future population born from programmed matings and the existing population. As a common constraint, the average estimated breeding value of the new population, for a selection goal including many important traits, is set to a desired value. For the procreation of young bulls, breeding costs are additionally constrained. Optimization is fully analytical and directly considers matings. Corresponding algorithms are presented in detail. The efficiency of these procedures was tested on the current Norman population. Comparisons between optimized and real matings clearly showed that optimization would have saved substantial genetic variability without reducing short-term genetic gains. PMID:15231230

  3. Evaluation and optimization of a commercial blocking ELISA for detecting antibodies to influenza A virus for research and surveillance of mallards.

    PubMed

    Shriner, Susan A; VanDalen, Kaci K; Root, J Jeffrey; Sullivan, Heather J

    2016-02-01

    The availability of a validated commercial assay is an asset for any wildlife investigation. However, commercial products are often developed for use in livestock and are not optimized for wildlife. Consequently, it is incumbent upon researchers and managers to apply commercial products appropriately to optimize program outcomes. We tested more than 800 serum samples from mallards for antibodies to influenza A virus with the IDEXX AI MultiS-Screen Ab test to evaluate assay performance. Applying the test per manufacturer's recommendations resulted in good performance with 84% sensitivity and 100% specificity. However, performance was improved to 98% sensitivity and 98% specificity by increasing the recommended cut-off. Using this alternative threshold for identifying positive and negative samples would greatly improve sample classification, especially for field samples collected months after infection when antibody titers have waned from the initial primary immune response. Furthermore, a threshold that balances sensitivity and specificity reduces estimation bias in seroprevalence estimates. Published by Elsevier B.V.
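    The cut-off adjustment described above can be illustrated with a small sketch that scans candidate cut-offs and keeps the one balancing sensitivity and specificity (Youden's J). The blocking-ELISA convention (lower S/N ratio = antibody positive), the helper names, and the data are all hypothetical, not the study's:

    ```python
    def sensitivity_specificity(scores_pos, scores_neg, cutoff):
        """Blocking-ELISA convention assumed here: samples at or below the
        cutoff S/N ratio are called antibody-positive."""
        tp = sum(s <= cutoff for s in scores_pos)
        tn = sum(s > cutoff for s in scores_neg)
        return tp / len(scores_pos), tn / len(scores_neg)

    def best_balanced_cutoff(scores_pos, scores_neg, candidates):
        """Pick the candidate cutoff maximizing Youden's J = sens + spec - 1."""
        return max(candidates,
                   key=lambda c: sum(sensitivity_specificity(scores_pos,
                                                             scores_neg, c)))

    # Illustrative S/N ratios for known-positive and known-negative sera.
    pos = [0.2, 0.3, 0.35, 0.4, 0.55, 0.6]
    neg = [0.65, 0.7, 0.8, 0.85, 0.9, 0.95]
    cut = best_balanced_cutoff(pos, neg, [0.5, 0.6, 0.7])
    ```

    On real data the positive and negative distributions overlap, so the chosen cutoff trades a little specificity for the large sensitivity gain the abstract reports.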

  4. Risk Mitigation Testing with the BepiColombo MPO SADA

    NASA Astrophysics Data System (ADS)

    Zemann, J.; Heinrich, B.; Skulicz, A.; Madsen, M.; Weisenstein, W.; Modugno, F.; Althaus, F.; Panhofer, T.; Osterseher, G.

    2013-09-01

    A Solar Array (SA) Drive Assembly (SADA) for the BepiColombo mission is being developed and qualified at RUAG Space Zürich (RSSZ). The system is consisting of the Solar Array Drive Mechanism (SADM) and the Solar Array Drive Electronics (SADE) which is subcontracted to RUAG Space Austria (RSA).This paper deals with the risk mitigation activities and the lesson learnt from this development. In specific following topics substantiated by bread board (BB) test results will be addressed in detail:Slipring Bread Board Test: Verification of lifetime and electrical performance of carbon brush technology Potentiometer BB Tests: Focus on lifetime verification (> 650000 revolution) and accuracy requirement SADM EM BB Test: Subcomponent (front-bearing and gearbox) characterization; complete test campaign equivalent to QM test.EM SADM/ SADE Combined Test: Verification of combined performance (accuracy, torque margin) and micro-vibration testing of SADA systemSADE Bread Board Test: Parameter optimization; Test campaign equivalent to QM testThe main improvements identified in frame of BB testing and already implemented in the SADM EM/QM and SADE EQM are:• Improved preload device for gearbox• Improved motor ball-bearing assembly• Position sensor improvements• Calibration process for potentiometer• SADE motor controller optimization toachieve required running smoothness• Overall improvement of test equipment.

  5. Alternative Chemical Cleaning Methods for High Level Waste Tanks: Actual Waste Testing with SRS Tank 5F Sludge

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, William D.; Hay, Michael S.

    Solubility testing with actual High Level Waste tank sludge has been conducted in order to evaluate several alternative chemical cleaning technologies for the dissolution of sludge residuals remaining in the tanks after the exhaustion of mechanical cleaning and sludge sluicing efforts. Tests were conducted with archived Savannah River Site (SRS) radioactive sludge solids that had been retrieved from Tank 5F in order to determine the effectiveness of an optimized, dilute oxalic/nitric acid cleaning reagent toward dissolving the bulk non-radioactive waste components. Solubility tests were performed by direct sludge contact with the oxalic/nitric acid reagent and with sludge that had been pretreated and acidified with dilute nitric acid. For comparison purposes, separate samples were also contacted with pure, concentrated oxalic acid following current baseline tank chemical cleaning methods. One goal of testing with the optimized reagent was to compare the total amounts of oxalic acid and water required for sludge dissolution using the baseline and optimized cleaning methods. A second objective was to compare the two methods with regard to the dissolution of actinide species known to be drivers for SRS tank closure Performance Assessments (PA). Additionally, solubility tests were conducted with Tank 5 sludge using acidic and caustic permanganate-based methods focused on the “targeted” dissolution of actinide species.

  6. Keeping Your Compressor Healthy: Developing the Right Lubricant Formulation is the Key

    NASA Astrophysics Data System (ADS)

    Karnaz, Joseph A.; Kultgen, Derek W.

    2015-08-01

    Selecting the correct compressor lubricant is crucial to the useful life of both the compressor and the refrigeration system. However, developing an optimized lubricant for a refrigeration system requires a multitude of screenings and tests. The compatibility and stability of the lubricant with the refrigerant and compressor components need to be examined at various accelerated conditions. The working viscosity of the lubricant/refrigerant mixture must be determined at various refrigerant concentrations, temperatures, and pressures, as refrigerant diluted in the lubricant has a significant effect on viscosity. The correct lubricant formulation needs to be investigated for optimal performance. A compressor lubricant can provide many benefits to a refrigeration system, such as bearing durability, sealing, and increased efficiency. Sometimes it is necessary to formulate the lubricant in order to optimize system performance. Specifically, this study investigated the anti-wear properties of different oil additives to create a more robust refrigeration system. Many different additives and concentrations were considered and screened. Following a successful screening test, the anti-wear properties of these additives were analyzed using bench-top tribology tests. To reduce uncertainty and provide more in-situ results, the additives were run in a refrigerant compressor on a gas-loop testing apparatus. Oil samples were taken periodically during the test for analysis. Lastly, upon test completion the compressors were dismantled and the parts examined to determine the effectiveness of the anti-wear additives.

  7. Anger and health in dementia caregivers: exploring the mediation effect of optimism.

    PubMed

    López, J; Romero-Moreno, R; Márquez-González, M; Losada, A

    2015-04-01

    Although previous studies indicate a negative association between caregivers' anger and health, the potential mechanisms linking this relationship are not yet fully understood. The aim of this study was to explore the potential mediating role of optimism in the relationship between anger and caregivers' physical health. Dementia caregivers (n = 108) were interviewed and filled out instruments assessing their anger (reaction), optimism and health (vitality). A mediational model was tested to determine whether optimism partially mediated the relationship between anger and vitality. Angry reaction was negatively associated with optimism and vitality; optimism was positively associated with vitality. Finally, the relationship between angry reaction and vitality decreased when optimism was entered simultaneously. A non-parametric bootstrap approach confirmed that optimism significantly mediated part of the relationship between angry reaction and vitality. These findings suggest that low optimism may help explain the association between caregivers' anger and reduced sense of vitality. The results provide a specific target for intervention with caregivers. Copyright © 2013 John Wiley & Sons, Ltd.
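    The non-parametric bootstrap test of mediation used above (a percentile confidence interval for the indirect effect a*b) can be sketched as follows. The data are simulated with the hypothesized structure; all coefficients, sample sizes, and names are illustrative assumptions, not the study's:

    ```python
    import random

    def cov(u, v):
        mu, mv = sum(u) / len(u), sum(v) / len(v)
        return sum((p - mu) * (q - mv) for p, q in zip(u, v)) / len(u)

    def indirect_effect(x, m, y):
        """Product-of-coefficients mediation estimate a*b:
        a = slope of the mediator on the predictor,
        b = partial slope of the mediator in the regression of y on x and m."""
        a = cov(x, m) / cov(x, x)
        b = (cov(m, y) * cov(x, x) - cov(x, y) * cov(x, m)) / \
            (cov(m, m) * cov(x, x) - cov(x, m) ** 2)
        return a * b

    def bootstrap_ci(x, m, y, n_boot=500, alpha=0.05, seed=7):
        """Percentile bootstrap confidence interval for the indirect effect."""
        rng = random.Random(seed)
        n = len(x)
        ests = []
        for _ in range(n_boot):
            idx = [rng.randrange(n) for _ in range(n)]
            ests.append(indirect_effect([x[i] for i in idx],
                                        [m[i] for i in idx],
                                        [y[i] for i in idx]))
        ests.sort()
        return ests[int(alpha / 2 * n_boot)], ests[int((1 - alpha / 2) * n_boot) - 1]

    # Simulated data with the hypothesized chain (hypothetical, not the study's):
    # higher anger -> lower optimism -> lower vitality.
    rng = random.Random(0)
    anger = [rng.gauss(0, 1) for _ in range(120)]
    optimism = [-0.6 * a + rng.gauss(0, 0.5) for a in anger]
    vitality = [0.7 * o + rng.gauss(0, 0.5) for o in optimism]
    lo, hi = bootstrap_ci(anger, optimism, vitality)
    ```

    A confidence interval that excludes zero (here, an all-negative interval) is the usual evidence that the mediation path is significant.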

  8. Optimal Sequential Rules for Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Vos, Hans J.

    1998-01-01

    Formulates sequential rules for adapting the appropriate amount of instruction to learning needs in the context of computer-based instruction. Topics include Bayesian decision theory, threshold and linear-utility structure, psychometric model, optimal sequential number of test questions, and an empirical example of sequential instructional…

  9. Vortex generator design for aircraft inlet distortion as a numerical optimization problem

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Levy, Ralph

    1991-01-01

    Aerodynamic compatibility of aircraft/inlet/engine systems is a difficult design problem for aircraft that must operate in many different flight regimes. Takeoff, subsonic cruise, supersonic cruise, transonic maneuvering, and high altitude loiter each place different constraints on inlet design. Vortex generators, small wing-like sections mounted on the inside surfaces of the inlet duct, are used to control flow separation and engine face distortion. The design of vortex generator installations in an inlet is defined as a problem addressable by numerical optimization techniques. A performance parameter is suggested to account for both inlet distortion and total pressure loss at a series of design flight conditions. The resulting optimization problem is difficult since some of the design parameters take on integer values. If numerical procedures could be used to reduce multimillion-dollar development test programs to a small set of verification tests, numerical optimization could have a significant impact on both cost and elapsed time to design new aircraft.

  10. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    PubMed Central

    Hsieh, Helen V.; Dantzler, Jeffrey L.; Weigl, Bernhard H.

    2017-01-01

    Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor’s office in Manhattan to a rural medical clinic in low-resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter and outline analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness. PMID:28555034

  11. Reward rate optimization in two-alternative decision making: empirical tests of theoretical predictions.

    PubMed

    Simen, Patrick; Contreras, David; Buck, Cara; Hu, Peter; Holmes, Philip; Cohen, Jonathan D

    2009-12-01

    The drift-diffusion model (DDM) implements an optimal decision procedure for stationary, 2-alternative forced-choice tasks. The height of a decision threshold applied to accumulating information on each trial determines a speed-accuracy tradeoff (SAT) for the DDM, thereby accounting for a ubiquitous feature of human performance in speeded response tasks. However, little is known about how participants settle on particular tradeoffs. One possibility is that they select SATs that maximize a subjective rate of reward earned for performance. For the DDM, there exist unique, reward-rate-maximizing values for its threshold and starting point parameters in free-response tasks that reward correct responses (R. Bogacz, E. Brown, J. Moehlis, P. Holmes, & J. D. Cohen, 2006). These optimal values vary as a function of response-stimulus interval, prior stimulus probability, and relative reward magnitude for correct responses. We tested the resulting quantitative predictions regarding response time, accuracy, and response bias under these task manipulations and found that grouped data conformed well to the predictions of an optimally parameterized DDM.
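    The reward-rate maximization for the DDM can be sketched using the closed-form expressions for error rate and mean decision time from Bogacz et al. (2006); the drift, noise, non-decision time, and response-stimulus interval below are illustrative values, not the study's:

    ```python
    import math

    def reward_rate(z, drift=1.0, noise=1.0, nondecision=0.3, rsi=1.0):
        """Reward rate for the pure drift-diffusion model with threshold +/-z,
        using ER = 1/(1+exp(2Az/c^2)) and DT = (z/A)tanh(Az/c^2)."""
        er = 1.0 / (1.0 + math.exp(2 * drift * z / noise ** 2))
        dt = (z / drift) * math.tanh(drift * z / noise ** 2)
        return (1.0 - er) / (dt + nondecision + rsi)

    # Grid search for the reward-rate-maximizing threshold: low thresholds
    # answer fast but err often, high thresholds are accurate but slow.
    grid = [i / 1000 for i in range(1, 3001)]
    z_opt = max(grid, key=reward_rate)
    ```

    The interior maximum in z is the unique reward-rate-optimal speed-accuracy tradeoff the paper's predictions are built on; varying `rsi` or the drift shifts it, which is what the task manipulations test.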

  12. Experimental analysis of the performance of optimized fin structures in a latent heat energy storage test rig

    NASA Astrophysics Data System (ADS)

    Johnson, Maike; Hübner, Stefan; Reichmann, Carsten; Schönberger, Manfred; Fiß, Michael

    2017-06-01

    Energy storage systems are a key technology for developing a more sustainable energy supply system and lowering overall CO2 emissions. Among the variety of storage technologies, high-temperature phase change material (PCM) storage is a promising option with a wide range of applications. PCM storage systems using an extended finned-tube storage concept have been designed and techno-economically optimized for solar thermal power plant operation. These finned-tube components were experimentally tested in order to validate the optimized design and the simulation models used. Analysis of the charging and discharging characteristics of the storage at the pilot scale gives insight into the heat distribution both axially and radially in the storage material, thereby allowing for a realistic validation of the design. The design was optimized for discharging of the storage, as this is the more critical operation mode in power plant applications. The data show good agreement between the model and the experiments for discharging.

  13. Automatic Parameter Tuning for the Morpheus Vehicle Using Particle Swarm Optimization

    NASA Technical Reports Server (NTRS)

    Birge, B.

    2013-01-01

    A high-fidelity simulation using a PC-based Trick framework has been developed for Johnson Space Center's Morpheus test bed flight vehicle. There is an iterative development loop of refining and testing the hardware, refining the software, comparing the software simulation to hardware performance, and adjusting either or both the hardware and the simulation to extract the best performance from the hardware as well as the most realistic representation of the hardware from the software. A Particle Swarm Optimization (PSO) based technique has been developed that increases the speed and accuracy of this iterative development cycle. Parameters in software can be automatically tuned to make the simulation match real-world subsystem data from test flights. Special considerations for scale, linearity, and discontinuities can be all but ignored with this technique, allowing fast turnaround both for simulation tune-up to match hardware changes and during the test and validation phase to help identify hardware issues. Software models with insufficient control authority to match hardware test data can be immediately identified, and using this technique requires very little to no specialized knowledge of optimization, freeing model developers to concentrate on spacecraft engineering. Integration of the PSO into the Morpheus development cycle will be discussed, as well as a case study highlighting the tool's effectiveness.
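    A minimal global-best PSO of the kind described — tuning model parameters until simulated output matches recorded data — might look like the sketch below. This is a generic illustration of the technique, not the Morpheus tool; the toy decay model and all constants are assumptions:

    ```python
    import math
    import random

    def pso(cost, dim, bounds, n_particles=20, iters=60, seed=3):
        """Minimal global-best particle swarm optimizer."""
        rng = random.Random(seed)
        lo, hi = bounds
        pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
        vel = [[0.0] * dim for _ in range(n_particles)]
        pbest = [p[:] for p in pos]
        pbest_cost = [cost(p) for p in pos]
        gbest = min(pbest, key=cost)[:]
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(dim):
                    r1, r2 = rng.random(), rng.random()
                    vel[i][d] = (0.7 * vel[i][d]                         # inertia
                                 + 1.5 * r1 * (pbest[i][d] - pos[i][d])  # cognitive
                                 + 1.5 * r2 * (gbest[d] - pos[i][d]))    # social
                    pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
                c = cost(pos[i])
                if c < pbest_cost[i]:
                    pbest[i], pbest_cost[i] = pos[i][:], c
                    if c < cost(gbest):
                        gbest = pos[i][:]
        return gbest

    # Toy "simulation vs. telemetry" tuning: recover the gain and damping of a
    # decaying response y = k*exp(-b*t) from sampled data.
    data = [(t / 10, 2.0 * math.exp(-0.5 * t / 10)) for t in range(20)]

    def model_error(params):
        k, b = params
        return sum((k * math.exp(-b * t) - y) ** 2 for t, y in data)

    tuned = pso(model_error, dim=2, bounds=(0.0, 3.0))
    ```

    Note that the cost function only needs to be evaluable, not differentiable or smooth, which is why scale, linearity, and discontinuities matter so little here.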

  14. "RCL-Pooling Assay": A Simplified Method for the Detection of Replication-Competent Lentiviruses in Vector Batches Using Sequential Pooling.

    PubMed

    Corre, Guillaume; Dessainte, Michel; Marteau, Jean-Brice; Dalle, Bruno; Fenard, David; Galy, Anne

    2016-02-01

    Nonreplicative recombinant HIV-1-derived lentiviral vectors (LV) are increasingly used in gene therapy of various genetic diseases, infectious diseases, and cancer. Before they are used in humans, preparations of LV must undergo extensive quality control testing. In particular, testing of LV must demonstrate the absence of replication-competent lentiviruses (RCL) with suitable methods, on representative fractions of vector batches. Current methods based on cell culture are challenging because high titers of vector batches translate into high volumes of cell culture to be tested in RCL assays. As vector batch sizes and titers are continuously increasing because of improved production and purification methods, it became necessary for us to modify the current RCL assay based on the detection of p24 in cultures of indicator cells. Here, we propose a practical optimization of this method using a pairwise pooling strategy that enables easier testing of higher vector inoculum volumes. These modifications significantly decrease material handling and operator time, leading to a cost-effective method while maintaining optimal sensitivity of the RCL testing. This optimized "RCL-pooling assay" improves the feasibility of quality control of large-scale batches of clinical-grade LV while preserving sensitivity.
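    The saving from pooling can be illustrated with the classic one-level (Dorfman) pooling arithmetic — a simplified stand-in for the paper's sequential pairwise pooling scheme, with a hypothetical positivity rate:

    ```python
    import random

    def dorfman_tests(statuses, pool_size):
        """Number of assays used with simple one-level pooling: one test per
        pool, plus individual retests of every member of a positive pool."""
        tests = 0
        for i in range(0, len(statuses), pool_size):
            pool = statuses[i:i + pool_size]
            tests += 1
            if any(pool):
                tests += len(pool)
        return tests

    # Hypothetical screen: 1,000 culture fractions, 2% truly positive.
    rng = random.Random(42)
    samples = [rng.random() < 0.02 for _ in range(1000)]
    pooled = dorfman_tests(samples, pool_size=10)
    individual = len(samples)
    ```

    When positives are rare, most pools test negative and clear all their members in a single assay, which is why pooling cuts material handling and operator time so sharply.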

  15. SU-F-J-133: Adaptive Radiation Therapy with a Four-Dimensional Dose Calculation Algorithm That Optimizes Dose Distribution Considering Breathing Motion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, I; Algan, O; Ahmad, S

    Purpose: To model patient motion and produce four-dimensional (4D) optimized dose distributions that account for motion artifacts in dose calculation during the treatment planning process. Methods: An algorithm for dose calculation was developed in which patient motion is considered at the treatment planning stage. First, optimal dose distributions are calculated for the stationary target volume, with the dose distributions optimized using intensity-modulated radiation therapy (IMRT). Second, a convolution kernel is produced from the best-fitting curve that matches the motion trajectory of the patient. Third, the motion kernel is deconvolved with the initial dose distribution optimized for the stationary target to produce a dose distribution that is optimized in four dimensions. The algorithm was tested against measured doses using a mobile phantom that moves with controlled motion patterns. Results: A motion-optimized dose distribution is obtained from the initial dose distribution of the stationary target by deconvolution with the motion kernel of the mobile target. This motion-optimized dose distribution is equivalent to that optimized for the stationary target using IMRT. The motion-optimized and measured dose distributions were compared using the gamma index, with a passing rate of >95% for 3% dose-difference and 3 mm distance-to-agreement criteria. If the dose delivery per beam takes place over several respiratory cycles, the spread of the dose distributions depends only on the motion amplitude and is not affected by motion frequency or phase. The algorithm is limited to motion amplitudes smaller than the length of the target along the direction of motion. Conclusion: An algorithm was developed to optimize dose in 4D. Beyond IMRT, which provides optimal dose coverage for a stationary target, it extends dose optimization to 4D by considering target motion. This algorithm provides an alternative to motion management techniques such as beam gating or breath holding and has potential applications in adaptive radiation therapy.
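    The kernel deconvolution step described in the Methods can be sketched in one dimension with circular convolution: dividing by the motion kernel's spectrum (with a small regularizer) recovers the static-target profile from the motion-blurred one. The profiles and kernel below are illustrative, not clinical data or the paper's algorithm:

    ```python
    import cmath

    def dft(x, inverse=False):
        """Naive discrete Fourier transform (O(n^2), fine for a sketch)."""
        n = len(x)
        s = 1 if inverse else -1
        out = [sum(x[k] * cmath.exp(s * 2j * cmath.pi * j * k / n)
                   for k in range(n)) for j in range(n)]
        return [v / n for v in out] if inverse else out

    def convolve(x, k):
        """Circular convolution: delivered dose = planned dose (*) motion PDF."""
        n = len(x)
        return [sum(x[j] * k[(i - j) % n] for j in range(n)) for i in range(n)]

    def deconvolve(delivered, kernel, eps=1e-9):
        """Recover the profile whose circular convolution with the motion
        kernel gives `delivered` (Wiener-style frequency-domain division)."""
        D, K = dft(delivered), dft(kernel)
        X = [d * k.conjugate() / (abs(k) ** 2 + eps) for d, k in zip(D, K)]
        return [v.real for v in dft(X, inverse=True)]

    # Static-optimized dose profile blurred by a 3-point motion kernel
    # (centered motion PDF that sums to 1, so total dose is conserved).
    static = [0, 0, 1, 2, 3, 2, 1, 0, 0, 0]
    kernel = [0.6, 0.2] + [0] * 7 + [0.2]
    blurred = convolve(static, kernel)
    recovered = deconvolve(blurred, kernel)
    ```

    In the paper's direction of use, the deconvolution is applied to the static-optimized plan so that the delivered (motion-blurred) dose matches the intended one; the inverse operation above shows the same relationship.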

  16. Practical input optimization for aircraft parameter estimation experiments. Ph.D. Thesis, 1990

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    1993-01-01

    The object of this research was to develop an algorithm for the design of practical, optimal flight test inputs for aircraft parameter estimation experiments. A general, single-pass technique was developed which allows global optimization of the flight test input design for parameter estimation using the principles of dynamic programming, with the input forms limited to square waves only. Provision was made for practical constraints on the input, including amplitude constraints, control system dynamics, and selected input frequency range exclusions. In addition, the input design was accomplished while imposing output amplitude constraints required by model validity and considerations of safety during the flight test. The algorithm has multiple input design capability, with optional inclusion of a constraint that only one control moves at a time, so that a human pilot can implement the inputs. It is shown that the technique can be used to design experiments for estimation of open loop model parameters from closed loop flight test data. The report includes a new formulation of the optimal input design problem, a description of a new approach to the solution, and a summary of the characteristics of the algorithm, followed by three example applications that demonstrate the quality and expanded capabilities of the resulting input designs. In all cases, the new input design approach showed significant improvement over previous input design methods in terms of achievable parameter accuracies.

  17. Development of an Optimal Controller and Validation Test Stand for Fuel Efficient Engine Operation

    NASA Astrophysics Data System (ADS)

    Rehn, Jack G., III

    There are numerous motivations for improvements in automotive fuel efficiency. As concerns over the environment grow at a rate unmatched by hybrid and electric automotive technologies, the need for reductions in the fuel consumed by current road vehicles has never been more pressing. Studies have shown that a major cause of poor fuel consumption in automobiles is improper driving behavior, which cannot be mitigated by purely technological means. The emergence of autonomous driving technologies has provided an opportunity to alleviate this inefficiency by removing the need for a driver. Before autonomous technology can be relied upon to reduce gasoline consumption on a large scale, robust programming strategies must be designed and tested. The goal of this thesis work was to design and deploy an autonomous control algorithm to navigate a four-cylinder gasoline combustion engine through a series of changing load profiles in a manner that prioritizes fuel efficiency. The experimental setup is analogous to a passenger vehicle driving over hilly terrain at highway speeds. The proposed approach accomplishes this using a model-predictive, real-time optimization algorithm that was calibrated to the engine. Performance of the optimal control algorithm was tested on the engine against contemporary cruise control. Results indicate that the "efficient" strategy achieved one to two percent reductions in total fuel consumed for all load profiles tested. The consumption data gathered also suggest that further improvements could be realized on a different subject engine and with extended models and a slightly modified optimal control approach.

  18. Cut-off optimization for 13C-urea breath test in a community-based trial by mathematic, histology and serology approach.

    PubMed

    Li, Zhe-Xuan; Huang, Lei-Lei; Liu, Cong; Formichella, Luca; Zhang, Yang; Wang, Yu-Mei; Zhang, Lian; Ma, Jun-Ling; Liu, Wei-Dong; Ulm, Kurt; Wang, Jian-Xi; Zhang, Lei; Bajbouj, Monther; Li, Ming; Vieth, Michael; Quante, Michael; Zhou, Tong; Wang, Le-Hua; Suchanek, Stepan; Soutschek, Erwin; Schmid, Roland; Classen, Meinhard; You, Wei-Cheng; Gerhard, Markus; Pan, Kai-Feng

    2017-05-18

    The performance of diagnostic tests in intervention trials of Helicobacter pylori (H. pylori) eradication is crucial, since even minor inaccuracies can have a major impact. To determine the cut-off point for the 13C-urea breath test (13C-UBT) and to assess whether it can be further optimized by serologic testing, mathematical modeling, histopathology, and serologic validation were applied. A finite mixture model (FMM) was developed in 21,857 subjects, and an independent validation by modified Giemsa staining was conducted in 300 selected subjects. H. pylori status was determined using the recomLine H. pylori assay in 2,113 subjects with borderline 13C-UBT results. A delta-over-baseline (DOB) value of 3.8 was the optimal cut-off point by the FMM in the modeling dataset, and was further validated as the most appropriate cut-off point by Giemsa staining (sensitivity = 94.53%, specificity = 92.93%). In the borderline population, 1,468 subjects were determined to be H. pylori positive by recomLine (69.5%). A significant correlation between the number of positive H. pylori serum responses and the DOB value was found (rs = 0.217, P < 0.001). A mathematical approach such as an FMM might be an alternative measure for optimizing the cut-off point of the 13C-UBT in community-based studies, and a second method to determine H. pylori status in subjects with borderline 13C-UBT values is necessary and recommended.
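    The finite-mixture idea — fitting two components to the DOB distribution and placing the cut-off where the upper (infected) component begins to dominate — can be sketched with a plain two-Gaussian EM fit. This is a generic FMM illustration on simulated data, not the study's model or its parameter values:

    ```python
    import math
    import random

    def normpdf(x, mu, sd):
        return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

    def fit_two_gaussians(data, iters=50):
        """EM for a two-component 1-D Gaussian mixture, initialized from the
        lower and upper halves of the sorted sample."""
        data = sorted(data)
        n = len(data)
        halves = data[: n // 2], data[n // 2:]
        mu = [sum(h) / len(h) for h in halves]
        sd = [max(1e-3, (sum((x - m) ** 2 for x in h) / len(h)) ** 0.5)
              for h, m in zip(halves, mu)]
        w = [0.5, 0.5]
        for _ in range(iters):
            # E-step: responsibility of the upper component for each point.
            r = [w[1] * normpdf(x, mu[1], sd[1]) /
                 (w[0] * normpdf(x, mu[0], sd[0]) + w[1] * normpdf(x, mu[1], sd[1]))
                 for x in data]
            # M-step: re-estimate weights, means, and standard deviations.
            n1 = sum(r)
            w = [1 - n1 / n, n1 / n]
            mu = [sum((1 - ri) * x for ri, x in zip(r, data)) / (n - n1),
                  sum(ri * x for ri, x in zip(r, data)) / n1]
            sd = [max(1e-3, (sum((1 - ri) * (x - mu[0]) ** 2
                                 for ri, x in zip(r, data)) / (n - n1)) ** 0.5),
                  max(1e-3, (sum(ri * (x - mu[1]) ** 2
                                 for ri, x in zip(r, data)) / n1) ** 0.5)]
        return w, mu, sd

    def cutoff(w, mu, sd, grid):
        """Smallest grid value classified into the upper (positive) component."""
        return min(x for x in grid
                   if w[1] * normpdf(x, mu[1], sd[1]) > w[0] * normpdf(x, mu[0], sd[0]))

    # Simulated DOB values: a narrow uninfected component and a broad
    # infected component (hypothetical parameters).
    rng = random.Random(5)
    dob = ([rng.gauss(1.5, 0.8) for _ in range(700)]
           + [rng.gauss(15.0, 5.0) for _ in range(300)])
    w, mu, sd = fit_two_gaussians(dob)
    cut = cutoff(w, mu, sd, [i / 10 for i in range(0, 300)])
    ```

    The data-driven cut-off falls well above the uninfected component's mean but far below the infected one, which is how the FMM separates the borderline region that the serology then resolves.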

  19. Rapid optimization of multiple-burn rocket flights.

    NASA Technical Reports Server (NTRS)

    Brown, K. R.; Harrold, E. F.; Johnson, G. W.

    1972-01-01

    Different formulations of the fuel optimization problem for multiple burn trajectories are considered. It is shown that certain customary idealizing assumptions lead to an ill-posed optimization problem for which no solution exists. Several ways are discussed for avoiding such difficulties by more realistic problem statements. An iterative solution of the boundary value problem is presented together with efficient coast arc computations, the right end conditions for various orbital missions, and some test results.

  20. Anaerobic hydrolysis and acidification of organic substrates: determination of anaerobic hydrolytic potential.

    PubMed

    Rajagopal, Rajinikanth; Béline, Fabrice

    2011-05-01

    This study aimed to develop a biochemical test to evaluate the hydrolytic potential of different substrates and to apply this test to characterize various organic substrates. The outcome of this study can be used for optimization of WWTPs through enhancement of N/P removal or anaerobic digestion. Of four series of batch experiments, the first two tests were conducted to determine the optimal operating conditions (test duration, inoculum ratio, etc.) for the hydrolytic-potential test using secondary and synthetically prepared sludges. Based on the results (generation of CODs, pH and VFA), the test duration was fixed between 6 and 7 d, allowing maximum hydrolysis to be attained while avoiding methanogenesis. No significant effect of inoculum ratio on acid fermentation of the sludge was observed. Using this methodology, the third and fourth tests were performed to characterize various organic substrates, namely secondary and pre-treated sludge, pig slurry, and two different cattle slurries. VFA production was shown to depend substantially on substrate type. Copyright © 2011 Elsevier Ltd. All rights reserved.

  1. Selecting Items for Criterion-Referenced Tests.

    ERIC Educational Resources Information Center

    Mellenbergh, Gideon J.; van der Linden, Wim J.

    1982-01-01

    Three item selection methods for criterion-referenced tests are examined: the classical theory of item difficulty and item-test correlation; the latent trait theory of item characteristic curves; and a decision-theoretic approach for optimal item selection. Item contribution to the standardized expected utility of mastery testing is discussed. (CM)

  2. Optimal Testlet Pool Assembly for Multistage Testing Designs

    ERIC Educational Resources Information Center

    Ariel, Adelaide; Veldkamp, Bernard P.; Breithaupt, Krista

    2006-01-01

    Computerized multistage testing (MST) designs require sets of test questions (testlets) to be assembled to meet strict, often competing criteria. Rules that govern testlet assembly may dictate the number of questions on a particular subject or may describe desirable statistical properties for the test, such as measurement precision. In an MST…

  3. Creating IRT-Based Parallel Test Forms Using the Genetic Algorithm Method

    ERIC Educational Resources Information Center

    Sun, Koun-Tem; Chen, Yu-Jen; Tsai, Shu-Yen; Cheng, Chien-Fen

    2008-01-01

    In educational measurement, the construction of parallel test forms is often a combinatorial optimization problem that involves the time-consuming selection of items to construct tests having approximately the same test information functions (TIFs) and constraints. This article proposes a novel method, genetic algorithm (GA), to construct parallel…

  4. A Three-Stage Enhanced Reactive Power and Voltage Optimization Method for High Penetration of Solar

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ke, Xinda; Huang, Renke; Vallem, Mallikarjuna R.

    This paper presents a three-stage enhanced volt/var optimization method to stabilize voltage fluctuations in transmission networks by optimizing the usage of reactive power control devices. In contrast with existing volt/var optimization algorithms, the proposed method optimizes the voltage profiles of the system, while keeping the voltage and real power output of the generators as close to the original scheduling values as possible. This allows the method to accommodate realistic power system operation and market scenarios, in which the original generation dispatch schedule will not be affected. The proposed method was tested and validated on a modified IEEE 118-bus system with photovoltaic data.

  5. Simulator for multilevel optimization research

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Young, K. C.

    1986-01-01

    A computer program designed to simulate and improve multilevel optimization techniques is described. By using simple analytic functions to represent complex engineering analyses, the simulator can generate and test a large variety of multilevel decomposition strategies in a relatively short time. This type of research is an essential step toward routine optimization of large aerospace systems. The paper discusses the types of optimization problems handled by the simulator and gives input and output listings and plots for a sample problem. It also describes multilevel implementation techniques which have value beyond the present computer program. Thus, this document serves as a user's manual for the simulator and as a guide for building future multilevel optimization applications.

  6. A Rigorous Framework for Optimization of Expensive Functions by Surrogates

    NASA Technical Reports Server (NTRS)

    Booker, Andrew J.; Dennis, J. E., Jr.; Frank, Paul D.; Serafini, David B.; Torczon, Virginia; Trosset, Michael W.

    1998-01-01

    The goal of the research reported here is to develop rigorous optimization algorithms to apply to some engineering design problems for which direct application of traditional optimization approaches is not practical. This paper presents and analyzes a framework for generating a sequence of approximations to the objective function and managing the use of these approximations as surrogates for optimization. The result is convergence to a minimizer of an expensive objective function subject to simple constraints. The approach is widely applicable because it does not require, or even explicitly approximate, derivatives of the objective. Numerical results are presented for a 31-variable helicopter rotor blade design example and for a standard optimization test example.
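    The flavor of surrogate management can be sketched with a 1-D toy: a derivative-free pattern search whose poll points are ranked by a cheap interpolant of past evaluations, so the expensive function is tried at the most promising candidate first. The objective, interpolant, and step schedule below are illustrative assumptions, not the framework in the paper:

```python
def expensive(x):
    """Stand-in for an expensive simulation: a 1-D bowl with minimum 0.5 at x = 1.7."""
    return (x - 1.7) ** 2 + 0.5

def surrogate(x, data):
    """Cheap inverse-distance-weighted interpolant of the points evaluated so far."""
    num = den = 0.0
    for xi, fi in data:
        d = abs(x - xi)
        if d < 1e-12:
            return fi
        w = 1.0 / d ** 2
        num += w * fi
        den += w
    return num / den

def surrogate_search(x0=0.0, step=1.0, iters=40):
    """Pattern search managed by a surrogate: candidates are polled in the
    order the surrogate ranks them; the pattern contracts on failure."""
    data = [(x0, expensive(x0))]
    x, fx = data[0]
    for _ in range(iters):
        cands = sorted([x - step, x + step], key=lambda c: surrogate(c, data))
        improved = False
        for c in cands:
            fc = expensive(c)           # true (expensive) evaluation
            data.append((c, fc))
            if fc < fx:
                x, fx, improved = c, fc, True
                break
        if not improved:
            step *= 0.5                 # contract the pattern on failure
    return x, fx

x_best, f_best = surrogate_search()
```

    Because acceptance is decided only by true evaluations, the surrogate can be arbitrarily crude without breaking convergence; it only affects how many expensive calls are spent.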

  7. Aeroelastic Optimization Study Based on the X-56A Model

    NASA Technical Reports Server (NTRS)

    Li, Wesley W.; Pak, Chan-Gi

    2014-01-01

    One way to increase aircraft fuel efficiency is to reduce structural weight while maintaining adequate structural airworthiness, both statically and aeroelastically. A design process which incorporates the object-oriented multidisciplinary design, analysis, and optimization (MDAO) tool and the aeroelastic effects of high fidelity finite element models to characterize the design space was successfully developed and established. This paper presents two multidisciplinary design optimization studies using an object-oriented MDAO tool developed at NASA Armstrong Flight Research Center. The first study demonstrates the use of aeroelastic tailoring concepts to minimize the structural weight while meeting the design requirements, including strength, buckling, and flutter. Such an approach exploits the anisotropic capabilities of the fiber composite materials chosen for this analytical exercise through the ply stacking sequence. A hybrid optimization approach with discretization improves the accuracy and computational efficiency of the global optimization algorithm. The second study presents a flutter mass balancing optimization study for the fabricated flexible wing of the X-56A model, since a desired flutter speed band is required for the active flutter suppression demonstration during flight testing. The results of the second study provide guidance to modify the wing design and move the design flutter speeds back into the flight envelope so that the original objective of the X-56A flight test can be accomplished successfully. The second case also demonstrates that the object-oriented MDAO tool can handle multiple analytical configurations in a single optimization run.

  8. A Study of Remitted and Treatment-Resistant Depression Using MMPI and Including Pessimism and Optimism Scales

    PubMed Central

    Suzuki, Masatoshi; Takahashi, Michio; Muneoka, Katsumasa; Sato, Koichi; Hashimoto, Kenji; Shirayama, Yukihiko

    2014-01-01

    Background The psychological aspects of treatment-resistant and remitted depression are not well documented. Methods We administered the Minnesota Multiphasic Personality Inventory (MMPI) to patients with treatment-resistant depression (n = 34), remitted depression (n = 25), acute depression (n = 21), and healthy controls (n = 64). Pessimism and optimism were also evaluated by MMPI. Results ANOVA and post-hoc tests demonstrated that patients with treatment-resistant and acute depression showed similarly high scores on the infrequency (F), hypochondriasis, depression, conversion hysteria, psychopathic deviate, paranoia, psychasthenia and schizophrenia scales of the MMPI compared with normal controls. Patients with treatment-resistant depression, but not acute depression, registered high on the 'cannot say' scale. Using Student's t-test, patients with remitted depression registered higher on the depression and social introversion scales, compared with normal controls. For pessimism and optimism, patients with treatment-resistant depression demonstrated similar changes to acutely depressed patients. Remitted depression patients showed lower optimism than normal controls by Student's t-test, even though these patients were deemed recovered from depression using HAM-D. Conclusions The patients with remitted depression and treatment-resistant depression showed subtle alterations on the MMPI, which may explain the hidden psychological features in these cohorts. PMID:25279466

  9. Large-area triple-junction a-Si alloy production scaleup. Annual subcontract report, 17 March 1993--18 March 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oswald, R.; Morris, J.

    1994-11-01

    The objective of this subcontract over its three-year duration is to advance Solarex's photovoltaic manufacturing technologies, reduce its a-Si:H module production costs, increase module performance and expand the Solarex commercial production capacity. Solarex shall meet these objectives by improving the deposition and quality of the transparent front contact, by optimizing the laser patterning process, scaling up the semiconductor deposition process, improving the back contact deposition, and scaling up and improving the encapsulation and testing of its a-Si:H modules. In the Phase 2 portion of this subcontract, Solarex focused on improving deposition of the front contact, investigating alternate feedstocks for the front contact, maximizing throughput and area utilization for all laser scribes, optimizing a-Si:H deposition equipment to achieve uniform deposition over large areas, optimizing the triple-junction module fabrication process, evaluating the materials to deposit the rear contact, and optimizing the combination of isolation scribe and encapsulant to pass the wet high-potential test. Progress is reported on the following: Front contact development; Laser scribe process development; Amorphous silicon based semiconductor deposition; Rear contact deposition process; Frit/bus/wire/frame; Materials handling; and Environmental test, yield and performance analysis.

  10. Methodology for sensitivity analysis, approximate analysis, and design optimization in CFD for multidisciplinary applications. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1992-01-01

    Fundamental equations of aerodynamic sensitivity analysis and approximate analysis for the two dimensional thin layer Navier-Stokes equations are reviewed, and special boundary condition considerations necessary to apply these equations to isolated lifting airfoils on 'C' and 'O' meshes are discussed in detail. An efficient strategy which is based on the finite element method and an elastic membrane representation of the computational domain is successfully tested, which circumvents the costly 'brute force' method of obtaining grid sensitivity derivatives, and is also useful in mesh regeneration. The issue of turbulence modeling is addressed in a preliminary study. Aerodynamic shape sensitivity derivatives are efficiently calculated, and their accuracy is validated on two viscous test problems, including: (1) internal flow through a double throat nozzle, and (2) external flow over a NACA 4-digit airfoil. An automated aerodynamic design optimization strategy is outlined which includes the use of a design optimization program, an aerodynamic flow analysis code, an aerodynamic sensitivity and approximate analysis code, and a mesh regeneration and grid sensitivity analysis code. Application of the optimization methodology to the two test problems in each case resulted in a new design having a significantly improved performance in the aerodynamic response of interest.

  11. Threshold matrix for digital halftoning by genetic algorithm optimization

    NASA Astrophysics Data System (ADS)

    Alander, Jarmo T.; Mantere, Timo J.; Pyylampi, Tero

    1998-10-01

    Digital halftoning is used both in low and high resolution high quality printing technologies. Our method is designed to be mainly used for low resolution ink jet marking machines to produce both gray tone and color images. The main problem with digital halftoning is pink noise caused by the human eye's visual transfer function. To compensate for this the random dot patterns used are optimized to contain more blue than pink noise. Several such dot pattern generator threshold matrices have been created automatically by using genetic algorithm optimization, a non-deterministic global optimization method imitating natural evolution and genetics. A hybrid of genetic algorithm with a search method based on local backtracking was developed together with several fitness functions evaluating dot patterns for rectangular grids. By modifying the fitness function, a family of dot generators results, each with its particular statistical features. Several versions of genetic algorithms, backtracking and fitness functions were tested to find a reasonable combination. The generated threshold matrices have been tested by simulating a set of test images using the Khoros image processing system. Even though the work was focused on developing low resolution marking technology, the resulting family of dot generators can be applied also in other halftoning application areas including high resolution printing technology.
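    The basic mechanism such a threshold matrix drives can be shown in a few lines: each pixel is compared against a tiled matrix entry, producing an ink/no-ink decision. A standard 4x4 Bayer matrix stands in here for the GA-optimized, blue-noise matrices the paper generates:

```python
# A 4x4 Bayer matrix stands in for the GA-optimized threshold matrix.
BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def halftone(gray, matrix=BAYER4):
    """Threshold each pixel (0..255) against the tiled matrix: 1 = ink dot."""
    n = len(matrix)
    scale = 256 // (n * n)   # map the n*n matrix levels onto the 0..255 range
    return [
        [1 if gray[y][x] < (matrix[y % n][x % n] + 0.5) * scale else 0
         for x in range(len(gray[0]))]
        for y in range(len(gray))
    ]

# A flat mid-gray patch should produce roughly 50% ink coverage.
patch = [[128] * 8 for _ in range(8)]
dots = halftone(patch)
coverage = sum(map(sum, dots)) / 64
```

    The GA in the paper searches over much larger matrices and scores candidate dot patterns by their noise spectrum; only the thresholding step itself is shown here.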

  12. A Novel Quantum-Behaved Bat Algorithm with Mean Best Position Directed for Numerical Optimization

    PubMed Central

    Zhu, Wenyong; Liu, Zijuan; Duan, Qingyan; Cao, Long

    2016-01-01

    This paper proposes a novel quantum-behaved bat algorithm with the direction of the mean best position (QMBA). In QMBA, the position of each bat is updated mainly by the current optimal solution in the early stage of the search, while in the late stage it also depends on the mean best position, which enhances the convergence speed of the algorithm. During the search, quantum behavior of the bats is introduced, which helps them jump out of local optima and gives the algorithm a better ability to adapt to complex environments. Meanwhile, QMBA makes good use of the statistical information of the best positions that the bats have experienced to generate better-quality solutions. This approach not only inherits the quick convergence, simplicity, and easy implementation of the original bat algorithm, but also increases the diversity of the population and improves the accuracy of the solutions. Twenty-four benchmark test functions are tested and compared with other bat algorithm variants for numerical optimization; the simulation results show that this approach is simple and efficient and can achieve more accurate solutions. PMID:27293424
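    The mean-best-position idea can be sketched with a QPSO-style toy on a simple test function. This is an assumed, simplified update rule for illustration, not the paper's QMBA (which also involves bat frequency and loudness parameters):

```python
import math
import random

def sphere(x):
    """Toy objective with minimum 0 at the origin."""
    return sum(v * v for v in x)

def qmba_sketch(dim=3, n=15, iters=200, seed=3):
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    P = [x[:] for x in X]                    # personal best positions
    g = min(P, key=sphere)[:]                # global best
    for t in range(iters):
        # Mean best position over all personal bests guides the search.
        mbest = [sum(p[d] for p in P) / n for d in range(dim)]
        beta = 1.0 - 0.5 * t / iters         # contraction coefficient 1.0 -> 0.5
        for i in range(n):
            for d in range(dim):
                phi = rng.random()
                attract = phi * P[i][d] + (1 - phi) * g[d]
                u = rng.random() or 1e-12    # guard against log(1/0)
                step = beta * abs(mbest[d] - X[i][d]) * math.log(1.0 / u)
                X[i][d] = attract + step if rng.random() < 0.5 else attract - step
            if sphere(X[i]) < sphere(P[i]):
                P[i] = X[i][:]
                if sphere(P[i]) < sphere(g):
                    g = P[i][:]
    return g

best = qmba_sketch()
```

    The long-tailed log(1/u) step is what lets occasional bats jump far from the attractor and escape local optima, while the shrinking beta contracts the swarm over time.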

  13. A Sensor Dynamic Measurement Error Prediction Model Based on NAPSO-SVM.

    PubMed

    Jiang, Minlan; Jiang, Lan; Jiang, Dingde; Li, Fei; Song, Houbing

    2018-01-15

    Dynamic measurement error correction is an effective way to improve sensor precision. Dynamic measurement error prediction is an important part of error correction, and support vector machines (SVMs) are often used for predicting the dynamic measurement errors of sensors. Traditionally, the SVM parameters were set manually, which cannot ensure the model's performance. In this paper, an SVM method based on an improved particle swarm optimization algorithm (NAPSO) is proposed to predict the dynamic measurement errors of sensors. Natural selection and simulated annealing are added to the PSO to improve its ability to avoid local optima. To verify the performance of NAPSO-SVM, three algorithms are used to optimize the SVM's parameters: the particle swarm optimization algorithm (PSO), the improved PSO algorithm (NAPSO), and the glowworm swarm optimization (GSO). The dynamic measurement error data of two sensors are used as the test data. The root mean squared error and mean absolute percentage error are employed to evaluate the prediction models' performances. The experimental results show that among the three tested algorithms the NAPSO-SVM method has better prediction precision and smaller prediction errors, and is an effective method for predicting the dynamic measurement errors of sensors.

  14. Microhard MHX2420 Orbital Performance Evaluation Using RT Logic T400CS

    NASA Technical Reports Server (NTRS)

    TintoreGazulla, Oriol; Lombardi, Mark

    2012-01-01

    RT Logic allows simulation of ground station to satellite communications: Static tests have been successful. Dynamic tests have been performed for simple passes. Future dynamic tests are needed to simulate real orbit communications. Satellite attitude changes antenna gain. Atmospheric and rain losses need to be added. The STK Plug-in will be the next step to improve the dynamic tests. There is a possibility of running longer simulations. Simulation of different losses is available in the STK Plug-in. Microhard optimization: The effect of Microhard settings on data throughput has been characterized. Optimized settings improve data throughput for LEO communications. Longer hop intervals make transfer of larger packets more efficient (more time between hops in frequency). Use of FEC (Reed-Solomon) reduces the number of retransmissions for long-range or noisy communications.

  15. Genetic Algorithm Optimization of a Cost Competitive Hybrid Rocket Booster

    NASA Technical Reports Server (NTRS)

    Story, George

    2015-01-01

    Performance, reliability and cost have always been drivers in the rocket business. Hybrid rockets have been late entries into the launch business due to substantial early development work on liquid rockets and solid rockets. Slowly the technology readiness level of hybrids has been increasing due to various large scale testing and flight tests of hybrid rockets. One remaining issue is the cost of hybrids versus the existing launch propulsion systems. This paper will review the known state-of-the-art hybrid development work to date and incorporate it into a genetic algorithm to optimize the configuration based on various parameters. A cost module will be incorporated to the code based on the weights of the components. The design will be optimized on meeting the performance requirements at the lowest cost.

  16. Modal Test/Analysis Correlation of Space Station Structures Using Nonlinear Sensitivity

    NASA Technical Reports Server (NTRS)

    Gupta, Viney K.; Newell, James F.; Berke, Laszlo; Armand, Sasan

    1992-01-01

    The modal correlation problem is formulated as a constrained optimization problem for validation of finite element models (FEM's). For large-scale structural applications, a pragmatic procedure for substructuring, model verification, and system integration is described to achieve effective modal correlation. The space station substructure FEM's are reduced using Lanczos vectors and integrated into a system FEM using Craig-Bampton component modal synthesis. The optimization code is interfaced with MSC/NASTRAN to solve the problem of modal test/analysis correlation; that is, the problem of validating FEM's for launch and on-orbit coupled loads analysis against experimentally observed frequencies and mode shapes. An iterative perturbation algorithm is derived and implemented to update nonlinear sensitivity (derivatives of eigenvalues and eigenvectors) during optimizer iterations, which reduced the number of finite element analyses.
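    The nonlinear sensitivities updated during the optimizer iterations are eigenvalue and eigenvector derivatives. One standard form of the eigenvalue derivative for mass-normalized modes (the classical result for the generalized eigenproblem; not necessarily the exact formulation used in the paper) is:

```latex
% Generalized eigenproblem (K - \lambda_i M)\,\phi_i = 0, with modes
% mass-normalized so that \phi_i^{T} M \phi_i = 1. For a design parameter p:
\frac{\partial \lambda_i}{\partial p}
  = \phi_i^{T}\!\left(\frac{\partial K}{\partial p}
      - \lambda_i \,\frac{\partial M}{\partial p}\right)\phi_i
```

    Updating such derivatives by perturbation inside the optimization loop, rather than re-solving the eigenproblem at every design change, is what reduces the number of full finite element analyses.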

  17. Modal test/analysis correlation of Space Station structures using nonlinear sensitivity

    NASA Technical Reports Server (NTRS)

    Gupta, Viney K.; Newell, James F.; Berke, Laszlo; Armand, Sasan

    1992-01-01

    The modal correlation problem is formulated as a constrained optimization problem for validation of finite element models (FEM's). For large-scale structural applications, a pragmatic procedure for substructuring, model verification, and system integration is described to achieve effective modal correlations. The space station substructure FEM's are reduced using Lanczos vectors and integrated into a system FEM using Craig-Bampton component modal synthesis. The optimization code is interfaced with MSC/NASTRAN to solve the problem of modal test/analysis correlation; that is, the problem of validating FEM's for launch and on-orbit coupled loads analysis against experimentally observed frequencies and mode shapes. An iterative perturbation algorithm is derived and implemented to update nonlinear sensitivity (derivatives of eigenvalues and eigenvectors) during optimizer iterations, which reduced the number of finite element analyses.

  18. Genetic Algorithm Optimization of a Cost Competitive Hybrid Rocket Booster

    NASA Technical Reports Server (NTRS)

    Story, George

    2014-01-01

    Performance, reliability and cost have always been drivers in the rocket business. Hybrid rockets have been late entries into the launch business due to substantial early development work on liquid rockets and, later, on solid rockets. Slowly the technology readiness level of hybrids has been increasing due to various large scale testing and flight tests of hybrid rockets. A remaining issue is the cost of hybrids versus the existing launch propulsion systems. This paper will review the known state-of-the-art hybrid development work to date and incorporate it into a genetic algorithm to optimize the configuration based on various parameters. A cost module will be incorporated into the code based on the weights of the components. The design will be optimized on meeting the performance requirements at the lowest cost.

  19. A Novel Particle Swarm Optimization Algorithm for Global Optimization

    PubMed Central

    Wang, Chun-Feng; Liu, Kui

    2016-01-01

    Particle Swarm Optimization (PSO) is a recently developed optimization method which has attracted the interest of researchers in various areas due to its simplicity and effectiveness, and many variants have been proposed. In this paper, a novel Particle Swarm Optimization algorithm is presented, in which the information of the best neighbor of each particle and the best particle of the entire population in the current iteration is considered. Meanwhile, to avoid premature convergence, an abandonment mechanism is used. Furthermore, to improve the global convergence speed of the algorithm, a chaotic search is adopted around the best solution of the current iteration. To verify the performance of the algorithm, standard test functions have been employed. The experimental results show that the algorithm is more robust and efficient than several existing Particle Swarm Optimization algorithms. PMID:26955387
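    A minimal PSO sketch shows how a best-neighbor term fits into the velocity update alongside the personal and global bests. The ring topology, coefficients, and objective below are conventional illustrative choices, not the paper's exact variant (the abandonment mechanism and chaotic search are omitted):

```python
import random

def f(x):
    """Toy objective: shifted sphere with minimum 0 at (0.5, 0.5)."""
    return sum((v - 0.5) ** 2 for v in x)

def pso(dim=2, n=20, iters=200, seed=7):
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [x[:] for x in X]                       # personal bests
    g = min(P, key=f)[:]                        # global best
    w, c1, c2, c3 = 0.7, 1.2, 1.2, 0.6          # inertia + attraction weights
    for _ in range(iters):
        for i in range(n):
            # Best ring neighbor: the better of the two adjacent particles.
            nb = min(P[(i - 1) % n], P[(i + 1) % n], key=f)
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (g[d] - X[i][d])
                           + c3 * rng.random() * (nb[d] - X[i][d]))
                X[i][d] += V[i][d]
            if f(X[i]) < f(P[i]):
                P[i] = X[i][:]
                if f(P[i]) < f(g):
                    g = P[i][:]
    return g

g_best = pso()
```

    The neighbor term pulls each particle toward locally good regions before the whole swarm commits to the global best, which is one common way to slow premature convergence.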

  20. A Power Transformers Fault Diagnosis Model Based on Three DGA Ratios and PSO Optimization SVM

    NASA Astrophysics Data System (ADS)

    Ma, Hongzhe; Zhang, Wei; Wu, Rongrong; Yang, Chunyan

    2018-03-01

    In order to make up for the shortcomings of existing transformer fault diagnosis methods in dissolved gas-in-oil analysis (DGA) feature selection and parameter optimization, a transformer fault diagnosis model based on three DGA ratios and a support vector machine (SVM) optimized by particle swarm optimization (PSO) is proposed. The SVM is extended to nonlinear, multi-class classification, PSO is used to optimize the parameters of the multi-class SVM model, and transformer fault diagnosis is conducted under the cross-validation principle. The fault diagnosis results show that the average accuracy of the proposed method is better than that of the standard SVM and of a genetic-algorithm-optimized SVM, demonstrating that the proposed method can effectively improve the accuracy of transformer fault diagnosis.

  1. Optimization of Wireless Power Transfer Systems Enhanced by Passive Elements and Metasurfaces

    NASA Astrophysics Data System (ADS)

    Lang, Hans-Dieter; Sarris, Costas D.

    2017-10-01

    This paper presents a rigorous optimization technique for wireless power transfer (WPT) systems enhanced by passive elements, ranging from simple reflectors and intermediate relays all the way to general electromagnetic guiding and focusing structures, such as metasurfaces and metamaterials. At its core is a convex semidefinite relaxation formulation of the otherwise nonconvex optimization problem, whose tightness and optimality can be confirmed by a simple test of its solutions. The resulting method is rigorous, versatile, and general -- it does not rely on any assumptions. As shown in various examples, it is able to efficiently and reliably optimize such WPT systems in order to find their physical performance limits and optimal operating parameters and to inspect their working principles, even for a large number of active transmitters and passive elements.

  2. The optimal design of UAV wing structure

    NASA Astrophysics Data System (ADS)

    Długosz, Adam; Klimek, Wiktor

    2018-01-01

    The paper presents an optimal design of a UAV wing made of composite materials. The aim of the optimization is to improve strength and stiffness together with a reduction of the weight of the structure. Three different types of functionals, which depend on stress, stiffness and the total mass, are defined. The paper presents an application of an in-house implementation of an evolutionary multi-objective algorithm to the optimization of the UAV wing structure. Values of the functionals are calculated on the basis of results obtained from numerical simulations. A numerical FEM model consisting of different composite materials is created. The adequacy of the numerical model is verified against results obtained from an experiment performed on a tensile testing machine. Examples of multi-objective optimization by means of a Pareto-optimal set of solutions are presented.

  3. Optimal Limited Contingency Planning

    NASA Technical Reports Server (NTRS)

    Meuleau, Nicolas; Smith, David E.

    2003-01-01

    For a given problem, the optimal Markov policy over a finite horizon is a conditional plan containing a potentially large number of branches. However, there are applications where it is desirable to strictly limit the number of decision points and branches in a plan. This raises the question of how one goes about finding optimal plans containing only a limited number of branches. In this paper, we present an any-time algorithm for optimal k-contingency planning. It is the first optimal algorithm for limited contingency planning that is not an explicit enumeration of possible contingent plans. By modelling the problem as a partially observable Markov decision process, it implements the Bellman optimality principle and prunes the solution space. We present experimental results of applying this algorithm to some simple test cases.

  4. Heart Rate Dynamics During A Treadmill Cardiopulmonary Exercise Test in Optimized Beta-Blocked Heart Failure Patients

    PubMed Central

    Carvalho, Vitor Oliveira; Guimarães, Guilherme Veiga; Ciolac, Emmanuel Gomes; Bocchi, Edimar Alcides

    2008-01-01

    BACKGROUND Calculating the maximum heart rate for age is one method to characterize the maximum effort of an individual. Although this method is commonly used, little is known about heart rate dynamics in optimized beta-blocked heart failure patients. AIM The aim of this study was to evaluate heart rate dynamics (basal, peak and % heart rate increase) in optimized beta-blocked heart failure patients compared to sedentary, normal individuals (controls) during a treadmill cardiopulmonary exercise test. METHODS Twenty-five heart failure patients (49±11 years, 76% male), with an average LVEF of 30±7%, and fourteen controls were included in the study. Patients with atrial fibrillation, a pacemaker or noncardiovascular functional limitations or whose drug therapy was not optimized were excluded. Optimization was considered to be 50 mg/day or more of carvedilol, with a basal heart rate between 50 to 60 bpm that was maintained for 3 months. RESULTS Basal heart rate was lower in heart failure patients (57±3 bpm) compared to controls (89±14 bpm; p<0.0001). Similarly, the peak heart rate (% maximum predicted for age) was lower in HF patients (65.4±11.1%) compared to controls (98.6±2.2; p<0.0001). Maximum respiratory exchange ratio did not differ between the groups (1.2±0.5 for controls and 1.15±1 for heart failure patients; p=0.42). All controls reached the maximum heart rate for their age, while no patients in the heart failure group reached the maximum. Moreover, the % increase of heart rate from rest to peak exercise between heart failure (48±9%) and control (53±8%) was not different (p=0.157). CONCLUSION No patient in the heart failure group reached the maximum heart rate for their age during a treadmill cardiopulmonary exercise test, despite the fact that the percentage increase of heart rate was similar to sedentary normal subjects. 
A heart rate increase in optimized beta-blocked heart failure patients during cardiopulmonary exercise test over 65% of the maximum age-adjusted value should be considered an effort near the maximum. This information may be useful in rehabilitation programs and ischemic tests, although further studies are required. PMID:18719758
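    A small worked check of the quantities reported above, assuming the common age-predicted maximum of 220 - age. Note that with the HF group means (basal 57 bpm, age 49, peak at 65.4% of predicted), the reported ~48% rise is reproduced only if the increase is expressed relative to peak rather than basal heart rate, which is the convention assumed here:

```python
def pct_of_predicted_max(peak_hr, age):
    """Peak HR as a percentage of the common age-predicted maximum (220 - age)."""
    return 100.0 * peak_hr / (220 - age)

def pct_increase(basal_hr, peak_hr):
    """Heart rate rise from rest to peak, expressed relative to peak."""
    return 100.0 * (peak_hr - basal_hr) / peak_hr

# Back-calculate the HF group's mean peak HR from the reported 65.4% of predicted.
peak = 0.654 * (220 - 49)          # ~112 bpm for the mean age of 49
rise = pct_increase(57, peak)      # falls inside the reported 48 +/- 9% band
```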

  5. Counteracting Obstacles with Optimistic Predictions

    ERIC Educational Resources Information Center

    Zhang, Ying; Fishbach, Ayelet

    2010-01-01

    This research tested for counteractive optimism: a self-control strategy of generating optimistic predictions of future goal attainment in order to overcome anticipated obstacles in goal pursuit. In support of the counteractive optimism model, participants in 5 studies predicted better performance, more time invested in goal activities, and lower…

  6. Braided Composite Technologies for Rotorcraft Structures

    NASA Technical Reports Server (NTRS)

    Jessie, Nathan

    2015-01-01

    A&P Technology has developed a braided material approach for fabricating lightweight, high-strength hybrid gears for aerospace drive systems. The conventional metallic web was replaced with a composite element made from A&P's quasi-isotropic braid. The 0deg, +/-60deg braid architecture was chosen so that inplane stiffness properties and strength would be nearly equal in all directions. The test results from the Phase I Small Spur Gear program demonstrated satisfactory endurance and strength while providing a 20 percent weight savings. (Greater weight savings is anticipated with structural optimization.) The hybrid gears were subjected to a proof-of-concept test of 1 billion cycles in a gearbox at 10,000 revolutions per minute and 490 in-lb torque with no detectable damage to the gears. After this test the maximum torque capability was also tested, and the static strength capability of the gears was 7x the maximum operating condition. Additional proof-of-concept tests are in progress using a higher oil temperature, and a loss-of-oil test is planned. The success of Phase I led to a Phase II program to develop, fabricate, and optimize full-scale gears, specifically Bull Gears. The design of these Bull Gears will be refined using topology optimization, and the full-scale Bull Gears will be tested in a full-scale gear rig. The testing will quantify benefits of weight savings, as well as noise and vibration reduction. The expectation is that vibration and noise will be reduced through the introduction of composite material in the vibration transmission path between the contacting gear teeth and the shaft-and-bearing system.

  7. Braided Composite Technologies for Rotorcraft Structures

    NASA Technical Reports Server (NTRS)

    Jessie, Nathan

    2014-01-01

    A&P Technology has developed a braided material approach for fabricating lightweight, high-strength hybrid gears for aerospace drive systems. The conventional metallic web was replaced with a composite element made from A&P's quasi-isotropic braid. The 0deg, plus or minus 60 deg braid architecture was chosen so that inplane stiffness properties and strength would be nearly equal in all directions. The test results from the Phase I Small Spur Gear program demonstrated satisfactory endurance and strength while providing a 20 percent weight savings. (Greater weight savings is anticipated with structural optimization.) The hybrid gears were subjected to a proof-of-concept test of 1 billion cycles in a gearbox at 10,000 revolutions per minute and 490 in-lb torque with no detectable damage to the gears. After this test the maximum torque capability was also tested, and the static strength capability of the gears was 7x the maximum operating condition. Additional proof-of-concept tests are in progress using a higher oil temperature, and a loss-of-oil test is planned. The success of Phase I led to a Phase II program to develop, fabricate, and optimize full-scale gears, specifically Bull Gears. The design of these Bull Gears will be refined using topology optimization, and the full-scale Bull Gears will be tested in a full-scale gear rig. The testing will quantify benefits of weight savings, as well as noise and vibration reduction. The expectation is that vibration and noise will be reduced through the introduction of composite material in the vibration transmission path between the contacting gear teeth and the shaft-and-bearing system.

  8. Commercialization of Medium Voltage HTS Triax TM Cable Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knoll, David

    2012-12-31

    The original project scope that was established in 2007 aimed to install a 1,700 meter (1.1 mile) medium voltage HTS Triax{TM} cable system into the utility grid in New Orleans, LA. In 2010, however, the utility partner withdrew from the project, so the 1,700 meter cable installation was cancelled and the scope of work was reduced. The work then concentrated on the specific barriers to commercialization of HTS cable technology. The modified scope included long-length HTS cable design and testing, high voltage factory test development, optimized cooling system development, and HTS cable life-cycle analysis. In 2012, Southwire again analyzed the market for HTS cables and deemed the near term market acceptance to be low. The scope of work was further reduced to the completion of tasks already started and to testing of the existing HTS cable system in Columbus, OH. The work completed under the project included: • Long-length cable modeling and analysis • HTS wire evaluation and testing • Cable testing for AC losses • Optimized cooling system design • Life cycle testing of the HTS cable in Columbus, OH • Project management. The 200 meter long HTS Triax{TM} cable in Columbus, OH was incorporated into the project under the initial scope changes as a test bed for life cycle testing as well as the site for an optimized HTS cable cooling system. The Columbus cable utilizes the HTS Triax{TM} design, so it provided an economical tool for these project tasks.

  9. Seasonal performance of a malaria rapid diagnosis test at community health clinics in a malaria-hyperendemic region of Burkina Faso

    PubMed Central

    2012-01-01

    Background Treatment of confirmed malaria patients with Artemisinin-based Combination Therapy (ACT) in remote areas is the goal of many anti-malaria programs. Introduction of an effective and affordable malaria Rapid Diagnosis Test (RDT) in remote areas could be an alternative tool for malaria case management. This study aimed to assess the performance of the OptiMAL dipstick for rapid malaria diagnosis in children under five. Methods Malaria symptomatic and asymptomatic children were recruited in a passive manner in two community clinics (CCs). Malaria diagnosis by microscopy and RDT was performed, and the performance of the tests was determined. Results The RDT showed a similar ability (61.2%) to accurately diagnose malaria as microscopy (61.1%). OptiMAL showed a high level of sensitivity and specificity compared with microscopy during both transmission seasons (high and low), with a sensitivity of 92.9% vs. 74.9% and a specificity of 77.2% vs. 87.5%. Conclusion By improving the performance of the test through accurate and continuous quality control of the device in the field, OptiMAL could be suitable for use at CCs for the management and control of malaria. PMID:22647557
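
    The sensitivity and specificity figures quoted above follow directly from a 2×2 confusion matrix scored against the microscopy reference. A minimal sketch, using hypothetical counts (not data from the study):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity and specificity of a test against a reference standard."""
    sensitivity = tp / (tp + fn)   # true positives among reference-positives
    specificity = tn / (tn + fp)   # true negatives among reference-negatives
    return sensitivity, specificity

# Hypothetical counts for an RDT scored against microscopy
sens, spec = diagnostic_metrics(tp=93, fp=23, fn=7, tn=77)
print(round(sens, 3), round(spec, 3))  # → 0.93 0.77
```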

  10. Clinical Performance of an Ultrahigh Resolution Chromosomal Microarray Optimized for Neurodevelopmental Disorders.

    PubMed

    Ho, Karen S; Twede, Hope; Vanzo, Rena; Harward, Erin; Hensel, Charles H; Martin, Megan M; Page, Stephanie; Peiffer, Andreas; Mowery-Rushton, Patricia; Serrano, Moises; Wassman, E Robert

    2016-01-01

    Copy number variants (CNVs) detected by chromosomal microarray analysis (CMA) contribute significantly to the etiology of neurodevelopmental disorders, such as developmental delay (DD), intellectual disability (ID), and autism spectrum disorder (ASD). This study summarizes the results of 3.5 years of CMA testing by a CLIA-certified clinical testing laboratory. A total of 5487 patients with neurodevelopmental conditions were clinically evaluated for rare copy number variants using a 2.8-million-probe custom CMA optimized for the detection of CNVs associated with neurodevelopmental disorders. We report an overall detection rate of 29.4% in our neurodevelopmental cohort, which rises to nearly 33% when only cases with DD/ID and/or MCA are considered. The detection rate for the ASD cohort is also significant, at 25%. Additionally, we find that the detection rate and pathogenic yield of CMA vary significantly depending on the primary indications for testing, the age of the individuals tested, and the specialty of the ordering doctor. We also report a significant difference between the detection rate of the ultrahigh resolution optimized array and that of the array from which it originated. This increase in detection can contribute significantly to the efficient and effective medical management of neurodevelopmental conditions in the clinic.

  11. Optimization of Friction and Wear Properties of Electroless Ni-P Coatings Under Lubrication Using Grey Fuzzy Logic

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Arkadeb; Duari, Santanu; Barman, Tapan Kumar; Sahoo, Prasanta

    2017-10-01

    The present study evaluates the friction and wear behaviour of electroless Ni-P coatings sliding against hardened chromium-coated steel under lubrication. Tribological tests are carried out on a block-on-roller multi-tribotester. The effects of applied normal load, counterface roller rotation speed, and test duration on the coefficient of friction and wear depth are analyzed using Taguchi's robust design philosophy and design of experiments. The optimal setting of the tribo-testing parameters is determined using a hybrid grey fuzzy reasoning analysis in a quest to achieve optimal tribological performance of the coatings under lubrication. Analysis of variance reveals that applied normal load contributes most to the tribological behaviour under lubrication, while the interaction of load and time also has a significant effect. Surface morphology studies reveal a typical nodular structure of the deposits. The coatings are amorphous in the as-deposited condition and become crystalline on heat treatment. Further, the synergistic effects of test parameters, coating microstructure, and lubrication on the tribological behaviour are assessed.

  12. Updating the Finite Element Model of the Aerostructures Test Wing Using Ground Vibration Test Data

    NASA Technical Reports Server (NTRS)

    Lung, Shun-Fat; Pak, Chan-Gi

    2009-01-01

    Improved and/or accelerated decision making is a crucial step during flutter certification processes. Unfortunately, most finite element structural dynamics models have uncertainties associated with model validity. Tuning the finite element model using measured data to minimize the model uncertainties is a challenging task in the area of structural dynamics. The model tuning process requires not only satisfactory correlations between analytical and experimental results, but also the retention of the mass and stiffness properties of the structures. Minimizing the difference between analytical and experimental results is a type of optimization problem. A multidisciplinary design, analysis, and optimization (MDAO) tool is used to optimize the objective function and constraints so that the mass properties, the natural frequencies, and the mode shapes match the target data while mass matrix orthogonality is retained. This approach has been applied to minimize the model uncertainties for the structural dynamics model of the aerostructures test wing (ATW), which was designed and tested at the National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California). This study has shown that natural frequencies and corresponding mode shapes from the updated finite element model have excellent agreement with corresponding measured data.
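
    The core idea of model updating, tuning a model parameter until the predicted natural frequencies match measured ones, can be illustrated on a toy system. This is not the NASA MDAO tool; it is a minimal sketch on a hypothetical two-degree-of-freedom spring-mass chain where one stiffness is tuned by a brute-force scan:

```python
import numpy as np

def natural_freqs(k1, k2, m1=1.0, m2=1.0):
    """Natural frequencies (rad/s) of a 2-DOF spring-mass chain."""
    K = np.array([[k1 + k2, -k2], [-k2, k2]])
    M = np.diag([m1, m2])
    eigvals = np.linalg.eigvals(np.linalg.inv(M) @ K)
    return np.sort(np.sqrt(eigvals.real))

# "Measured" target frequencies, generated here from k1 = 4.0, k2 = 2.0
target = natural_freqs(4.0, 2.0)

# Update k1 by minimizing the mismatch between predicted and target frequencies
candidates = np.linspace(1.0, 8.0, 701)
errors = [np.sum((natural_freqs(k, 2.0) - target) ** 2) for k in candidates]
k1_updated = candidates[int(np.argmin(errors))]
print(round(k1_updated, 3))  # recovers the stiffness that produced the targets
```

    A real updating problem would also constrain mass properties and mode-shape orthogonality, as the abstract describes, rather than matching frequencies alone.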

  13. Updating the Finite Element Model of the Aerostructures Test Wing using Ground Vibration Test Data

    NASA Technical Reports Server (NTRS)

    Lung, Shun-fat; Pak, Chan-gi

    2009-01-01

    Improved and/or accelerated decision making is a crucial step during flutter certification processes. Unfortunately, most finite element structural dynamics models have uncertainties associated with model validity. Tuning the finite element model using measured data to minimize the model uncertainties is a challenging task in the area of structural dynamics. The model tuning process requires not only satisfactory correlations between analytical and experimental results, but also the retention of the mass and stiffness properties of the structures. Minimizing the difference between analytical and experimental results is a type of optimization problem. A multidisciplinary design, analysis, and optimization (MDAO) tool is used to optimize the objective function and constraints so that the mass properties, the natural frequencies, and the mode shapes match the target data while mass matrix orthogonality is retained. This approach has been applied to minimize the model uncertainties for the structural dynamics model of the Aerostructures Test Wing (ATW), which was designed and tested at the National Aeronautics and Space Administration (NASA) Dryden Flight Research Center (DFRC) (Edwards, California). This study has shown that natural frequencies and corresponding mode shapes from the updated finite element model have excellent agreement with corresponding measured data.

  14. A novel suction/coagulation integrated probe for achieving better hemostasis: development and clinical use.

    PubMed

    Takahashi, Hidekazu; Haraguchi, Naotsugu; Nishimura, Junichi; Hata, Taishi; Matsuda, Chu; Yamamoto, Hirofumi; Mizushima, Tsunekazu; Mori, Masaki; Doki, Yuichiro; Nakajima, Kiyokazu

    2018-06-01

    Modern electrosurgical tools have a specific coagulation mode called "soft coagulation"; however, it has not been widely adopted for surgical operations. To optimize the soft coagulation environment, we developed a novel suction device integrated with an electrosurgical probe, called the "Suction ball coagulator" (SBC). In this study, we aimed to optimize the SBC design through a prototyping process involving bench tests and a preclinical study, and then to demonstrate the feasibility, safety, and potential effectiveness of the SBC for laparoscopic surgery in clinical settings. SBC prototyping was performed with a bench test, and device optimization was performed in a preclinical study with a domestic swine bleeding model. The SBC was then tested in a clinical setting during 17 laparoscopic colorectal surgeries. In the bench tests, two tip hole sizes and patterns showed good suction capacity, and the preclinical study indicated the best tip shape for accuracy. In clinical use, no device-related adverse event was observed. The SBC enabled prompt hemostasis and blunt dissection, and could evacuate vapors generated by tissue ablation with the electrosurgical probe during laparoscopic surgery. We successfully developed a novel, integrated suction/coagulation probe for hemostasis and commercialized it.

  15. Sensitive and specific detection of viable Mycobacterium avium subsp. paratuberculosis in raw milk by the peptide-mediated magnetic separation-phage assay.

    PubMed

    Foddai, A C G; Grant, I R

    2017-05-01

    To validate an optimized peptide-mediated magnetic separation (PMS)-phage assay for the detection of viable Mycobacterium avium subsp. paratuberculosis (MAP) in milk. The inclusivity, specificity, and 50% limit of detection (LOD50) of the optimized PMS-phage assay were assessed. Plaques were obtained for all 43 MAP strains tested. Of 12 other Mycobacterium species tested, only Mycobacterium bovis BCG produced small numbers of plaques. The LOD50 of the PMS-phage assay was 0·93 MAP cells per 50 ml milk, better than both PMS-qPCR and PMS-culture. When individual milks (n = 146) and bulk tank milk (BTM, n = 22) obtained from Johne's-affected herds were tested by the PMS-phage assay, viable MAP were detected in 31 (21·2%) of 146 individual milks and 13 (59·1%) of 22 BTM, with MAP numbers ranging from 6 to 948 plaque-forming units per 50 ml milk. PMS-qPCR and PMS-MGIT culture proved to be less sensitive tests than the PMS-phage assay. The optimized PMS-phage assay is the most sensitive and specific method available for the detection of viable MAP in milk. Further work is needed to streamline the PMS-phage assay, because its multistep format currently makes it unsuitable for adoption by the dairy industry as a screening test. The inclusivity (ability to detect all MAP strains), specificity (ability to detect only MAP) and detection sensitivity (ability to detect low numbers of MAP) of the optimized PMS-phage assay have been comprehensively demonstrated for the first time. © 2017 The Society for Applied Microbiology.

  16. The influence of dispositional optimism and gender on adolescents' perception of academic stress.

    PubMed

    Huan, Vivien S; Yeo, Lay See; Ang, Rebecca P; Chong, Wan Har

    2006-01-01

    This study investigated the role of optimism, together with gender, in students' perception of academic stress. Four hundred and thirty secondary school students from Singapore participated, and data were collected using two self-report measures: the Life Orientation Test and the Academic Expectation Stress Inventory. Results revealed a significant negative relationship between optimism and academic stress. Gender was not a significant predictor of academic stress, and no two-way interactions were found between optimism and gender. Possible explanations for the results are suggested and implications of the findings discussed.

  17. The MusIC method: a fast and quasi-optimal solution to the muscle forces estimation problem.

    PubMed

    Muller, A; Pontonnier, C; Dumont, G

    2018-02-01

    The present paper presents a fast and quasi-optimal method of muscle force estimation: the MusIC method. It consists of interpolating a first estimate from a database generated offline by a classical optimization problem, and then correcting it to respect the motion dynamics. Three cost functions - two polynomial criteria and a min/max criterion - were tested on a planar musculoskeletal model. The MusIC method provides a computation frequency approximately 10 times higher than a classical optimization problem, with a relative mean error of 4% on cost function evaluation.
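
    The interpolate-then-correct structure described above can be sketched on a toy force-sharing problem. This is not the authors' implementation; it is a minimal illustration with hypothetical moment arms, where two muscle forces minimizing a quadratic cost must produce a required joint torque:

```python
import numpy as np

r1, r2 = 0.04, 0.05  # moment arms (m), hypothetical values

def optimal_forces(tau):
    """Closed-form minimizer of f1^2 + f2^2 subject to r1*f1 + r2*f2 = tau."""
    scale = tau / (r1**2 + r2**2)
    return np.array([r1 * scale, r2 * scale])

# Offline: build a coarse database of optimal solutions over the torque range
tau_grid = np.linspace(0.0, 10.0, 11)
force_db = np.array([optimal_forces(t) for t in tau_grid])

def music_estimate(tau):
    """Interpolate the database, then correct to satisfy the torque equation."""
    f = np.array([np.interp(tau, tau_grid, force_db[:, i]) for i in range(2)])
    produced = r1 * f[0] + r2 * f[1]
    return f * (tau / produced) if produced else f  # dynamics correction

f = music_estimate(3.7)
print(round(r1 * f[0] + r2 * f[1], 6))  # torque constraint satisfied: 3.7
```

    In this linear toy case the interpolation is already exact; the correction step matters precisely when the offline problem is nonlinear, which is the situation the MusIC method targets.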

  18. DNASynth: a software application to optimization of artificial gene synthesis

    NASA Astrophysics Data System (ADS)

    Muczyński, Jan; Nowak, Robert M.

    2017-08-01

    DNASynth is a client-server software application whose client runs in a web browser. The aim of the program is to support and optimize the process of artificial gene synthesis using the Ligase Chain Reaction (LCR). With LCR it is possible to obtain a DNA strand coding for a user-defined peptide. The DNA sequence is calculated by an optimization algorithm that considers optimal codon usage, minimal energy of secondary structures, and a minimal number of required LCRs. Additionally, the absence of sequences recognized by a user-defined set of restriction enzymes is guaranteed. The presented software was tested on synthetic and real data.
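
    Of the objectives listed above, the codon-usage term is the simplest to illustrate: pick, for each residue, the codon with the highest usage frequency. A minimal sketch with a hypothetical (illustrative, not organism-accurate) usage table; DNASynth additionally optimizes secondary-structure energy and LCR count, which this sketch omits:

```python
# Hypothetical codon-usage frequencies for a few amino acids (illustrative only)
CODON_USAGE = {
    "M": {"ATG": 1.00},
    "K": {"AAA": 0.74, "AAG": 0.26},
    "F": {"TTT": 0.58, "TTC": 0.42},
    "L": {"CTG": 0.47, "TTA": 0.14, "CTT": 0.12},
}

def optimize_codons(peptide):
    """Pick the most frequent codon for each residue (greedy codon optimization)."""
    return "".join(max(CODON_USAGE[aa], key=CODON_USAGE[aa].get) for aa in peptide)

print(optimize_codons("MKFL"))  # → ATGAAATTTCTG
```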

  19. Automation of On-Board Flightpath Management

    NASA Technical Reports Server (NTRS)

    Erzberger, H.

    1981-01-01

    The status of concepts and techniques for the design of onboard flight path management systems is reviewed. Such systems are designed to increase flight efficiency and safety by automating the optimization of flight procedures onboard aircraft. After a brief review of the origins and functions of such systems, two complementary methods are described for attacking the key design problem, namely, the synthesis of efficient trajectories. One method optimizes en route, the other optimizes terminal area flight; both methods are rooted in optimal control theory. Simulation and flight test results are reviewed to illustrate the potential of these systems for fuel and cost savings.

  20. A Surrogate Approach to the Experimental Optimization of Multielement Airfoils

    NASA Technical Reports Server (NTRS)

    Otto, John C.; Landman, Drew; Patera, Anthony T.

    1996-01-01

    The incorporation of experimental test data into the optimization process is accomplished through the use of Bayesian-validated surrogates. In the surrogate approach, a surrogate for the experiment (e.g., a response surface) serves in the optimization process. The validation step of the framework provides a qualitative assessment of the surrogate quality and bounds the surrogate-for-experiment error on designs "near" surrogate-predicted optimal designs. The utility of the framework is demonstrated through its application to the experimental selection of the trailing-edge flap position that achieves a design lift coefficient for a three-element airfoil.
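
    The surrogate idea, fit a cheap response surface to test data and optimize the surface instead of the experiment, can be sketched in a few lines. The data below are synthetic and the quadratic form is an assumption; this is not the paper's Bayesian validation step, only the response-surface half:

```python
import numpy as np

# Synthetic "experimental" lift coefficients vs. flap deflection (illustrative)
x = np.linspace(-5.0, 5.0, 9)            # flap deflection, deg
cl = 1.2 - 0.01 * (x - 1.5) ** 2         # underlying response, peak at 1.5 deg

# Surrogate: quadratic response surface fit to the test data
a, b, c = np.polyfit(x, cl, 2)
x_opt = -b / (2 * a)                      # optimizer of the surrogate
print(round(x_opt, 3))  # → 1.5
```

    In the paper's framework a validation step would then bound how far this surrogate-predicted optimum can be from the true experimental optimum.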

  1. Do Optimal Prognostic Thresholds in Continuous Physiological Variables Really Exist? Analysis of Origin of Apparent Thresholds, with Systematic Review for Peak Oxygen Consumption, Ejection Fraction and BNP

    PubMed Central

    Leong, Tora; Rehman, Michaela B.; Pastormerlo, Luigi Emilio; Harrell, Frank E.; Coats, Andrew J. S.; Francis, Darrel P.

    2014-01-01

    Background Clinicians are sometimes advised to make decisions using thresholds in measured variables, derived from prognostic studies. Objectives We studied why there are conflicting apparently-optimal prognostic thresholds, for example in exercise peak oxygen uptake (pVO2), ejection fraction (EF), and Brain Natriuretic Peptide (BNP) in heart failure (HF). Data Sources and Eligibility Criteria Studies testing pVO2, EF or BNP prognostic thresholds in heart failure, published between 1990 and 2010, listed on PubMed. Methods First, we examined studies testing pVO2, EF or BNP prognostic thresholds. Second, we created repeated simulations of 1500 patients to identify whether an apparently-optimal prognostic threshold indicates a step change in risk. Results 33 studies (8946 patients) tested a pVO2 threshold. 18 found it prognostically significant: the actual reported threshold ranged widely (10–18 ml/kg/min) but was overwhelmingly controlled by the individual study population's mean pVO2 (r = 0.86, p<0.00001). In contrast, the 15 negative publications were testing thresholds 199% further from their means (p = 0.0001). Likewise, of 35 EF studies (10220 patients), the thresholds in the 22 positive reports were strongly determined by study means (r = 0.90, p<0.0001). Similarly, in the 19 positives of 20 BNP studies (9725 patients): r = 0.86 (p<0.0001). Second, survival simulations always discovered a “most significant” threshold, even when there was definitely no step change in mortality. With linear increase in risk, the apparently-optimal threshold was always near the sample mean (r = 0.99, p<0.001). Limitations This study cannot report the best threshold for any of these variables; instead it explains how common clinical research procedures routinely produce false thresholds. Key Findings First, shifting (and/or disappearance) of an apparently-optimal prognostic threshold is strongly determined by studies' average pVO2, EF or BNP. 
Second, apparently-optimal thresholds always appear, even with no step in prognosis. Conclusions Emphatic therapeutic guidance based on thresholds from observational studies may be ill-founded. We should not assume that optimal thresholds, or any thresholds, exist. PMID:24475020
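
    The simulation result, that scanning cutpoints for the "most significant" split always yields a threshold near the sample mean even when risk varies smoothly, is easy to reproduce. A minimal sketch under assumed parameters (a pVO2-like variable, risk falling linearly with no step anywhere), not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1500
x = rng.normal(15.0, 3.0, n)                  # e.g. peak VO2, ml/kg/min
p = np.clip(0.5 - 0.03 * (x - 15.0), 0, 1)    # risk falls linearly with x
event = rng.random(n) < p                      # no step change anywhere

def chi2(threshold):
    """Chi-square statistic of the 2x2 table split at the threshold."""
    low = x < threshold
    a, b = np.sum(low & event), np.sum(low & ~event)
    c, d = np.sum(~low & event), np.sum(~low & ~event)
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

thresholds = np.arange(10.0, 20.0, 0.1)
best = thresholds[int(np.argmax([chi2(t) for t in thresholds]))]
print(round(best, 1), round(x.mean(), 1))  # apparent "optimum" lands near the mean
```

    The "optimal" cutpoint found this way reflects where the data are densest, not any true step in prognosis, which is the paper's central point.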

  2. Static and Dynamic Model Update of an Inflatable/Rigidizable Torus Structure

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Reaves, Mercedes C.

    2006-01-01

    The present work addresses the development of an experimental and computational procedure for validating finite element models. A torus structure, part of an inflatable/rigidizable Hexapod, is used to demonstrate the approach. Because of fabrication, material, and geometric uncertainties, a statistical approach combined with optimization is used to modify key model parameters. Static test results are used to update stiffness parameters and dynamic test results are used to update the mass distribution. Updated parameters are computed using gradient and non-gradient based optimization algorithms. Results show significant improvements in model predictions after the parameters are updated. Lessons learned in the areas of test procedures, modeling approaches, and uncertainty quantification are presented.

  3. New lab-made coagulant based on Schinopsis balansae tannin extract: synthesis optimization and preliminary tests on refractory water pollutants

    NASA Astrophysics Data System (ADS)

    Sánchez-Martín, J.; Beltrán-Heredia, J.; Coco-Rivero, B.

    2014-09-01

    Quebracho colorado tannin extract was used as a coagulant raw material for water and wastewater treatment. The chemical synthesis follows a Mannich reaction mechanism and provides a fully working coagulant that can remove several pollutants from water. This paper addresses the optimization of this synthesis and confirms the feasibility of the coagulant by testing it in a preliminary screening for the elimination of dyes and detergents. The optimum combination of reagents was 6.81 g of diethanolamine (DEA) and 2.78 g of formaldehyde (F) per g of tannin extract. The coagulant thus obtained was successfully tested on the removal of nine dyes and eight detergents.

  4. Implementation and verification of global optimization benchmark problems

    NASA Astrophysics Data System (ADS)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the generation of the value of a function and its gradient at a given point, and of interval estimates of a function and its gradient on a given box, from a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification showed that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
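
    Deriving both a function value and its gradient from a single expression description is the classic forward-mode automatic differentiation trick. The paper's library is C++ and also produces interval bounds; the Python sketch below shows only the value-plus-gradient half, via a minimal dual-number class:

```python
class Dual:
    """Forward-mode dual number: evaluates f(x) and f'(x) from one expression."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def f(x):            # a single description of a (toy) benchmark function
    return x * x + 3 * x + 1

y = f(Dual(2.0, 1.0))   # seed the derivative of x as 1
print(y.val, y.dot)     # → 11.0 7.0
```

    Interval estimates work the same way structurally: overload the operators on interval endpoints instead of dual numbers, and the one description yields guaranteed bounds over a box.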

  5. Increasing the statistical significance of entanglement detection in experiments.

    PubMed

    Jungnitsch, Bastian; Niekamp, Sönke; Kleinmann, Matthias; Gühne, Otfried; Lu, He; Gao, Wei-Bo; Chen, Yu-Ao; Chen, Zeng-Bing; Pan, Jian-Wei

    2010-05-28

    Entanglement is often verified by a violation of an inequality like a Bell inequality or an entanglement witness. Considerable effort has been devoted to the optimization of such inequalities in order to obtain a high violation. We demonstrate theoretically and experimentally that such an optimization does not necessarily lead to a better entanglement test, if the statistical error is taken into account. Theoretically, we show for different error models that reducing the violation of an inequality can improve the significance. Experimentally, we observe this phenomenon in a four-photon experiment, testing the Mermin and Ardehali inequality for different levels of noise. Furthermore, we provide a way to develop entanglement tests with high statistical significance.

  6. Hologram interferometry in automotive component vibration testing

    NASA Astrophysics Data System (ADS)

    Brown, Gordon M.; Forbes, Jamie W.; Marchi, Mitchell M.; Wales, Raymond R.

    1993-02-01

    An ever increasing variety of automotive component vibration testing is being pursued at Ford Motor Company, U.S.A. The driving force for use of hologram interferometry in these tests is the continuing need to design component structures to meet more stringent functional performance criteria. Parameters such as noise and vibration, sound quality, and reliability must be optimized for the lightest weight component possible. Continually increasing customer expectations and regulatory pressures on fuel economy and safety mandate that vehicles be built from highly optimized components. This paper includes applications of holographic interferometry for powertrain support structure tuning, body panel noise reduction, wiper system noise and vibration path analysis, and other vehicle component studies.

  7. A Randomized Rounding Approach for Optimization of Test Sheet Composing and Exposure Rate Control in Computer-Assisted Testing

    ERIC Educational Resources Information Center

    Wang, Chu-Fu; Lin, Chih-Lung; Deng, Jien-Han

    2012-01-01

    Testing is an important stage of teaching as it can assist teachers in auditing students' learning results. A good test is able to accurately reflect the capability of a learner. Nowadays, Computer-Assisted Testing (CAT) is greatly improving traditional testing, since computers can automatically and quickly compose a proper test sheet to meet user…

  8. Thermal interface material characterization for cryogenic electronic packaging solutions

    NASA Astrophysics Data System (ADS)

    Dillon, A.; McCusker, K.; Van Dyke, J.; Isler, B.; Christiansen, M.

    2017-12-01

    As applications of superconducting logic technologies continue to grow, the need for efficient and reliable cryogenic packaging becomes crucial to development and testing. A trade study of materials was done to develop a practical understanding of the properties of interface materials around 4 K. While literature exists for varying interface tests, discrepancies are found in the reported performance of different materials and in the ranges of applied force in which they are optimal. In considering applications extending from top cooling a silicon chip to clamping a heat sink, a range of forces from approximately 44 N to approximately 445 N was chosen for testing different interface materials. For each range of forces a single material was identified to optimize the thermal conductance of the joint. Of the tested interfaces, indium foil clamped at approximately 445 N showed the highest thermal conductance. Results are presented from these characterizations and useful methodologies for efficient testing are defined.

  9. Development and flight testing of UV optimized Photon Counting CCDs

    NASA Astrophysics Data System (ADS)

    Hamden, Erika T.

    2018-06-01

    I will discuss the latest results from the Hamden UV/Vis Detector Lab and our ongoing work using a UV optimized EMCCD in flight. Our lab is currently testing the efficiency and performance of delta-doped, anti-reflection coated EMCCDs in collaboration with JPL. The lab has been set up to test quantum efficiency, dark current, clock-induced charge, and read noise. I will describe improvements to our circuit boards for lower noise, updates from a new, more flexible NUVU controller, and the integration of an EMCCD in the FIREBall-2 UV spectrograph. I will also briefly describe plans to conduct radiation testing on delta-doped EMCCDs (in both warm, unbiased and cold, biased configurations) this summer, and longer-term plans for testing newer photon counting CCDs as the HUVD Lab moves to the University of Arizona in the fall of 2018.

  10. Genetic particle swarm parallel algorithm analysis of optimization arrangement on mistuned blades

    NASA Astrophysics Data System (ADS)

    Zhao, Tianyu; Yuan, Huiqun; Yang, Wenjun; Sun, Huagang

    2017-12-01

    This article introduces a method of mistuned parameter identification consisting of static frequency testing of blades, dichotomy, and finite element analysis. A lumped parameter model of an engine bladed-disc system is then set up. A blade arrangement optimization method, the genetic particle swarm optimization algorithm, is presented. It combines a discrete particle swarm optimization with a genetic algorithm, providing both local and global search ability. A CUDA-based co-evolution particle swarm optimization, using a graphics processing unit, is presented and its performance is analysed. The results show that the optimized arrangement can reduce the amplitude and localization of the forced vibration response of a bladed-disc system, while optimization based on the CUDA framework improves the computing speed. This method could provide support for engineering applications in terms of effectiveness and efficiency.
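
    The hybridization idea, a particle swarm augmented with a genetic mutation operator, can be sketched on a continuous toy objective. This is not the authors' discrete blade-arrangement algorithm or its CUDA implementation; it is a minimal illustration of the GA-PSO combination on a sphere function:

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):
    return float(np.sum(x ** 2))

# Plain PSO with a GA-style mutation step (the hybridization idea)
n, dim, iters = 20, 2, 150
pos = rng.uniform(-5, 5, (n, dim))
vel = np.zeros((n, dim))
pbest = pos.copy()
pbest_f = np.array([sphere(p) for p in pos])
gbest = pbest[np.argmin(pbest_f)].copy()

for _ in range(iters):
    r1, r2 = rng.random((n, dim)), rng.random((n, dim))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    mutate = rng.random((n, dim)) < 0.05          # genetic mutation operator
    pos[mutate] += rng.normal(0, 0.5, np.count_nonzero(mutate))
    f = np.array([sphere(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)].copy()

print(round(sphere(gbest), 4))  # best fitness, close to the optimum at 0
```

    The mutation step injects the global diversity of a genetic algorithm, while the velocity update retains PSO's fast local convergence; the blade-arrangement version operates on permutations rather than real vectors.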

  11. Optimal Testing Environment. Research Brief

    ERIC Educational Resources Information Center

    Walker, Karen

    2010-01-01

    Even though it often feels like standardized testing is a relatively recent phenomenon, it has been around at least since the 1800s, when in China, those who wanted a government job were required to take a test on their expertise in Confucian philosophy and poetry. During the Industrial Revolution, standardized tests were a quick way to test large…

  12. Optical performance of random anti-reflection structured surfaces (rARSS) on spherical lenses

    NASA Astrophysics Data System (ADS)

    Taylor, Courtney D.

    Random anti-reflection structured surfaces (rARSS) have been reported to improve the transmittance of optical-grade fused silica planar substrates to values greater than 99%. These textures are fabricated directly on the substrates using reactive-ion/inductively-coupled plasma etching (RIE/ICP) techniques, and often result in transmitted spectra with no measurable interference effects (fringes) over a wide range of wavelengths. The RIE/ICP processes used to etch the rARSS are anisotropic and thus well suited for planar components. The improvement in spectral transmission has been found to be independent of optical incidence angle for values from 0° to ±30°. Qualifying and quantifying the rARSS performance on curved substrates, such as convex lenses, is required to optimize the fabrication of the desired AR effect on optical-power elements. In this work, rARSS was fabricated on fused silica plano-convex (PCX) and plano-concave (PCV) lenses using a planar-substrate optimized RIE process to maximize optical transmission in the range from 500 to 1100 nm. An additional set of lenses was etched in a non-optimized ICP process to provide further comparisons. Results are presented from optical transmission and beam propagation tests (optimized lenses only) of rARSS lenses for both TE and TM incident polarizations at a wavelength of 633 nm and over a 70° full field of view in both singlet and doublet configurations. These results suggest optimization of the fabrication process is not required, mainly due to the wide angle-of-incidence AR tolerance of the rARSS lenses. Non-optimized-recipe lenses showed low transmission enhancement, confirming the need to optimize etch recipes prior to process transfer to PCX/PCV lenses. Beam propagation tests indicated no major beam degradation through the optimized lens elements. 
Scanning electron microscopy (SEM) images confirmed different structure between optimized and non-optimized samples. SEM images also indicated isotropically-oriented surface structures on both types of lenses.

  13. Experimental Investigation of the Application of Microramp Flow Control to an Oblique Shock Interaction

    NASA Technical Reports Server (NTRS)

    Hirt, Stefanie M.; Anderson, Bernhard H.

    2009-01-01

    The effectiveness of microramp flow control devices in controlling an oblique shock interaction was tested in the 15- by 15-Centimeter Supersonic Wind Tunnel at NASA Glenn Research Center. Fifteen microramp geometries were tested varying the height, chord length, and spacing between ramps. Measurements of the boundary layer properties downstream of the shock reflection were analyzed using design of experiments methods. Results from main effects, D-optimal, full factorial, and central composite designs were compared. The designs provided consistent results for a single variable optimization.

  14. Modal phase measuring deflectometry

    DOE PAGES

    Huang, Lei; Xue, Junpeng; Gao, Bo; ...

    2016-10-14

    In this work, a model-based method is applied to phase measuring deflectometry, named modal phase measuring deflectometry. The height and slopes of the surface under test are represented by mathematical models and updated by optimizing the model coefficients to minimize the discrepancy between the reprojection in ray tracing and the actual measurement. The pose of the screen relative to the camera is pre-calibrated and further optimized together with the shape coefficients of the surface under test. Simulations and experiments are conducted to demonstrate the feasibility of the proposed approach.

  15. Design and Manufacturing of Composite Tower Structure for Wind Turbine Equipment

    NASA Astrophysics Data System (ADS)

    Park, Hyunbum

    2018-02-01

    This study proposes a composite tower design process for large wind turbine equipment. Structural design of the tower and analysis using the finite element method were performed, followed by prototype manufacturing and testing. The material used is a glass fiber and epoxy resin composite, with sand used in the middle section. An optimized structural design and analysis were performed, with weight reduction and structural safety as the design objectives. Finally, the tower structure was confirmed by structural testing.

  16. Quality assurance for high dose rate brachytherapy treatment planning optimization: using a simple optimization to verify a complex optimization

    NASA Astrophysics Data System (ADS)

    Deufel, Christopher L.; Furutani, Keith M.

    2014-02-01

    As dose optimization for high dose rate brachytherapy becomes more complex, it becomes increasingly important to have a means of verifying that optimization results are reasonable. A method is presented for using a simple optimization as quality assurance for the more complex optimization algorithms typically found in commercial brachytherapy treatment planning systems. Quality assurance tests may be performed during commissioning, at regular intervals, and/or on a patient specific basis. A simple optimization method is provided that optimizes conformal target coverage using an exact, variance-based, algebraic approach. Metrics such as dose volume histogram, conformality index, and total reference air kerma agree closely between simple and complex optimizations for breast, cervix, prostate, and planar applicators. The simple optimization is shown to be a sensitive measure for identifying failures in a commercial treatment planning system that are possibly due to operator error or weaknesses in planning system optimization algorithms. Results from the simple optimization are surprisingly similar to the results from a more complex, commercial optimization for several clinical applications. This suggests that there are only modest gains to be made from making brachytherapy optimization more complex. The improvements expected from sophisticated linear optimizations, such as PARETO methods, will largely be in making systems more user friendly and efficient, rather than in finding dramatically better source strength distributions.
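The QA principle above, checking a complex optimizer against a simple exact solve, can be illustrated with a toy dwell-time problem. This is not the authors' variance-based algorithm: it is a hypothetical two-source, two-target example with an inverse-square dose kernel, where the dwell times delivering the prescription are found by an exact 2x2 solve and could then be compared against a commercial planning system's output.

```python
# Toy reference plan: dose_j = sum_i t_i * k[i][j], with a hypothetical
# 1/r^2 kernel. Geometry and prescription values are invented.
src = [0.0, 1.0]    # dwell positions (cm)
tgt = [0.25, 0.75]  # target dose points (cm)
rx = 100.0          # prescribed dose at each target (arbitrary units)

k = [[1.0 / (s - p) ** 2 for p in tgt] for s in src]  # kernel matrix

# Solve the 2x2 linear system K^T t = [rx, rx] by Cramer's rule.
det = k[0][0] * k[1][1] - k[1][0] * k[0][1]
t0 = rx * (k[1][1] - k[1][0]) / det
t1 = rx * (k[0][0] - k[0][1]) / det

dose = [t0 * k[0][j] + t1 * k[1][j] for j in range(2)]
print([round(d, 6) for d in dose])  # both targets receive exactly rx
```

A real QA comparison would then score the two plans with shared metrics such as dose volume histogram, conformality index, and total reference air kerma, as the abstract describes.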

  17. 40 CFR 89.6 - Reference materials.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... November 89: Recommended Practice for Engine Testing with Low Temperature Charge Air Cooler Systems in a Dynamometer Test Cell 89.327-96 SAE Paper 770141: Optimization of a Flame Ionization Detector for...

  18. Deformation effect simulation and optimization for double front axle steering mechanism

    NASA Astrophysics Data System (ADS)

    Wu, Jungang; Zhang, Siqin; Yang, Qinglong

    2013-03-01

    This paper investigates the tire wear problem of heavy vehicles with a double front axle steering mechanism, focusing on the flexibility effects of the steering mechanism, and proposes a structural optimization method that combines traditional static structural theory with a dynamic structural approach, the Equivalent Static Load (ESL) method, to optimize key parts. Simulation and test results show that the method has practical engineering value and serves as a useful reference for addressing tire wear in double front axle steering mechanism design.
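The core of the ESL method is converting a dynamic response history into a set of static load cases that a conventional static optimizer can handle: at each sampled time t_k, the equivalent static load is f_eq(t_k) = K·d(t_k), the load that would statically reproduce the dynamic displacement. The single-degree-of-freedom sketch below uses invented stiffness, mass, and forcing values purely to show the conversion step.

```python
import math

# Hypothetical undamped SDOF system under harmonic forcing f0*sin(w*t).
K = 2.0e4            # stiffness (N/m), assumed
M = 10.0             # mass (kg), assumed
f0, w = 500.0, 30.0  # load amplitude (N) and frequency (rad/s), assumed

# Steady-state displacement amplitude of the undamped SDOF system.
amp = f0 / (K - M * w * w)
times = [i * 0.01 for i in range(10)]
d = [amp * math.sin(w * t) for t in times]  # sampled response history

# One equivalent static load case per sampled time step: f_eq = K * d.
esl = [K * di for di in d]
worst = max(abs(f) for f in esl)
print(round(worst, 1))  # the governing static load case (N)
```

In a multi-DOF finite element setting K is the stiffness matrix and d(t_k) comes from a transient analysis, but the principle is the same: the static optimizer sees only the set of equivalent load cases, with the worst case typically governing the design.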

  19. Acceleration techniques in the univariate Lipschitz global optimization

    NASA Astrophysics Data System (ADS)

    Sergeyev, Yaroslav D.; Kvasov, Dmitri E.; Mukhametzhanov, Marat S.; De Franco, Angela

    2016-10-01

    Univariate box-constrained Lipschitz global optimization problems are considered in this contribution. Geometric and information-statistical approaches are presented. Novel local tuning and local improvement techniques are described, along with traditional ways to estimate the Lipschitz constant. The advantages of the presented local tuning and local improvement techniques are demonstrated using the operational characteristics approach for comparing deterministic global optimization algorithms on a class of 100 widely used test functions.

  20. Numerical Optimization Using Computer Experiments

    NASA Technical Reports Server (NTRS)

    Trosset, Michael W.; Torczon, Virginia

    1997-01-01

    Engineering design optimization often gives rise to problems in which expensive objective functions are minimized by derivative-free methods. We propose a method for solving such problems that synthesizes ideas from the numerical optimization and computer experiment literatures. Our approach relies on kriging known function values to construct a sequence of surrogate models of the objective function that are used to guide a grid search for a minimizer. Results from numerical experiments on a standard test problem are presented.
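The surrogate-guided loop described above can be sketched with a simplified stand-in for kriging: the code below interpolates known samples of an "expensive" objective with a Gaussian radial basis function model, minimizes the cheap surrogate over a grid, evaluates the true function at the surrogate's minimizer, and refits. The objective, kernel width, and sampling are all invented for illustration.

```python
import math

def expensive(x):  # stand-in for a costly simulation output
    return (x - 0.3) ** 2

def rbf_fit(xs, ys, eps=3.0):
    """Fit a Gaussian RBF interpolant; solve the system by elimination."""
    pts = list(xs)
    n = len(pts)
    A = [[math.exp(-(eps * (pts[i] - pts[j])) ** 2) for j in range(n)]
         for i in range(n)]
    b = list(ys)
    for i in range(n):                       # forward elimination
        for j in range(i + 1, n):
            m = A[j][i] / A[i][i]
            for c in range(i, n):
                A[j][c] -= m * A[i][c]
            b[j] -= m * b[i]
    w = [0.0] * n
    for i in reversed(range(n)):             # back substitution
        w[i] = (b[i] - sum(A[i][c] * w[c] for c in range(i + 1, n))) / A[i][i]
    return lambda x: sum(wi * math.exp(-(eps * (x - xi)) ** 2)
                         for wi, xi in zip(w, pts))

xs, ys = [0.0, 0.5, 1.0], [expensive(x) for x in [0.0, 0.5, 1.0]]
for _ in range(5):                           # surrogate-guided refinement
    s = rbf_fit(xs, ys)
    grid = [i / 100 for i in range(101)]
    xnew = min(grid, key=s)                  # grid search on the surrogate
    if xnew in xs:
        break
    xs.append(xnew)
    ys.append(expensive(xnew))

best_y, best_x = min(zip(ys, xs))
print(best_x, best_y)
```

Kriging adds a statistical model (and an uncertainty estimate) on top of this interpolation idea, and the paper's grid search carries convergence guarantees from the pattern-search literature; this sketch only shows the evaluate-fit-search-refine skeleton.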
