Creating Empowering Educational Experiences through a Feminist Approach
ERIC Educational Resources Information Center
Chuang, Pei-Fen
2012-01-01
This article presents an innovative approach to preparing future teachers at a teachers college in Taiwan. The approach outlined here is designed to emphasize the beneficial effects of a feminist leader in providing an empowering educational experience to preservice teachers. The traditional university classroom approach often does not promote…
Resolution of an Orbital Issue: A Designed Experiment
NASA Technical Reports Server (NTRS)
Huddleston, Lisa L.
2011-01-01
Design of Experiments (DOE) is a systematic approach to the investigation of a system or process. A series of structured tests is designed in which planned changes are made to the input variables of a process or system, and the effects of these changes on a pre-defined output are then assessed. DOE is a formal method of maximizing the information gained while minimizing the resources required.
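The simplest structured test plan of the kind described above is a full factorial, which enumerates every combination of factor levels. A minimal sketch; the factor names and levels below are hypothetical, not taken from any of the cited studies:

```python
from itertools import product

def full_factorial(levels_per_factor):
    """Enumerate every combination of factor levels (a full factorial design)."""
    names = list(levels_per_factor)
    return [dict(zip(names, combo))
            for combo in product(*(levels_per_factor[n] for n in names))]

# Hypothetical two-level screening design for three process inputs.
design = full_factorial({
    "temperature_C": (20, 40),
    "pressure_bar": (1.0, 2.0),
    "catalyst": ("A", "B"),
})
print(len(design))  # 2 levels ^ 3 factors = 8 planned runs
```

Fractional and optimal designs trade away some of these runs for efficiency, which is where the resource savings claimed above come from.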
Case Studies for the Statistical Design of Experiments Applied to Powered Rotor Wind Tunnel Tests
NASA Technical Reports Server (NTRS)
Overmeyer, Austin D.; Tanner, Philip E.; Martin, Preston B.; Commo, Sean A.
2015-01-01
The application of statistical Design of Experiments (DOE) to helicopter wind tunnel testing was explored during two powered rotor wind tunnel entries during the summers of 2012 and 2013. These tests were performed jointly by the U.S. Army Aviation Development Directorate Joint Research Program Office and NASA Rotary Wing Project Office, currently the Revolutionary Vertical Lift Project, at NASA Langley Research Center located in Hampton, Virginia. Both entries were conducted in the 14- by 22-Foot Subsonic Tunnel with a small portion of the overall tests devoted to developing case studies of the DOE approach as it applies to powered rotor testing. A 16-47 times reduction in the number of data points required was estimated by comparing the DOE approach to conventional testing methods. The average error for the DOE surface response model for the OH-58F test was 0.95 percent and 4.06 percent for drag and download, respectively. The DOE surface response model of the Active Flow Control test captured the drag within 4.1 percent of measured data. The operational differences between the two testing approaches are identified, but did not prevent the safe operation of the powered rotor model throughout the DOE test matrices.
Design of experiments applications in bioprocessing: concepts and approach.
Kumar, Vijesh; Bhalla, Akriti; Rathore, Anurag S
2014-01-01
Most biotechnology unit operations are complex in nature, with numerous process variables, feed material attributes, and raw material attributes that can have a significant impact on the performance of the process. A design of experiments (DOE)-based approach offers a solution to this conundrum and allows for efficient estimation of the main effects and the interactions with a minimal number of experiments. Numerous publications illustrate the application of DOE towards development of different bioprocessing unit operations. However, a systematic approach for evaluation of the different DOE designs and for choosing the optimal design for a given application has not yet been published. Through this work we have compared the I-optimal and D-optimal designs to the commonly used central composite and Box-Behnken designs for bioprocess applications. A systematic methodology is proposed for construction of the model and for precise prediction of the responses for three case studies involving some of the commonly used unit operations in downstream processing. Use of the Akaike information criterion for model selection has been examined and found to be suitable for the applications under consideration. © 2013 American Institute of Chemical Engineers.
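The Akaike information criterion mentioned above has a simple closed form for least-squares fits. A sketch of how it ranks two candidate response models; the RSS values and parameter counts are invented, not from the paper's case studies:

```python
import math

def aic_least_squares(rss, n_obs, n_params):
    """AIC for a least-squares model: n*ln(RSS/n) + 2k, constant terms dropped."""
    return n_obs * math.log(rss / n_obs) + 2 * n_params

# Hypothetical comparison: a 6-term quadratic vs. a 10-term cubic fit
# to the same 20 response measurements.
aic_quadratic = aic_least_squares(rss=4.0, n_obs=20, n_params=6)
aic_cubic = aic_least_squares(rss=3.5, n_obs=20, n_params=10)
best = "quadratic" if aic_quadratic < aic_cubic else "cubic"
print(best)  # quadratic: the extra cubic terms don't pay for the small RSS drop
```

Lower AIC wins, so extra model terms must buy a large enough drop in residual error to offset the 2k penalty.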
Accelerating Vaccine Formulation Development Using Design of Experiment Stability Studies.
Ahl, Patrick L; Mensch, Christopher; Hu, Binghua; Pixley, Heidi; Zhang, Lan; Dieter, Lance; Russell, Ryann; Smith, William J; Przysiecki, Craig; Kosinski, Mike; Blue, Jeffrey T
2016-10-01
Vaccine drug product thermal stability often depends on formulation input factors and how they interact. Scientific understanding and professional experience typically allow vaccine formulators to accurately predict the thermal stability output based on formulation input factors such as pH, ionic strength, and excipients. Thermal stability predictions, however, are not enough for regulators. Stability claims must be supported by experimental data. The Quality by Design approach of Design of Experiment (DoE) is well suited to describe formulation outputs such as thermal stability in terms of formulation input factors. A DoE approach, particularly at elevated temperatures that induce accelerated degradation, can provide empirical understanding of how vaccine formulation input factors and their interactions affect vaccine stability output performance. This is possible even when clear scientific understanding of the particular formulation stability mechanisms is lacking. A DoE approach was used in an accelerated 37°C stability study of an aluminum-adjuvanted Neisseria meningitidis serogroup B vaccine. Formulation stability differences were identified only 15 days into the study. We believe this study demonstrates the power of combining DoE methodology with accelerated stress stability studies to accelerate and improve vaccine formulation development programs, particularly during the preformulation stage. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Unal, Zafer; Unal, Aslihan
2012-01-01
This study provided a basis for answering the following essential question: Does the years of experience affect teachers' classroom management approaches? Data were collected from 268 primary school teachers. The findings of this study demonstrated that experienced teachers are more likely to prefer to be in control in their classrooms than…
A Parametric Geometry Computational Fluid Dynamics (CFD) Study Utilizing Design of Experiments (DOE)
NASA Technical Reports Server (NTRS)
Rhew, Ray D.; Parker, Peter A.
2007-01-01
Design of Experiments (DOE) was applied to the LAS geometric parameter study to efficiently identify and rank the primary contributors to integrated drag over the vehicle's ascent trajectory in an order of magnitude fewer CFD configurations, thereby reducing computational resources and solution time. SMEs were able to gain a better understanding of the underlying flow physics of different geometric parameter configurations through the identification of interaction effects. An interaction effect, which describes how the effect of one factor changes with respect to the levels of other factors, is often the key to product optimization. A DOE approach emphasizes sequential learning through successive experimentation to continuously build on previous knowledge. These studies represent a starting point for expanded experimental activities that will eventually cover the entire design space of the vehicle and flight trajectory.
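The interaction effect defined above can be estimated directly from a 2x2 factorial by contrasting diagonal corners of the design. A generic sketch in coded units; the response values come from an invented model, not from the LAS study:

```python
def effects_2x2(y):
    """Main and interaction effects from a 2x2 factorial.
    y maps coded levels (a, b) in {-1, +1} to the measured response."""
    a  = (y[(1, -1)] + y[(1, 1)] - y[(-1, -1)] - y[(-1, 1)]) / 2
    b  = (y[(-1, 1)] + y[(1, 1)] - y[(-1, -1)] - y[(1, -1)]) / 2
    ab = (y[(-1, -1)] + y[(1, 1)] - y[(1, -1)] - y[(-1, 1)]) / 2
    return a, b, ab

# Hypothetical drag responses where two geometry factors interact:
# generated from the true model y = 10 + 2A + 3B + 1.5AB.
responses = {(-1, -1): 6.5, (1, -1): 7.5, (-1, 1): 9.5, (1, 1): 16.5}
print(effects_2x2(responses))  # (4.0, 6.0, 3.0): each effect = 2 x coefficient
```

A nonzero AB effect is exactly the situation where one-factor-at-a-time testing misleads, since the effect of A depends on where B sits.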
Yoshino, Hiroyuki; Hara, Yuko; Dohi, Masafumi; Yamashita, Kazunari; Hakomori, Tadashi; Kimura, Shin-Ichiro; Iwao, Yasunori; Itai, Shigeru
2018-04-01
Scale-up approaches for the film coating process have been established for each type of film coating equipment from thermodynamic and mechanical analyses over several decades. The objective of the present study was to establish a versatile scale-up approach for the film coating process, applicable to commercial production, that is based on a critical quality attribute (CQA) using the Quality by Design (QbD) approach and is independent of the equipment used. Experiments on a pilot scale using the Design of Experiment (DoE) approach were performed to find a suitable CQA from among surface roughness, contact angle, color difference, and coating film properties by terahertz spectroscopy. Surface roughness was determined to be a suitable CQA from a quantitative appearance evaluation. When surface roughness was fixed as the CQA, the water content of the film-coated tablets was determined to be the critical material attribute (CMA), a parameter that does not depend on scale or equipment. Finally, to verify the scale-up approach determined from the pilot scale, experiments on a commercial scale were performed. The good correlation between the surface roughness (CQA) and the water content (CMA) identified at the pilot scale was also retained at the commercial scale, indicating that our proposed method should be useful as a scale-up approach for the film coating process.
Song, Yong-Hong; Sun, Xue-Wen; Jiang, Bo; Liu, Ji-En; Su, Xian-Hui
2015-12-01
Design of experiment (DoE) is a statistics-based technique for experimental design that can overcome the shortcomings of the traditional one-factor-at-a-time (OFAT) approach to protein purification optimization. In this study, a DoE approach was applied to optimizing the purification of a recombinant single-chain variable fragment (scFv) against type 1 insulin-like growth factor receptor (IGF-1R) expressed in Escherichia coli. In the first capture step using Capto L, a 2-level fractional factorial analysis and subsequently a central composite circumscribed (CCC) design were used to identify the optimal elution conditions. Two main effects, pH and trehalose, were identified, and high recovery (above 95%) and a low aggregate ratio (below 10%) were achieved in the pH range from 2.9 to 3.0 with 32-35% (w/v) trehalose added. In the second step using cation exchange chromatography, an initial screening of media and elution pH and a following CCC design were performed, whereby the optimal selectivity of the scFv was obtained on Capto S at pH near 6.0, and the optimal conditions for fulfilling high dynamic binding capacity (DBC) and purity were identified as a pH range of 5.9-6.1 and a loading conductivity range of 5-12.5 mS/cm. Upon a further gel filtration, the final purified scFv with a purity of 98% was obtained. Finally, the optimized conditions were verified by a 20-fold scale-up experiment. The purities and yields of the intermediate and final products all fell within the regions predicted by the DoE approach, suggesting the robustness of the optimized conditions. We propose that the DoE approach described here is also applicable to the production of other recombinant antibody constructs. Copyright © 2015 Elsevier Inc. All rights reserved.
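A central composite circumscribed design of the kind used in the capture step above combines a two-level cube, axial "star" points beyond the cube, and replicated center points. A generic coded-units sketch, not the authors' actual run list:

```python
from itertools import product

def ccc_design(k, center_runs=4):
    """Central composite circumscribed design in coded units for k factors."""
    alpha = (2 ** k) ** 0.25  # rotatable axial distance, outside the +/-1 cube
    cube = [list(p) for p in product((-1.0, 1.0), repeat=k)]
    axial = []
    for i in range(k):
        for sign in (-alpha, alpha):
            point = [0.0] * k
            point[i] = sign
            axial.append(point)
    centers = [[0.0] * k for _ in range(center_runs)]
    return cube + axial + centers

runs = ccc_design(2)  # e.g. pH and trehalose as the two coded factors
print(len(runs))  # 4 cube + 4 axial + 4 center = 12 runs
```

The axial points at distance alpha are what let a CCC design estimate the pure quadratic curvature needed for the response surface.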
Ray, Chad A; Patel, Vimal; Shih, Judy; Macaraeg, Chris; Wu, Yuling; Thway, Theingi; Ma, Mark; Lee, Jean W; Desilva, Binodh
2009-02-20
Developing a process that generates robust immunoassays that can be used to support studies with tight timelines is a common challenge for bioanalytical laboratories. Design of experiments (DOE) is a tool that has been used by many industries for the purpose of optimizing processes. The approach is capable of identifying critical factors and their interactions with a minimal number of experiments. The challenge in implementing this tool in the bioanalytical laboratory is to develop a user-friendly approach that scientists can understand and apply. We have successfully addressed these challenges by eliminating the screening design, introducing automation, and applying a simple mathematical approach for the output parameter. A modified central composite design (CCD) was applied to three ligand binding assays. The intra-plate factors selected were the coating, detection antibody, and streptavidin-HRP concentrations. The inter-plate factors included the incubation times for each step. The objective was to maximize the log of the signal-to-blank ratio (logS/B) of the low standard to the blank. The maximum desirable conditions were determined using JMP 7.0. To verify the validity of the predictions, the logS/B prediction was compared against the observed logS/B during pre-study validation experiments. The three assays were optimized using the multi-factorial DOE. The total error for all three methods was less than 20%, which indicated method robustness. DOE identified interactions in one of the methods. The model predictions for logS/B were within 25% of the observed pre-study validation values for all methods tested. The comparison between the CCD and hybrid screening design yielded comparable parameter estimates. The user-friendly design enables effective application of multi-factorial DOE to optimize ligand binding assays for therapeutic proteins.
The approach allows for identification of interactions between factors, consistency in optimal parameter determination, and reduced method development time.
Patel, Ashaben; Erb, Steven M; Strange, Linda; Shukla, Ravi S; Kumru, Ozan S; Smith, Lee; Nelson, Paul; Joshi, Sangeeta B; Livengood, Jill A; Volkin, David B
2018-05-24
A combination experimental approach, utilizing semi-empirical excipient screening followed by statistical modeling using design of experiments (DOE), was undertaken to identify stabilizing candidate formulations for a lyophilized live attenuated Flavivirus vaccine candidate. Various potential pharmaceutical compounds used in either marketed or investigative live attenuated viral vaccine formulations were first identified. The ability of additives from different categories of excipients, either alone or in combination, were then evaluated for their ability to stabilize virus against freeze-thaw, freeze-drying, and accelerated storage (25°C) stresses by measuring infectious virus titer. An exploratory data analysis and predictive DOE modeling approach was subsequently undertaken to gain a better understanding of the interplay between the key excipients and stability of virus as well as to determine which combinations were interacting to improve virus stability. The lead excipient combinations were identified and tested for stabilizing effects using a tetravalent mixture of viruses in accelerated and real time (2-8°C) stability studies. This work demonstrates the utility of combining semi-empirical excipient screening and DOE experimental design strategies in the formulation development of lyophilized live attenuated viral vaccine candidates. Copyright © 2017 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Rice, Amber H.; Kitchel, Tracy
2016-01-01
This study explored experiences of beginning agriculture teachers' approaches to teaching content. The research question guiding the study was: how does agriculture teachers' knowledge of content and students influence their process of breaking down content knowledge for teaching? The researchers employed a grounded theory approach in which five…
NASA Astrophysics Data System (ADS)
Cho, G. S.
2017-09-01
For performance optimization of Refrigerated Warehouses, design parameters are selected based on physical parameters, such as the number of equipment units and aisles and forklift speeds, for ease of modification. This paper provides a comprehensive framework for the system design of Refrigerated Warehouses. We propose a modeling approach aimed at simulation optimization so as to meet the required design specifications using Design of Experiment (DOE), and analyze a simulation model using an integrated aspect-oriented modeling approach (i-AOMA). As a result, the suggested method can evaluate the performance of a variety of Refrigerated Warehouse operations.
Does Class Matter? Mentoring Small Businesses' Owner-Managers
ERIC Educational Resources Information Center
Greenbank, Paul
2006-01-01
Purpose: This paper examines the way social class influences the relationship between business mentors and small business owner-managers. Design/methodology/approach: The paper is based on the author's experience of mentoring businesses with The Prince's Trust. Three businesses were selected as cases. The methodological approach involved…
Combining Education and Work: Experiences in Asia and Oceania: Australia.
ERIC Educational Resources Information Center
United Nations Educational, Scientific, and Cultural Organization, Bangkok (Thailand). Regional Office for Education in Asia and Oceania.
Although there is currently no national approach to career education in Australia, recent economic and labor trends have prompted the reconsideration of work experience and career education programs. Career education does not exist below secondary levels and prevocational training serves only as an introduction to extensive postsecondary technical…
Multiple Perspective: When Child Development Professionals Raise Twins
ERIC Educational Resources Information Center
Stark, Deborah Roderick; Harden, Brenda Jones; Chazan-Cohen, Rachel; Cohen, Daniel J.; Rice, Kathleen Fitzgerald
2006-01-01
Do child development professionals have expectations about what it will be like to parent twins based on their professional experiences? Does their professional knowledge influence their approach to caregiving? And do their personal experiences as parents of twins change their research interests or how they work with children and families? To…
NASA Technical Reports Server (NTRS)
Miller, Adam M.; Edeen, Marybeth; Sirko, Robert J.
1992-01-01
This paper describes the approach and results of an effort to characterize plant growth under various environmental conditions at the Johnson Space Center variable pressure growth chamber. Using a field of applied mathematics and statistics known as design of experiments (DOE), we developed a test plan for varying environmental parameters during a lettuce growth experiment. The test plan was developed using a Box-Behnken approach to DOE. As a result of the experimental runs, we have developed empirical models of both the transpiration process and carbon dioxide assimilation for Waldman's Green lettuce over specified ranges of environmental parameters including carbon dioxide concentration, light intensity, dew-point temperature, and air velocity. This model also predicts transpiration and carbon dioxide assimilation for different ages of the plant canopy.
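The Box-Behnken plan used for the four environmental parameters above can be enumerated as the edge midpoints of the factor cube plus center points. A generic sketch in coded units, not the chamber study's actual run schedule:

```python
from itertools import combinations, product

def box_behnken(k, center_runs=3):
    """Box-Behnken design in coded units: +/-1 on each factor pair, others at 0."""
    runs = []
    for i, j in combinations(range(k), 2):
        for a, b in product((-1, 1), repeat=2):
            point = [0] * k
            point[i], point[j] = a, b
            runs.append(point)
    runs.extend([0] * k for _ in range(center_runs))
    return runs

# Four factors, e.g. CO2 concentration, light, dew point, and air velocity.
design = box_behnken(4)
print(len(design))  # 6 pairs x 4 level combos + 3 center points = 27 runs
```

Because no run sits at a cube corner, Box-Behnken designs avoid simultaneous extremes of all factors, which is often desirable when extreme combinations would stress a living test subject such as a plant canopy.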
WHC significant lessons learned 1993--1995
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bickford, J.C.
1997-12-12
A lesson learned, as defined in DOE-STD-7501-95, Development of DOE Lessons Learned Programs, is: a "good work practice" or innovative approach that is captured and shared to promote repeat applications, or an adverse work practice or experience that is captured and shared to avoid a recurrence. The key word in both parts of this definition is "shared". This document was published to share a wide variety of recent Hanford experiences with other DOE sites. It also provides a valuable tool to be used in new employee and continuing training programs at Hanford facilities and at other DOE locations. This manual is divided into sections to facilitate extracting appropriate subject material when developing training modules. Many of the bulletins could be categorized into more than one section, however, so examination of other related sections is encouraged.
A practical approach to automate randomized design of experiments for ligand-binding assays.
Tsoi, Jennifer; Patel, Vimal; Shih, Judy
2014-03-01
Design of experiments (DOE) is utilized in optimizing ligand-binding assays by modeling factor effects. To reduce the analyst's workload and the errors inherent in DOE, we propose the integration of automated liquid handlers to perform the randomized designs. A randomized design created in statistical software was imported into a custom macro that converts the design into a liquid-handler worklist to automate reagent delivery. An optimized assay was transferred to a contract research organization, resulting in a successful validation. We developed a practical solution for assay optimization by integrating DOE and automation to increase assay robustness and enable successful method transfer. The flexibility of this process allows it to be applied to a variety of assay designs.
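The design-to-worklist conversion described above can be sketched in a few lines. The CSV column layout, factor names, and seeded shuffle below are all hypothetical, since the paper does not publish its macro format:

```python
import csv
import io
import random

def design_to_worklist(design, seed=7):
    """Randomize run order and emit a CSV worklist (hypothetical column layout)."""
    rng = random.Random(seed)  # fixed seed makes the randomization reproducible
    order = list(range(len(design)))
    rng.shuffle(order)
    factor_names = list(design[0])
    buffer = io.StringIO()
    writer = csv.writer(buffer)
    writer.writerow(["run", "design_point"] + factor_names)
    for run_number, index in enumerate(order, start=1):
        writer.writerow([run_number, index] + [design[index][f] for f in factor_names])
    return buffer.getvalue()

# Hypothetical 3x2 design over coating and detection reagent concentrations.
design = [{"coat_ug_ml": c, "detect_ng_ml": d}
          for c in (0.5, 1.0, 2.0) for d in (50, 100)]
worklist = design_to_worklist(design)
print(worklist.splitlines()[0])  # run,design_point,coat_ug_ml,detect_ng_ml
```

Keeping the randomization in code rather than done by hand is the point: the executed run order is exactly the order the statistics software will analyze, with no transcription step in between.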
How Sensory Experiences of Children With and Without Autism Affect Family Occupations
Bagby, Molly Shields; Dickie, Virginia A.; Baranek, Grace T.
2012-01-01
We used a grounded theory approach to data analysis to discover what effect, if any, children's sensory experiences have on family occupations. We chose this approach because the existing literature does not provide a theory to account for the effect of children's sensory experiences on family occupations. Parents of six children who were typically developing and six children who had autism were interviewed. We analyzed the data using open, axial, and selective coding techniques. Children's sensory experiences affect family occupations in three ways: (1) what a family chooses to do or not do; (2) how the family prepares; and (3) the extent to which experiences, meaning, and feelings are shared. PMID:22389942
Achievement goals affect metacognitive judgments
Ikeda, Kenji; Yue, Carole L.; Murayama, Kou; Castel, Alan D.
2017-01-01
The present study examined the effect of achievement goals on metacognitive judgments, such as judgments of learning (JOLs) and metacomprehension judgments, and actual recall performance. We conducted five experiments manipulating the instruction of achievement goals. In each experiment, participants were instructed to adopt mastery-approach goals (i.e., develop their own mental ability through a memory task) or performance-approach goals (i.e., demonstrate their strong memory ability by getting a high score on a memory task). The results of Experiments 1 and 2 showed that JOLs of word pairs in the performance-approach goal condition tended to be higher than those in the mastery-approach goal condition. In contrast, cued recall performance did not differ between the two goal conditions. Experiment 3 also demonstrated that metacomprehension judgments of text passages were higher in the performance-approach goal condition than in the mastery-approach goal condition, whereas test performance did not differ between conditions. These findings suggest that achievement motivation affects metacognitive judgments during learning, even when achievement motivation does not influence actual performance. PMID:28983496
The Changing Nature of Volunteering and the Cross-Border Mobility: Where Does Learning Come from?
ERIC Educational Resources Information Center
Pantea, Maria-Carmen
2013-01-01
This paper revisits the more conventional approaches of volunteering, by looking into the experiences of young people involved in long-term cross-border volunteering in Romania. Drawing on qualitative interviews with European Voluntary Service volunteers, the paper examines how this experience is intersecting their learning trajectories. The…
Appreciating the Complexity of Project Management Execution: Using Simulation in the Classroom
ERIC Educational Resources Information Center
Hartman, Nathan S.; Watts, Charles A.; Treleven, Mark D.
2013-01-01
As the popularity and importance of project management increase, so does the need for well-prepared project managers. This article discusses our experiences using a project management simulation in undergraduate and MBA classes to help students better grasp the complexity of project management. This approach gives students hands-on experience with…
ERIC Educational Resources Information Center
Triomphe, Bernard; Floquet, Anne; Kamau, Geoffrey; Letty, Brigid; Vodouhe, Simplice Davo; Ng'ang'a, Teresiah; Stevens, Joe; van den Berg, Jolanda; Selemna, Nour; Bridier, Bernard; Crane, Todd; Almekinders, Cornelia; Waters-Bayer, Ann; Hocde, Henri
2013-01-01
Purpose: Within the context of the European-funded JOLISAA project (JOint Learning in and about Innovation Systems in African Agriculture), an inventory of agricultural innovation experiences was made in Benin, Kenya and South Africa. The objective was to assess multi-stakeholder agricultural innovation processes involving smallholders. Approach:…
The Earth's Core: How Does It Work? Perspectives in Science. Number 1.
ERIC Educational Resources Information Center
Carnegie Institution of Washington, Washington, DC.
Various research studies designed to enhance knowledge about the earth's core are discussed. Areas addressed include: (1) the discovery of the earth's core; (2) experimental approaches used in studying the earth's core (including shock-wave experiments and experiments at high static pressures), the search for the core's light elements, the…
"Make It New": Introducing Poetry Through Writing Poetry.
ERIC Educational Resources Information Center
Lim, Shirley
One approach to introducing students to poetry is to have them write and analyze their own poems. Although this approach has some disadvantages, it does serve to tap students' experiences and expressive potential with creative projects and to give them an immediate and direct relationship with the traditional published works. By writing poems…
Schneiderman, Steven J; Johnson, Roger W; Menkhaus, Todd J; Gilcrease, Patrick C
2015-03-01
While softwoods represent a potential feedstock for second-generation ethanol production, compounds present in their hydrolysates can inhibit fermentation. In this study, a novel Design of Experiments (DoE) approach was used to identify significant inhibitory effects on Saccharomyces cerevisiae D5A for the purpose of guiding kinetic model development. Although acetic acid, furfural, and 5-hydroxymethyl furfural (HMF) were present at potentially inhibitory levels, initial factorial experiments identified only ethanol as a significant rate inhibitor. It was hypothesized that high ethanol levels masked the effects of other inhibitors, and a subsequent factorial design without ethanol found significant effects for all other compounds. When these non-ethanol effects were accounted for in the kinetic model, R̄² was significantly improved over an ethanol-inhibition-only model (R̄² = 0.80 vs. 0.76). In conclusion, when ethanol masking effects are removed, DoE is a valuable tool to identify significant non-ethanol inhibitors and guide kinetic model development. Copyright © 2014 Elsevier Ltd. All rights reserved.
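The factorial screening logic described above, estimating each main effect and ranking by magnitude, can be sketched as follows. The rate model and its coefficients are invented for illustration; they are not the paper's fitted values:

```python
from itertools import product

def main_effects(factor_names, response):
    """Main-effect estimates from a 2-level full factorial in coded -1/+1 units."""
    runs = list(product((-1, 1), repeat=len(factor_names)))
    observations = [response(dict(zip(factor_names, run))) for run in runs]
    effects = {}
    for position, name in enumerate(factor_names):
        high = [y for run, y in zip(runs, observations) if run[position] == 1]
        low = [y for run, y in zip(runs, observations) if run[position] == -1]
        effects[name] = sum(high) / len(high) - sum(low) / len(low)
    return effects

# Hypothetical growth-rate model in which ethanol dominates the response,
# dwarfing the smaller acetic acid and furfural effects.
def rate(levels):
    return (1.0 - 0.40 * levels["ethanol"]
                - 0.05 * levels["acetic"]
                - 0.03 * levels["furfural"])

effects = main_effects(["ethanol", "acetic", "furfural"], rate)
print(max(effects, key=lambda f: abs(effects[f])))  # ethanol
```

With noise added, the small effects would fall below significance next to ethanol's large one, which is the masking behavior that motivated the authors' follow-up design without ethanol.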
Community Visions for the Paducah Gaseous Diffusion Plant Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ormsbee, Lindell e; Kipp, James A
2011-09-01
This report focuses on assessing community preferences for the future use of the PGDP site, given the site's pending closure by US DOE. The project approach fostered interaction and engagement with the public based on lessons learned at other complex DOE environmental cleanup sites and upon the integration of a number of principles and approaches to public engagement from the Project Team's local, state, regional and international public engagement experience. The results of the study provide the community with a record of the diversity of values and preferences related to the environmental cleanup and future use of the site.
Bergeest, Jan-Philip; Rohr, Karl
2012-10-01
In high-throughput applications, accurate and efficient segmentation of cells in fluorescence microscopy images is of central importance for the quantification of protein expression and the understanding of cell function. We propose an approach for segmenting cell nuclei which is based on active contours using level sets and convex energy functionals. Compared to previous work, our approach determines the global solution. Thus, the approach does not suffer from local minima and the segmentation result does not depend on the initialization. We consider three different well-known energy functionals for active contour-based segmentation and introduce convex formulations of these functionals. We also suggest a numeric approach for efficiently computing the solution. The performance of our approach has been evaluated using fluorescence microscopy images from different experiments comprising different cell types. We have also performed a quantitative comparison with previous segmentation approaches. Copyright © 2012 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bodnarczuk, M.
In this paper, I describe a conceptual framework that uses DOE Order 5700.6C and more than 140 other DOE Orders as an integrated management system -- but I describe it within the context of the broader sociological and cultural issues of doing research at DOE funded facilities. The conceptual framework has two components. The first involves an interpretation of the 10 criteria of DOE 5700.6C that is tailored for a research environment. The second component involves using the 10 criteria as functional categories that orchestrate and integrate the other DOE Orders into a total management system. The Fermilab approach aims at reducing (or eliminating) the redundancy and overlap within the DOE Orders system at the contractor level.
Robust Evaluation Matrix: Towards a More Principled Offline Exploration of Instructional Policies
ERIC Educational Resources Information Center
Doroudi, Shayan; Aleven, Vincent; Brunskill, Emma
2017-01-01
The gold standard for identifying more effective pedagogical approaches is to perform an experiment. Unfortunately, frequently a hypothesized alternate way of teaching does not yield an improved effect. Given the expense and logistics of each experiment, and the enormous space of potential ways to improve teaching, it would be highly preferable if…
ERIC Educational Resources Information Center
Perone, Sam P.
The objective of this project has been the development of a successful approach for the incorporation of on-line computer technology into the undergraduate chemistry laboratory. This approach assumes no prior programming, electronics, or instrumental analysis experience on the part of the student; it does not displace the chemistry content with…
A Cognitive Approach to Experimental Amnesia
ERIC Educational Resources Information Center
Lewis, Donald J.
1976-01-01
A review of selected experiments indicates that not all examples of experimental amnesia are due to the failure of a memory to fixate. In sum, the empirical retrograde amnesia gradient does not necessarily support traditional consolidation theory. (Editor)
An Approach Toward Synthesis of Bridgmanite in Dynamic Compression Experiments
NASA Astrophysics Data System (ADS)
Reppart, J. J.
2015-12-01
Bridgmanite occurs in heavily shocked meteorites and provides a useful constraint on pressure-temperature conditions during shock metamorphism. Its occurrence also provides constraints on the shock release path. Shock release and shock duration are important parameters in estimating the size of the impactors that generate the observed shock-metamorphic record. Thus, it is timely to examine whether bridgmanite can be synthesized in dynamic compression experiments, with the goal of establishing a correlation between shock duration and grain size. Up to now, only one high-pressure polymorph of an Mg-silicate (wadsleyite) has been both synthesized and recovered in a shock experiment. Therefore, it is not a given that shock synthesis of bridgmanite is possible. This project started recently, so we present an outline of the shock experiment designs and, potentially, results from the first experiments. Funding acknowledgment: This research was sponsored in part by the National Nuclear Security Administration under the Stewardship Science Academic Alliances program through DOE Cooperative Agreement #DE-NA0001982 (UNLV HiPSEC). Portions of this work were performed at HPCAT (Sector 16), Advanced Photon Source (APS), Argonne National Laboratory. HPCAT operations are supported by DOE-NNSA under Award No. DE-NA0001974 and DOE-BES under Award No. DE-FG02-99ER45775, with partial instrumentation funding by NSF. APS is supported by DOE-BES under Contract No. DE-AC02-06CH11357.
ERIC Educational Resources Information Center
Smart, Fiona
2017-01-01
Learning from experience is integral to professional development, with the processes by which it is expected and enabled varying with context and discipline. There is general consensus that it does not just happen; rather, learning from experience is a deliberate act. In higher education, much attention is given to reflective practice and…
Experience of the Small Grower
Emmett Jordon
2002-01-01
The small grower cannot afford costly facilities, equipment, or containers. The grower usually does almost everything by hand and gets help from family members and friends to get started. I describe the approach I use to produce longleaf pine container stock.
Double Dutch: A Tool for Designing Combinatorial Libraries of Biological Systems.
Roehner, Nicholas; Young, Eric M; Voigt, Christopher A; Gordon, D Benjamin; Densmore, Douglas
2016-06-17
Recently, semirational approaches that rely on combinatorial assembly of characterized DNA components have been used to engineer biosynthetic pathways. In practice, however, it is not practical to assemble and test millions of pathway variants in order to elucidate how different DNA components affect the behavior of a pathway. To address this challenge, we apply a rigorous mathematical approach known as design of experiments (DOE) that can be used to construct empirical models of system behavior without testing all variants. To support this approach, we have developed a tool named Double Dutch, which uses a formal grammar and heuristic algorithms to automate the process of DOE library design. Compared to designing by hand, Double Dutch enables users to more efficiently and scalably design libraries of pathway variants that can be used in a DOE framework and uniquely provides a means to flexibly balance design considerations of statistical analysis, construction cost, and risk of homologous recombination, thereby demonstrating the utility of automating decision making when faced with complex design trade-offs.
Isolated effects of number of acquisition trials on extinction of rat conditioned approach behavior.
Gottlieb, Daniel A; Prince, Emily B
2012-05-01
Four conditioned approach experiments with rats assessed the effects of the number of acquisition trials on extinction of conditioned responding, with the number of acquisition sessions and total acquisition time held constant. In Experiment 1, 32 trials per acquisition session led to more extinction responding than did 1 or 2 trials per session, but less than did 4 trials per session. In Experiment 2, 2 trials per acquisition session led to more spontaneous recovery than did 32 trials per session. These latter findings are reminiscent of the overtraining extinction effect (OEE). Experiment 3 attempted to reduce the OEE with a preconditioning phase of partial reinforcement. Experiment 4 attempted to reduce the beneficial within-subject effects of increasing the number of acquisition trials on extinction observed by Gottlieb and Rescorla (2010) by extinguishing stimuli in different sessions. Overall, the results suggest a procedural asymmetry: between subjects, increasing the number of acquisition trials does not lead to greater persistence of responding during extinction; within subjects, it does. Results are discussed from an associative perspective, focusing on explanations involving either frustration or comparator mechanisms, and from an information-processing perspective, focusing on Rate Estimation Theory. Copyright © 2012. Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Babanova, Sofia; Artyushkova, Kateryna; Ulyanova, Yevgenia; Singhal, Sameer; Atanassov, Plamen
2014-01-01
Two statistical methods, design of experiments (DOE) and principal component analysis (PCA) are employed to investigate and improve performance of air-breathing gas-diffusional enzymatic electrodes. DOE is utilized as a tool for systematic organization and evaluation of various factors affecting the performance of the composite system. Based on the results from the DOE, an improved cathode is constructed. The current density generated utilizing the improved cathode (755 ± 39 μA cm-2 at 0.3 V vs. Ag/AgCl) is 2-5 times higher than the highest current density previously achieved. Three major factors contributing to the cathode performance are identified: the amount of enzyme, the volume of phosphate buffer used to immobilize the enzyme, and the thickness of the gas-diffusion layer (GDL). PCA is applied as an independent confirmation tool to support conclusions made by DOE and to visualize the contribution of factors in individual cathode configurations.
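The PCA cross-check described above can be sketched in Python; the closed-form 2x2 eigendecomposition below is a toy stand-in for the authors' analysis, and the data points and variance threshold are invented for illustration:

```python
import math

# Minimal PCA on two standardized columns via the closed-form eigenvalues of
# a 2x2 covariance matrix -- a toy stand-in for the PCA used to cross-check
# the DOE conclusions. Input data below are invented, not the study's.
def pc1_variance_fraction(xs, ys):
    """Fraction of total variance captured by the first principal component."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy ** 2
    l1 = tr / 2 + math.sqrt(tr ** 2 / 4 - det)  # leading eigenvalue
    return l1 / tr

xs = [1.0, 2.0, 3.0, 4.0]
ys = [1.1, 1.9, 3.2, 3.8]  # nearly collinear with xs, so PC1 dominates
frac = pc1_variance_fraction(xs, ys)
```

When two factor columns move together like this, a single component explaining most of the variance is the PCA-side signal that the factors contribute jointly to the observed response.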
NASA Astrophysics Data System (ADS)
Lingadurai, K.; Nagasivamuni, B.; Muthu Kamatchi, M.; Palavesam, J.
2012-06-01
Wire electrical discharge machining (WEDM) is a specialized thermal machining process capable of accurately machining parts of hard materials with complex shapes. Parts with sharp edges that are difficult to machine by mainstream machining processes can be machined easily by WEDM. In this work, a Design of Experiments (DOE) approach is applied to the WEDM of stainless steel AISI grade 304, a material used in cryogenic vessels, evaporators, hospital surgical equipment, marine equipment, fasteners, nuclear vessels, feed-water tubing, valves, and refrigeration equipment, machined here with a brass wire electrode. The DOE method is used to formulate the experimental layout, to analyze the effect of each parameter on the machining characteristics, and to predict the optimal choice for each WEDM parameter: voltage, pulse-on time, pulse-off time, and wire feed. These parameters are found to have a significant influence on machining characteristics such as metal removal rate (MRR), kerf width, and surface roughness (SR). The DOE analysis reveals that, in general, the pulse-on time significantly affects the kerf width, the wire feed rate affects the SR, and the input voltage mainly affects the MRR.
Decaris, Martin L; Leach, J Kent
2011-04-01
The presentation of extracellular matrix (ECM) proteins provides an opportunity to instruct the phenotype and behavior of responsive cells. Decellularized cell-secreted matrix coatings (DM) represent a biomimetic culture surface that retains the complexity of the natural ECM. Microenvironmental culture conditions alter the composition of these matrices and ultimately the ability of DMs to direct cell fate. We employed a design of experiments (DOE) multivariable analysis approach to determine the effects and interactions of four variables (culture duration, cell seeding density, oxygen tension, and media supplementation) on the capacity of DMs to direct the osteogenic differentiation of human mesenchymal stem cells (hMSCs). DOE analysis revealed that matrices created with extended culture duration, ascorbate-2-phosphate supplementation, and in ambient oxygen tension exhibited significant correlations with enhanced hMSC differentiation. We validated the DOE model results using DMs predicted to have superior (DM1) or lesser (DM2) osteogenic potential for naïve hMSCs. Compared to cells on DM2, hMSCs cultured on DM1 expressed 2-fold higher osterix levels and deposited 3-fold more calcium over 3 weeks. Cells on DM1 coatings also exhibited greater proliferation and viability compared to DM2-coated substrates. This study demonstrates that DOE-based analysis is a powerful tool for optimizing engineered systems by identifying significant variables that have the greatest contribution to the target output.
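The multivariable analysis above can be sketched as a coded two-level full factorial over the four named variables. The sketch below is illustrative only: the factor levels and the synthetic response are assumptions, not the study's measurements.

```python
from itertools import product

# Hypothetical coded two-level full factorial over the four culture variables
# named in the abstract; the toy response is invented for illustration.
factors = ["duration", "seeding_density", "oxygen", "supplement"]

def full_factorial(k):
    """All 2**k runs of a two-level design, coded as -1/+1."""
    return [list(run) for run in product([-1, 1], repeat=k)]

def main_effects(design, response):
    """Per-factor effect: mean(y at +1) minus mean(y at -1)."""
    effects = []
    for j in range(len(design[0])):
        hi = [y for run, y in zip(design, response) if run[j] == 1]
        lo = [y for run, y in zip(design, response) if run[j] == -1]
        effects.append(sum(hi) / len(hi) - sum(lo) / len(lo))
    return effects

design = full_factorial(len(factors))  # 16 runs
# synthetic response: duration dominates, supplementation matters, plus drift
y = [3 * r[0] + r[3] + 0.1 * i for i, r in enumerate(design)]
effects = dict(zip(factors, main_effects(design, y)))
```

Ranking the absolute effects is the DOE step that identifies which variables, such as culture duration and supplementation here, correlate most strongly with the output.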
ERIC Educational Resources Information Center
Taft, Darryl
2011-01-01
Ned Miller does not take security lightly. As director of campus safety and emergency management at the Des Moines Area Community College (DMACC), any threat requires serious consideration. As community college administrators adopt a more proactive approach to campus safety, many institutions are experimenting with emerging technologies, including…
Insuring the uninsured: finding the road to success.
Chollet, Deborah
2005-01-01
This article outlines various strategies that have been proposed to expand health insurance. Many have been tried in limited ways, and the article describes the experience with those attempts. The discussion is organized from the perspective of the opposing points of view: approaches that would support private coverage and largely rely on demand incentives and approaches that presuppose a more direct government role. The article reaches no conclusion about which strategy might be a wiser course of action. However, it does take measure of the likely effects of each strategy where early experience or objective analysis is available.
Maeda, Jin; Suzuki, Tatsuya; Takayama, Kozo
2012-12-01
A large-scale design space was constructed using a Bayesian estimation method with a small-scale design of experiments (DoE) and small sets of large-scale manufacturing data without enforcing a large-scale DoE. The small-scale DoE was conducted using various Froude numbers (X(1)) and blending times (X(2)) in the lubricant blending process for theophylline tablets. The response surfaces, design space, and their reliability of the compression rate of the powder mixture (Y(1)), tablet hardness (Y(2)), and dissolution rate (Y(3)) on a small scale were calculated using multivariate spline interpolation, a bootstrap resampling technique, and self-organizing map clustering. The constant Froude number was applied as a scale-up rule. Three experiments under an optimal condition and two experiments under other conditions were performed on a large scale. The response surfaces on the small scale were corrected to those on a large scale by Bayesian estimation using the large-scale results. Large-scale experiments under three additional sets of conditions showed that the corrected design space was more reliable than that on the small scale, even if there was some discrepancy in the pharmaceutical quality between the manufacturing scales. This approach is useful for setting up a design space in pharmaceutical development when a DoE cannot be performed at a commercial large manufacturing scale.
NASA Astrophysics Data System (ADS)
Cooper, Leon N.
2015-01-01
Part I. Science and Society: 1. Science and human experience; 2. Does science undermine our values?; 3. Can science serve mankind?; 4. Modern science and contemporary discomfort: metaphor and reality; 5. Faith and science; 6. Art and science; 7. Fraud in science; 8. Why study science? The keys to the cathedral; 9. Is evolution a theory? A modest proposal; 10. The silence of the second; 11. Introduction to Copenhagen; 12. The unpaid debt; Part II. Thought and Consciousness: 13. Source and limits of human intellect; 14. Neural networks; 15. Thought and mental experience: the Turing test; 16. Mind as machine: will we rubbish human experience?; 17. Memory and memories: a physicist's approach to the brain; 18. On the problem of consciousness; Part III. On the Nature and Limits of Science: 19. What is a good theory?; 20. Shall we deconstruct science?; 21. Visible and invisible in physical theory; 22. Experience and order; 23. The language of physics; 24. The structure of space; 25. Superconductivity and other insoluble problems; 26. From gravity to light and consciousness: does science have limits?
Yazdi, Ashkan K; Smyth, Hugh D C
2017-03-01
To optimize the air-jet milling conditions of ibuprofen (IBU) using a design of experiments (DoE) method, and to test the generalizability of the optimized conditions for the processing of another non-steroidal anti-inflammatory drug (NSAID). Bulk IBU was micronized using an Aljet mill according to a circumscribed central composite (CCC) design, with grinding and pushing nozzle pressures (GrindP, PushP) varying from 20 to 110 psi. Output variables included yield and the particle diameters at the 50th and 90th percentiles (D50, D90). Following data analysis, optimized conditions were identified and tested to produce IBU particles with a minimum size and an acceptable yield. Finally, indomethacin (IND) was milled using both the optimized conditions and the control conditions. Eight of the ten CCC design runs for milling IBU were successful; the others failed due to powder "blowback" from the feed hopper. DoE analysis identified optimized GrindP and PushP values of 75 and 65 psi, respectively. In subsequent validation experiments using the optimized conditions, the experimental D50 and D90 values (1.9 and 3.6 μm) corresponded closely with the values predicted by the DoE model. Additionally, the optimized conditions were superior to the control conditions for the micronization of IND, producing smaller D50 and D90 values (1.2 and 2.7 μm vs. 1.8 and 4.4 μm). The optimization of single-step air-jet milling of IBU using the DoE approach thus elucidated optimal milling conditions, which were then also used to micronize IND.
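A circumscribed central composite layout of the kind used above can be generated as follows. The coded-to-psi mapping and the two center points are assumptions chosen to be consistent with the ten runs and 20-110 psi range mentioned in the abstract, not details taken from the paper:

```python
from itertools import product
import math

# Sketch of a coded two-factor circumscribed central composite (CCC) design:
# 4 factorial corners, 4 axial ("star") points at +/-alpha, plus center runs.
def ccc_design(n_center=2):
    alpha = math.sqrt(2)  # rotatable choice for k = 2 factors
    corners = [list(p) for p in product([-1.0, 1.0], repeat=2)]
    axial = [[alpha, 0.0], [-alpha, 0.0], [0.0, alpha], [0.0, -alpha]]
    center = [[0.0, 0.0] for _ in range(n_center)]
    return corners + axial + center

def decode(coded, lo=20.0, hi=110.0):
    """Map a coded level onto a pressure in psi; note that in a CCC the
    axial points intentionally fall outside the corner range."""
    mid, half = (lo + hi) / 2, (hi - lo) / 2
    return mid + coded * half

runs = ccc_design()  # 10 runs, matching the run count in the abstract
pressures = [(decode(a), decode(b)) for a, b in runs]
```

The corner and axial points let a quadratic response surface be fitted, which is what makes locating an interior optimum (such as 75/65 psi here) possible.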
Kumar, Ramya; Lahann, Joerg
2016-07-06
The performance of polymer interfaces in biology is governed by a wide spectrum of interfacial properties. With the ultimate goal of identifying design parameters for stem cell culture coatings, we developed a statistical model that describes the dependence of brush properties on surface-initiated polymerization (SIP) parameters. Employing a design of experiments (DOE) approach, we identified operating boundaries within which four gel architecture regimes can be realized, including a new regime of associated brushes in thin films. Our statistical model can accurately predict the brush thickness and the degree of intermolecular association of poly[{2-(methacryloyloxy) ethyl} dimethyl-(3-sulfopropyl) ammonium hydroxide] (PMEDSAH), a previously reported synthetic substrate for feeder-free and xeno-free culture of human embryonic stem cells. DOE-based multifunctional predictions offer a powerful quantitative framework for designing polymer interfaces. For example, model predictions can be used to decrease the critical thickness at which the wettability transition occurs by simply increasing the catalyst quantity from 1 to 3 mol %.
Michaelis, Marc; Leopold, Claudia S
2015-12-30
The tack of a pressure-sensitive adhesive (PSA) is not an inherent material property and strongly depends on the measurement conditions. Following the concept of a measurement system analysis (MSA), influencing factors of the probe tack test were investigated by a design of experiments (DoE) approach. A response surface design with 38 runs was built to evaluate the influence of detachment speed, dwell time, contact force, adhesive film thickness, and API content on tack, determined as the maximum of the stress-strain curve (σmax). It could be shown that all investigated factors have a significant effect on the response and that the DoE approach allowed the detection of two-factor interactions between the dwell time, the contact force, the adhesive film thickness, and the API content. Surprisingly, it was found that tack increases with decreasing, not increasing, adhesive film thickness. Copyright © 2015. Published by Elsevier B.V.
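The two-factor interactions mentioned above rest on contrast estimates of the kind sketched below. This is a toy two-level design with an invented response, not the study's 38-run response surface design, and the factor labels are only placeholders:

```python
from itertools import product
from math import prod

# Contrast estimate for main effects and interactions in a two-level design.
# A single column index gives a main effect; two indices give the
# corresponding two-factor interaction.
def effect(design, y, cols):
    signs = [prod(run[j] for j in cols) for run in design]
    hi = [yi for s, yi in zip(signs, y) if s == 1]
    lo = [yi for s, yi in zip(signs, y) if s == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

design = [list(r) for r in product([-1, 1], repeat=2)]
# toy tack response with a genuine A*B interaction term (coefficient 0.5)
y = [10 + 2 * a + 1 * b + 0.5 * a * b for a, b in design]
main_a = effect(design, y, [0])            # 2 * coefficient of A
interaction_ab = effect(design, y, [0, 1]) # 2 * coefficient of A*B
```

A nonzero interaction contrast means the effect of one factor (say dwell time) depends on the level of another (say contact force), which a one-factor-at-a-time study would miss.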
Experiments in Error Propagation within Hierarchal Combat Models
2015-09-01
Acronyms: BIC, Bayesian Information Criterion; CNO, Chief of Naval Operations; DOE, Design of Experiments; DOD, Department of Defense; MANA, Map Aware Non-uniform Automata. The thesis takes a "ground up" approach: first, it develops a mission-level model for one-on-one submarine combat in Map Aware Non-uniform Automata (MANA), an agent-based simulation that can model the different postures of submarines; it then feeds the results from MANA into stochastic…
ERIC Educational Resources Information Center
Nunez, Mario A.
This report explores two bilingual educational approaches currently in use in Mexico and the United States. The study pursues a limited comparison between two modalities of bilingual instruction, as observed and reported in the consulted literature. The U.S. model featured is known as the two-way bilingual model, an additive approach to…
Northoff, Georg
2016-01-15
Psychopathology, as the investigation and classification of experience, behavior, and symptoms in psychiatric patients, is an old discipline that dates back to the end of the 19th century. Since then, different approaches to psychopathology have been suggested. Recent investigations showing abnormalities in the brain at different levels raise the question of how the gap between brain and psyche, between neural abnormalities and alterations in experience and behavior, can be bridged. Historical approaches like descriptive (Jaspers) and structural (Minkowski) psychopathology, as well as the more current phenomenological psychopathology (Parnas, Fuchs, Sass, Stanghellini), remain on the side of the psyche, giving detailed descriptions of the phenomenal level of experience while leaving open the link to the brain. In contrast, the recently introduced Research Domain Criteria (RDoC) aim at explicitly linking brain and psyche by starting from so-called 'neuro-behavioral constructs'. How does Spatiotemporal Psychopathology, as demonstrated in the first paper on depression, stand in relation to these approaches? In a nutshell, Spatiotemporal Psychopathology aims to bridge the gap between brain and psyche. Specifically, as demonstrated for depression in the first paper, the focus is on the spatiotemporal features of the brain's intrinsic activity and how they are transformed into corresponding spatiotemporal features in experience on the phenomenal level and into behavioral changes, which can well account for the symptoms in these patients. This second paper focuses on some of the theoretical background assumptions of Spatiotemporal Psychopathology by directly comparing it to descriptive, structural, and phenomenological psychopathology as well as to RDoC. Copyright © 2015 Elsevier B.V. All rights reserved.
Oetjen, Janina; Lachmund, Delf; Palmer, Andrew; Alexandrov, Theodore; Becker, Michael; Boskamp, Tobias; Maass, Peter
2016-09-01
A standardized workflow for matrix-assisted laser desorption/ionization imaging mass spectrometry (MALDI imaging MS) is a prerequisite for the routine use of this promising technology in clinical applications. We present an approach to develop standard operating procedures for MALDI imaging MS sample preparation of formalin-fixed and paraffin-embedded (FFPE) tissue sections based on a novel quantitative measure of dataset quality. To cover many parts of the complex workflow and simultaneously test several parameters, experiments were planned according to a fractional factorial design of experiments (DoE). The effect of ten different experiment parameters was investigated in two distinct DoE sets, each consisting of eight experiments. FFPE rat brain sections were used as standard material because of low biological variance. The mean peak intensity and a recently proposed spatial complexity measure were calculated for a list of 26 predefined peptides obtained by in silico digestion of five different proteins and served as quality criteria. A five-way analysis of variance (ANOVA) was applied on the final scores to retrieve a ranking of experiment parameters with increasing impact on data variance. Graphical abstract MALDI imaging experiments were planned according to fractional factorial design of experiments for the parameters under study. Selected peptide images were evaluated by the chosen quality metric (structure and intensity for a given peak list), and the calculated values were used as an input for the ANOVA. The parameters with the highest impact on the quality were deduced and SOPs recommended.
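A fractional factorial with the "eight experiments per set" shape described above can be generated as below. The 2^(5-2) generators D = A*B and E = A*C are a common textbook choice, not necessarily the ones the authors used:

```python
from itertools import product

# A 2^(5-2) resolution III fractional factorial: base factors A, B, C with
# generators D = A*B and E = A*C. Eight runs screen five parameters, which
# mirrors the "eight experiments per set" structure of the abstract.
def frac_factorial_5_2():
    return [[a, b, c, a * b, a * c] for a, b, c in product([-1, 1], repeat=3)]

design = frac_factorial_5_2()
# every column is balanced: four runs at -1 and four at +1
balanced = all(sum(col) == 0 for col in zip(*design))
```

The fraction trades resolution for economy: main effects of all five parameters can be ranked (e.g., by ANOVA, as in the abstract) from only eight runs, at the cost of aliasing some interactions with main effects.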
Real-time PCR probe optimization using design of experiments approach.
Wadle, S; Lehnert, M; Rubenwolf, S; Zengerle, R; von Stetten, F
2016-03-01
Primer and probe sequence designs are among the most critical input factors in real-time polymerase chain reaction (PCR) assay optimization. In this study, we present the use of statistical design of experiments (DOE) approach as a general guideline for probe optimization and more specifically focus on design optimization of label-free hydrolysis probes that are designated as mediator probes (MPs), which are used in reverse transcription MP PCR (RT-MP PCR). The effect of three input factors on assay performance was investigated: distance between primer and mediator probe cleavage site; dimer stability of MP and target sequence (influenza B virus); and dimer stability of the mediator and universal reporter (UR). The results indicated that the latter dimer stability had the greatest influence on assay performance, with RT-MP PCR efficiency increased by up to 10% with changes to this input factor. With an optimal design configuration, a detection limit of 3-14 target copies/10 μl reaction could be achieved. This improved detection limit was confirmed for another UR design and for a second target sequence, human metapneumovirus, with 7-11 copies/10 μl reaction detected in an optimum case. The DOE approach for improving oligonucleotide designs for real-time PCR not only produces excellent results but may also reduce the number of experiments that need to be performed, thus reducing costs and experimental times.
Rethinking the Concept of Brilliance
ERIC Educational Resources Information Center
Polyzoi, Eleoussa; Haydey, Donna Copsey
2012-01-01
Despite the richness of Joan Freeman's ethnographic approach to help us define the elusive qualities of giftedness from a personal, lived-experience perspective in "A Quality of Giftedness," she does not offer guidance on strategies for supporting individuals' talents over the lifespan. This commentary states that while Freeman's writing…
Madsen, Cecilie Maria; Feng, Kung-I; Leithead, Andrew; Canfield, Nicole; Jørgensen, Søren Astrup; Müllertz, Anette; Rades, Thomas
2018-01-01
The composition of the human intestinal fluids varies both intra- and inter-individually. This will influence the solubility of orally administered drug compounds, and hence, the absorption and efficacy of compounds displaying solubility limited absorption. The purpose of this study was to assess the influence of simulated intestinal fluid (SIF) composition on the solubility of poorly soluble compounds. Using a Design of Experiments (DoE) approach, a set of 24 SIF was defined within the known compositions of human fasted state intestinal fluid. The SIF were composed of phospholipid, bile salt, and different pH, buffer capacities and osmolarities. On a small scale semi-robotic system, the solubility of 6 compounds (aprepitant, carvedilol, felodipine, fenofibrate, probucol, and zafirlukast) was determined in the 24 SIF. Compound specific models, describing key factors influencing the solubility of each compound, were identified. Although all models were different, the level of phospholipid and bile salt, the pH, and the interactions between these, had the biggest influences on solubility overall. Thus, a reduction of the DoE from five to three factors was possible (11-13 media), making DoE solubility studies feasible compared to single SIF solubility studies. Applying this DoE approach will lead to a better understanding of the impact of intestinal fluid composition on the solubility of a given drug compound. Copyright © 2017 Elsevier B.V. All rights reserved.
Process optimization by use of design of experiments: Application for liposomalization of FK506.
Toyota, Hiroyasu; Asai, Tomohiro; Oku, Naoto
2017-05-01
Design of experiments (DoE) can accelerate the optimization of drug formulations, especially complex formulations such as drugs combined with delivery systems. Administration of FK506 encapsulated in liposomes (FK506 liposomes) is an effective approach to treat acute stroke in animal studies. To provide FK506 liposomes as a brain-protective agent, it is necessary to manufacture these liposomes with good reproducibility. The objective of this study was to confirm the usefulness of DoE for the process-optimization study of FK506 liposomes. The Box-Behnken design was used to evaluate the effect of the process parameters on the properties of FK506 liposomes. The results of multiple regression analysis showed that there was an interaction between the hydration temperature and the freeze-thaw cycle on both the particle size and the encapsulation efficiency. An increase in the PBS hydration volume resulted in an increase in encapsulation efficiency. Process parameters had no effect on the ζ-potential. The multiple regression equation showed good predictability of the particle size and the encapsulation efficiency. These results indicate that manufacturing conditions must be taken into consideration to prepare liposomes with desirable properties. DoE would thus be a promising approach to optimizing the conditions for the manufacturing of liposomes. Copyright © 2017 Elsevier B.V. All rights reserved.
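A Box-Behnken design matrix of the kind used in this study can be generated as follows; the choice of three factors and three center runs is illustrative, not taken from the paper:

```python
from itertools import combinations, product

# Sketch of a Box-Behnken design: for each pair of factors, a 2x2 factorial
# at coded -1/+1 with all remaining factors held at 0, plus center points.
def box_behnken(k=3, n_center=3):
    runs = []
    for i, j in combinations(range(k), 2):
        for a, b in product([-1, 1], repeat=2):
            run = [0] * k
            run[i], run[j] = a, b
            runs.append(run)
    runs.extend([0] * k for _ in range(n_center))
    return runs

design = box_behnken()  # for k=3: 12 edge midpoints + 3 centers = 15 runs
```

Because no run sets all factors to their extremes simultaneously, the design avoids corner conditions (often the harshest to manufacture at) while still supporting the quadratic multiple regression model reported in the abstract.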
Fitness Instructors: How Does Their Knowledge on Weight Loss Measure Up?
ERIC Educational Resources Information Center
Forsyth, Glenys; Handcock, Phil; Rose, Elaine; Jenkins, Carolyn
2005-01-01
Objective: To examine the knowledge, approaches and attitudes of fitness instructors dealing with clients seeking weight loss advice. Design: A qualitative project whereby semi-structured interviews were conducted with ten fitness instructors representing a range of qualifications, work settings and years of experience. Setting: Interviews were…
Academic Advising: Does It Really Impact Student Success?
ERIC Educational Resources Information Center
Young-Jones, Adena D.; Burt, Tracie D.; Dixon, Stephanie; Hawthorne, Melissa J.
2013-01-01
Purpose: This study was designed to evaluate academic advising in terms of student needs, expectations, and success rather than through the traditional lens of student satisfaction with the process. Design/methodology/approach: Student participants (n = 611) completed a survey exploring their expectations of and experience with academic advising.…
45 CFR 2516.500 - How does the Corporation review the merits of an application?
Code of Federal Regulations, 2013 CFR
2013-10-01
..., innovation, replicability, and sustainability of the proposal on the basis of the following criteria: (1... from experience and replicating the approach of the program. (3) Sustainability, as indicated by the... proposal's quality, innovation, replicability, and sustainability; or (9) Address any other priority...
45 CFR 2516.500 - How does the Corporation review the merits of an application?
Code of Federal Regulations, 2012 CFR
2012-10-01
..., innovation, replicability, and sustainability of the proposal on the basis of the following criteria: (1... from experience and replicating the approach of the program. (3) Sustainability, as indicated by the... proposal's quality, innovation, replicability, and sustainability; or (9) Address any other priority...
45 CFR 2516.500 - How does the Corporation review the merits of an application?
Code of Federal Regulations, 2014 CFR
2014-10-01
..., innovation, replicability, and sustainability of the proposal on the basis of the following criteria: (1... from experience and replicating the approach of the program. (3) Sustainability, as indicated by the... proposal's quality, innovation, replicability, and sustainability; or (9) Address any other priority...
Documenting Art Therapy Clinical Knowledge Using Interviews
ERIC Educational Resources Information Center
Regev, Dafna
2017-01-01
Practicing art therapists have vast stores of knowledge and experience, but in most cases, their work is not documented, and their clinical knowledge does not enter the academic discourse. This article proposes a systematic approach to the collection of practice knowledge about art therapy based on conducting interviews with art therapists who…
Children's Voices on Bullying in Kindergarten
ERIC Educational Resources Information Center
Helgeland, Anne; Lund, Ingrid
2017-01-01
Research suggests that bullying does occur in kindergarten. The extent of bullying in Norway and other Scandinavian countries is estimated to be about 12%. The purpose of this study is to investigate children's understanding and experiences of bullying. We use a qualitative approach and have conducted individual interviews and focus group…
Flight evaluation of the terminal guidance system
NASA Technical Reports Server (NTRS)
Sandlin, D. R.
1981-01-01
The terminal guidance system (TGS) is avionic equipment that gives guidance along a curved, descending flight path to a landing. A Cessna 182 was used as the test aircraft, and the TGS was installed and connected to the altimeter, DME, RMI, and gyro compass. Approaches were flown by three different pilots. When the aircraft arrives at the termination point, it is set up on final approach for a landing. The TGS provides guidance for curved descending approaches with glideslopes of 6 deg, which required, for experienced pilots, workloads approximately the same as for an ILS. The glideslope is difficult to track within 1/2 n.m. of the VOR/DME station. The system permits, for experienced pilots, satisfactory approaches with a turn radius as low as 1/2 n.m. and a glideslope of 6 deg. Turn angle has little relation to pilot workload for curved approaches. Pilot experience is a factor for curved approaches: pilots with low instrument time have difficulty flying steep approaches with a small turn radius. Turbulence increases pilot workload for curved approaches. The TGS does not correct to a given flight path over the ground, nor does it adequately compensate for wind drift.
Global alcohol policy and the alcohol industry.
Anderson, Peter
2009-05-01
The WHO is preparing its global strategy on alcohol, and, in so doing, has been asked to consult with the alcohol industry on ways it could contribute in reducing the harm done by alcohol. This review asks which is more effective in reducing harm: the regulatory approaches that the industry does not favour; or the educational approaches that it does favour. The current literature overwhelmingly finds that regulatory approaches (including those that manage the price, availability, and marketing of alcohol) reduce the risk of and the experience of alcohol-related harm, whereas educational approaches (including school-based education and public education campaigns) do not, with industry-funded education actually increasing the risk of harm. The alcohol industry should not be involved in making alcohol policy. Its involvement in implementing policy should be restricted to its role as a producer, distributor, and marketer of alcohol. In particular, the alcohol industry should not be involved in educational programmes, as such involvement could actually lead to an increase in harm.
Ji, Yu; Tian, Yang; Ahnfelt, Mattias; Sui, Lili
2014-06-27
Multivalent pneumococcal vaccines are used worldwide to protect human beings from pneumococcal diseases. In order to eliminate the toxic organic solvents used in the traditional vaccine purification process, an alternative chromatographic process for Streptococcus pneumoniae serotype 23F capsular polysaccharide (CPS) was proposed in this study. The strategy of Design of Experiments (DoE) was introduced into the process development to solve the complicated design procedure. An initial process analysis was performed to review the whole flowchart, identify the critical factors of the chromatography through FMEA, and choose the flow-through mode based on the properties of the feed. A resin screening study then followed to select candidate resins. DoE was utilized to generate a resolution IV fractional factorial design to further compare the candidates and narrow down the design space. After Capto Adhere was selected, a Box-Behnken DoE was executed to model the process and characterize all effects of the factors on the responses. Finally, Monte Carlo simulation was used to optimize the process, test the chosen optimal conditions and define the control limits. The results of three scale-up runs at the set points verified the DoE and simulation predictions. The final results were well in accordance with the EU pharmacopoeia requirements: protein/CPS (w/w) 1.08%; DNA/CPS (w/w) 0.61%; phosphorus content 3.1%; nitrogen content 0.315%; and methylpentose percentage 47.9%. Other tests of the final pure CPS also met the pharmacopoeia specifications. This alternative chromatographic purification process for pneumococcal vaccine without toxic organic solvents was successfully developed by the DoE approach and proved to be scalable, robust and suitable for large-scale manufacturing. Copyright © 2014 Elsevier B.V. All rights reserved.
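The Box-Behnken step mentioned above has a standard coded construction (each pair of factors runs a 2^2 factorial while the remaining factor sits at its centre level); a minimal sketch in generic coded units, not the paper's actual factor settings:

```python
from itertools import combinations, product

def box_behnken(k: int, center_runs: int = 1):
    """Coded-level (-1, 0, +1) Box-Behnken design for k >= 3 factors."""
    runs = []
    for i, j in combinations(range(k), 2):       # each pair of factors...
        for a, b in product((-1, 1), repeat=2):  # ...gets a 2^2 factorial
            row = [0] * k
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0] * k] * center_runs              # centre point(s)
    return runs

design = box_behnken(3)
# 3 factors -> 12 edge-midpoint runs + 1 centre run = 13 experiments
```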
Letizia, M C; Cornaglia, M; Tranchida, G; Trouillon, R; Gijs, M A M
2018-01-22
When studying drug effectiveness towards a target model, one should distinguish the effects of the drug itself from those of all the other factors that could influence the screening outcome. This comprehensive knowledge is crucial, especially when model organisms are used to study the drug effect at a systemic level, as a higher number of factors can influence the drug-testing outcome. Covering the entire experimental domain and studying the effect of the simultaneous change of several factors would require numerous experiments, which are costly and time-consuming. Therefore, a design of experiments (DoE) approach to drug testing is emerging as a robust and efficient method to reduce the use of resources while maximizing knowledge of the process. Here, we used a 3-factor Doehlert DoE to characterize the concentration-dependent effect of the drug doxycycline on the development duration of the nematode Caenorhabditis elegans. To cover the experimental space, 13 experiments were designed and performed, in which different doxycycline concentrations were tested while also varying the temperature and the food amount, both known to influence the duration of C. elegans development. A microfluidic platform was designed to isolate and culture C. elegans larvae while testing the doxycycline effect with full control of temperature and feeding throughout development. Our approach allowed prediction of the doxycycline effect on C. elegans development across the complete drug concentration/temperature/feeding experimental space, maximizing the understanding of the effect of this antibiotic on C. elegans development and paving the way towards a standardized and optimized drug-testing process.
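The 13 experiments quoted above follow from the standard Doehlert point count; a one-line sketch of the formula (generic, nothing specific to this study):

```python
def doehlert_runs(n_factors: int) -> int:
    """Number of distinct points in a Doehlert design: k^2 + k + 1."""
    k = n_factors
    return k * k + k + 1

# 3 factors (here: drug concentration, temperature, food amount) -> 13 runs;
# 2 factors would give the familiar hexagon plus centre point, i.e. 7 runs.
```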
Soema, Peter C; Willems, Geert-Jan; Jiskoot, Wim; Amorij, Jean-Pierre; Kersten, Gideon F
2015-08-01
In this study, the effect of liposomal lipid composition on the physicochemical characteristics and adjuvanticity of liposomes was investigated. Using a design of experiments (DoE) approach, peptide-containing liposomes containing various lipids (EPC, DOPE, DOTAP and DC-Chol) and peptide concentrations were formulated. Liposome size and zeta potential were determined for each formulation. Moreover, the adjuvanticity of the liposomes was assessed in an in vitro dendritic cell (DC) model, by quantifying the expression of DC maturation markers CD40, CD80, CD83 and CD86. The acquired data of these liposome characteristics were successfully fitted with regression models, and response contour plots were generated for each response factor. These models were applied to predict a lipid composition that resulted in a liposome with a target zeta potential. Subsequently, the expression of the DC maturation factors for this lipid composition was predicted and tested in vitro; the acquired maturation responses corresponded well with the predicted ones. These results show that a DoE approach can be used to screen various lipids and lipid compositions, and to predict their impact on liposome size, charge and adjuvanticity. Using such an approach may accelerate the formulation development of liposomal vaccine adjuvants. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
Berglund, Erik; Westerling, Ragnar; Sundström, Johan; Lytsy, Per
2016-12-01
This study aimed to investigate patients' willingness to initiate a preventive treatment and compared two established effect measures to the newly developed Delay of Events (DoE) measure, which expresses treatment effect as a gain in event-free time. In this cross-sectional, randomized survey experiment in the general Swedish population, 1079 respondents (response rate 60.9%) were asked to consider a preventive cardiovascular treatment. Respondents were randomly allocated to one of three effect descriptions: DoE, relative risk reduction (RRR), or absolute risk reduction (ARR). Univariate and multivariate analyses were performed investigating willingness to initiate treatment, views on treatment benefit, motivation and importance to adhere, and willingness to pay (WTP) for treatment. Eighty-one percent were willing to take the medication when the effect was described as DoE, 83.0% when it was described as RRR and 62.8% when it was described as ARR. DoE and RRR were further associated with positive views on treatment benefit, motivation, importance to adhere and WTP. Presenting treatment effect as DoE or RRR was associated with a high willingness to initiate treatment. An approach based on the novel time-based measure DoE may be of value in clinical communication and shared decision making. Copyright © 2016 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.
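The contrast between the ARR and RRR framings above is easiest to see with concrete numbers; a hedged sketch (the risks are illustrative, not taken from the survey):

```python
def risk_measures(control_risk: float, treated_risk: float):
    """Absolute risk reduction, relative risk reduction, and number needed
    to treat for a preventive treatment."""
    arr = control_risk - treated_risk
    rrr = arr / control_risk
    nnt = 1.0 / arr  # patients treated to prevent one event
    return arr, rrr, nnt

# Illustrative numbers: a 10% event risk reduced to 8% sounds modest as
# ARR (2 percentage points) but impressive as RRR (20%).
arr, rrr, nnt = risk_measures(0.10, 0.08)
```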
Design of experiments (DoE) in pharmaceutical development.
N Politis, Stavros; Colombo, Paolo; Colombo, Gaia; M Rekkas, Dimitrios
2017-06-01
At the beginning of the twentieth century, Sir Ronald Fisher introduced the concept of applying statistical analysis during the planning stages of research rather than at the end of experimentation. When statistical thinking is applied from the design phase, it enables quality to be built into the product by adopting Deming's profound knowledge approach, comprising system thinking, understanding of variation, theory of knowledge, and psychology. The pharmaceutical industry was late in adopting these paradigms compared to other sectors. It focused heavily on blockbuster drugs, while formulation development was mainly performed by One Factor At a Time (OFAT) studies rather than by implementing Quality by Design (QbD) and modern engineering-based manufacturing methodologies. Among various mathematical modeling approaches, Design of Experiments (DoE) is extensively used for the implementation of QbD in both research and industrial settings. In QbD, product and process understanding is the key enabler of assuring quality in the final product. Knowledge is achieved by establishing models correlating the inputs with the outputs of the process. The mathematical relationships of the Critical Process Parameters (CPPs) and Critical Material Attributes (CMAs) with the Critical Quality Attributes (CQAs) define the design space. Consequently, process understanding is well assured and rationally leads to a final product meeting the Quality Target Product Profile (QTPP). This review illustrates the principles of quality theory through the work of major contributors, the evolution of the QbD approach and the statistical toolset for its implementation. As such, DoE is presented in detail, since it represents the first choice for rational pharmaceutical development.
Weighted hybrid technique for recommender system
NASA Astrophysics Data System (ADS)
Suriati, S.; Dwiastuti, Meisyarah; Tulus, T.
2017-12-01
Recommender systems have become very popular and play an important role in information systems and webpages nowadays. A recommender system tries to predict which items a user may like based on his activity on the system. There are some familiar techniques for building a recommender system, such as content-based filtering and collaborative filtering. Content-based filtering does not involve the opinions of other users in making the prediction, while collaborative filtering does, so collaborative filtering can predict more accurately. However, collaborative filtering cannot give predictions for items that have never been rated by any user. To cover the drawbacks of each approach with the advantages of the other, the two approaches can be combined in what is known as a hybrid technique. The hybrid technique used in this work is the weighted technique, in which the prediction score is a linear combination of the scores produced by the combined techniques. The purpose of this work is to show how a weighted hybrid technique combining content-based filtering and item-based collaborative filtering can work in a movie recommender system, and to compare performance when both approaches are combined and when each approach works alone. Three experiments were done in this work, combining both techniques with different parameters. The results show that the weighted hybrid technique does not substantially boost performance, but it does provide prediction scores for unrated movies that could not be recommended using collaborative filtering alone.
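The weighted combination described above can be sketched directly; a minimal illustration (the weight `w` and the cold-start fallback rule are assumptions for the example, not details from the paper):

```python
def hybrid_score(cb_score, cf_score, w=0.5):
    """Weighted hybrid prediction: a linear combination of the content-based
    (cb) and collaborative-filtering (cf) scores.

    When CF has no prediction (item never rated by any user), fall back to
    the content-based score alone.
    """
    if cf_score is None:  # cold-start item
        return cb_score
    return w * cb_score + (1 - w) * cf_score

# Rated movie: blend both predictors; unrated movie: CB carries the score.
blended = hybrid_score(4.0, 3.0, w=0.25)   # 0.25*4.0 + 0.75*3.0
cold    = hybrid_score(4.2, None, w=0.25)  # CF unavailable
```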
Teaching undergraduate mathematics in interactive groups: how does it fit with students' learning?
NASA Astrophysics Data System (ADS)
Sheryn, Louise; Ell, Fiona
2014-08-01
Debates about how undergraduate mathematics should be taught are informed by different views of what it is to learn and to do mathematics. In this qualitative study 10 students enrolled in an advanced undergraduate course in mathematics shared their views about how they best learn mathematics. After participating in a semester-long course in combinatorics, taught using a non-traditional, formal group work approach, the 10 students shared their views about how such an approach fitted in with their experience of learning mathematics. A descriptive thematic analysis of the students' responses revealed that despite being very comfortable with the traditional approach to learning new mathematics, most students were open to a formal group work approach and could see benefits from it after their participation. The students' prior conceptions of the goal of undergraduate mathematics learning and their view of themselves as 'mathematicians' framed their experience of learning mathematics in a non-traditional class.
Comprehension Strategies for Middle Grade Learners: A Handbook for Content Area Teachers.
ERIC Educational Resources Information Center
Sadler, Charlotte Rose
Although students are expected to read and comprehend grade-level texts by the time they reach middle school, classroom teachers are constantly challenged to instruct students who have difficulty comprehending what they read. But how does a middle school teacher approach this task, particularly a teacher with limited experience in reading…
Does the Students' Preferred Pedagogy Relate to Their Ethnicity: UK and Asian Experience
ERIC Educational Resources Information Center
Winch, Junko
2016-01-01
An increasing number of international students, whose culture of teaching and learning practices are very different from UK students, are studying at British universities. This study investigates multicultural students' preferences using two different teaching approaches in the 2009/2010 academic year, which is explained in the framework of this…
ERIC Educational Resources Information Center
Yang, Wei
2017-01-01
This paper estimates the impact of "compulsory volunteerism" for adolescents on subsequent volunteer behavior exploiting the introduction of a mandatory community service program for high school (HS) students in Ontario, Canada. We use difference-in-differences approach with a large longitudinal dataset. Our estimates show that the…
Does Inquiry Based Learning Affect Students' Beliefs and Attitudes towards Mathematics?
ERIC Educational Resources Information Center
McGregor, Darren
2014-01-01
Ill-structured tasks presented in an inquiry learning environment have the potential to affect students' beliefs and attitudes towards mathematics. This empirical research followed a Design Experiment approach to explore how aspects of using ill-structured tasks may have affected students' beliefs and attitudes. Results showed this task type and…
The Disabled Student Experience: Does the SERVQUAL Scale Measure Up?
ERIC Educational Resources Information Center
Vaughan, Elizabeth; Woodruffe-Burton, Helen
2011-01-01
Purpose: The purpose of this paper is to empirically test a new disabled service user-specific service quality model ARCHSECRET against a modified SERVQUAL model in the context of disabled students within higher education. Design/methodology/approach: The application of SERVQUAL in the voluntary sector had raised serious issues on its portability…
Does Mixed Methods Research Matter to Understanding Childhood Well-Being?
ERIC Educational Resources Information Center
Jones, Nicola; Sumner, Andy
2009-01-01
There has been a rich debate in development studies on combining research methods in recent years. We explore the particular challenges and opportunities surrounding mixed methods approaches to childhood well-being. We argue that there are additional layers of complexity due to the distinctiveness of children's experiences of deprivation or…
Designing Personalized Online Teaching Professional Development through Self-Assessment
ERIC Educational Resources Information Center
Rhode, Jason; Richter, Stephanie; Miller, Tracy
2017-01-01
Many institutions use a one-size-fits-all approach to faculty development for online teaching, which does not meet the needs of faculty who often have different levels of experience, skill, and self-efficacy in online teaching and learning. To address these issues, the [university name removed] [center name removed] designed and implemented an…
Precise and Efficient Retrieval of Captioned Images: The MARIE Project.
ERIC Educational Resources Information Center
Rowe, Neil C.
1999-01-01
The MARIE project explores knowledge-based information retrieval of captioned images of the kind found in picture libraries and on the Internet. MARIE's five-part approach exploits the idea that images are easier to understand with context, especially descriptive text near them, but it also does image analysis. Experiments show MARIE prototypes…
Project Teaches Students to Diagnose an Ailing Windows OS
ERIC Educational Resources Information Center
Yang, Baijan
2007-01-01
Troubleshooting a corrupted Windows operating system (OS) is a must-learn experience for computer technology students. To teach OS troubleshooting, the simplest approach involves introducing the available tools followed by the "how-to's." But how does a teacher teach his or her students to apply their knowledge in real-life scenarios and help them…
First-Year Faculty of Color: Narratives about Entering the Academy
ERIC Educational Resources Information Center
Cole, Eddie R.; McGowan, Brian L.; Zerquera, Desiree D.
2017-01-01
The experiences of first-year, tenure-track faculty have been missing in the literature about new or junior faculty. Furthermore, the extant literature about new faculty does not offer a critical outlook on how oppressive institutional structures shape how first-year faculty of color approach faculty work. Drawing from analytical narratives, the…
Patel, Rashmin B; Patel, Nilay M; Patel, Mrunali R; Solanki, Ajay B
2017-03-01
The aim of this work was to develop and optimize a robust HPLC method for the separation and quantitation of ambroxol hydrochloride and roxithromycin utilizing a Design of Experiments (DoE) approach. A Plackett-Burman design was used to assess the impact of the independent variables (concentration of organic phase, mobile phase pH, flow rate and column temperature) on peak resolution, USP tailing and number of plates. A central composite design was utilized to evaluate the main, interaction, and quadratic effects of the independent variables on the selected dependent variables. The optimized HPLC method was validated based on the ICH Q2(R1) guideline and was used to separate and quantify ambroxol hydrochloride and roxithromycin in tablet formulations. The findings showed that the DoE approach could be effectively applied to optimize a robust HPLC method for quantification of ambroxol hydrochloride and roxithromycin in tablet formulations. Statistical comparison between the results of the proposed and reported HPLC methods revealed no significant difference, indicating the suitability of the proposed HPLC method for analysis of ambroxol hydrochloride and roxithromycin in pharmaceutical formulations. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
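The Plackett-Burman screening step above has a compact standard construction; a sketch of the generic 8-run design, which accommodates the four factors studied here (the cyclic generator row is the textbook one, not taken from the paper):

```python
def plackett_burman_8():
    """8-run Plackett-Burman screening design (up to 7 two-level factors),
    built from the standard cyclic generator row plus an all-minus fold-over
    row. Columns are mutually orthogonal and balanced."""
    gen = [1, 1, 1, -1, 1, -1, -1]
    rows = [gen[-i:] + gen[:-i] for i in range(7)]  # right-cyclic shifts
    rows.append([-1] * 7)                           # fold-over row
    return rows

design = plackett_burman_8()
# Screening 4 factors: assign them to any 4 of the 7 orthogonal columns.
```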
Fu, Zhibiao; Baker, Daniel; Cheng, Aili; Leighton, Julie; Appelbaum, Edward; Aon, Juan
2016-05-01
The principle of quality by design (QbD) has been widely applied to biopharmaceutical manufacturing processes. Process characterization is an essential step in implementing the QbD concept, establishing the design space and defining the proven acceptable ranges (PARs) for critical process parameters (CPPs). In this study, we present characterization of a Saccharomyces cerevisiae fermentation process using risk assessment analysis, statistical design of experiments (DoE), and the multivariate Bayesian predictive approach. The critical quality attributes (CQAs) and CPPs were identified with a risk assessment. The statistical model for each attribute was established using the results from the DoE study, with consideration given to interactions between CPPs. Both the conventional overlapping contour plot and the multivariate Bayesian predictive approaches were used to establish the region of process operating conditions where all attributes met their specifications simultaneously. The quantitative Bayesian predictive approach was chosen to define the PARs for the CPPs, which apply to the manufacturing control strategy. Experience from the 10,000 L manufacturing scale process validation, including 64 continued process verification batches, indicates that the CPPs remain in a state of control and within the established PARs. The end product quality attributes were within their drug substance specifications. The probability generated with the Bayesian approach was also used as a tool to assess CPP deviations. This approach can be extended to characterize other production processes and to quantify a reliable operating region. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:799-812, 2016. © 2016 American Institute of Chemical Engineers.
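The Bayesian predictive step above amounts to estimating the joint probability that all attributes meet specification at a given operating point; a hedged Monte Carlo sketch (the predictive distributions and spec limits are invented for illustration, not values from the study):

```python
import random

def prob_all_in_spec(models, specs, n_draws=20000, seed=1):
    """Monte Carlo estimate of P(all quality attributes meet spec) at one
    operating point.

    models -- one (mean, sd) predictive distribution per attribute
    specs  -- one (low, high) specification interval per attribute
    """
    random.seed(seed)
    hits = 0
    for _ in range(n_draws):
        draws = [random.gauss(mu, sd) for mu, sd in models]
        if all(lo <= d <= hi for d, (lo, hi) in zip(draws, specs)):
            hits += 1
    return hits / n_draws

# Two hypothetical attributes, e.g. purity (%) and an impurity level:
p = prob_all_in_spec([(98.0, 1.0), (0.5, 0.2)], [(95.0, 105.0), (0.0, 1.0)])
```

Repeating this over a grid of operating points traces out the region where the joint probability stays above a chosen threshold, which is how a quantitative PAR can be read off.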
Nekkanti, Vijaykumar; Marwah, Ashwani; Pillai, Raviraj
2015-01-01
Design of experiments (DOE), a component of Quality by Design (QbD), is the systematic and simultaneous evaluation of process variables to develop a product with predetermined quality attributes. This article presents a case study of the effects of process variables in a bead milling process used for the manufacture of drug nanoparticles. Experiments were designed and results were computed according to a 3-factor, 3-level face-centered central composite design (CCD). The factors investigated were motor speed, pump speed and bead volume. The responses analyzed for evaluating these effects and interactions were milling time, particle size and process yield. Process validation batches were executed using the optimum process conditions obtained from the software Design-Expert® to evaluate both the repeatability and reproducibility of the bead milling technique. Milling time was optimized to <5 h to obtain the desired particle size (d90 < 400 nm). A desirability function was used to optimize the response variables, and the predicted responses were in agreement with the experimental values. These results demonstrated the reliability of the selected model for the manufacture of drug nanoparticles with predictable quality attributes. The optimization of bead milling process variables by applying DOE resulted in a considerable decrease in milling time to achieve the desired particle size. The study indicates the applicability of the DOE approach to optimizing critical process parameters in the manufacture of drug nanoparticles.
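A face-centered CCD like the one above has a standard coded layout (factorial corners, axial points on the cube faces at alpha = 1, plus centre runs); a minimal sketch in coded units only (the actual speed and volume levels are not reproduced here):

```python
from itertools import product

def face_centered_ccd(k=3, center_runs=1):
    """Face-centred central composite design in coded units (alpha = 1)."""
    cube = [list(p) for p in product((-1, 1), repeat=k)]  # 2^k corners
    axial = []
    for i in range(k):                                    # star points on faces
        for a in (-1, 1):
            row = [0] * k
            row[i] = a
            axial.append(row)
    return cube + axial + [[0] * k] * center_runs

design = face_centered_ccd(3)
# 8 corner + 6 axial + 1 centre = 15 runs for the three factors
# (motor speed, pump speed, bead volume).
```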
Lelieveld, Gert-Jan; Van Dijk, Eric; Van Beest, Ilja; Van Kleef, Gerben A
2013-10-01
On the basis of a social-functional approach to emotion, scholars have argued that expressing disappointment in negotiations communicates weakness, which may evoke exploitation. Yet, it is also argued that communicating disappointment serves as a call for help, which may elicit generous offers. Our goal was to resolve this apparent inconsistency. We develop the argument that communicating disappointment elicits generous offers when it evokes guilt in the target, but elicits low offers when it does not. In 4 experiments using both verbal (Experiments 1-3) and nonverbal (Experiment 4) emotion manipulations, we demonstrate that the interpersonal effects of disappointment depend on (a) the opponent's group membership and (b) the type of negotiation. When the expresser was an outgroup member and in representative negotiations (i.e., when disappointment did not evoke guilt), the weakness that disappointment communicated elicited lower offers. When the expresser was an ingroup member and in individual negotiations (i.e., when disappointment did evoke guilt), the weakness that disappointment communicated elicited generous offers from participants. Thus, in contrast to the common belief that weakness is a liability in negotiations, expressing disappointment can be effective under particular circumstances. We discuss implications for theorizing about the social functions of emotions.
Wojcik, Pawel Jerzy; Pereira, Luís; Martins, Rodrigo; Fortunato, Elvira
2014-01-13
An efficient mathematical strategy in the field of solution-processed electrochromic (EC) films is outlined as a combination of experimental work, modeling, and information extraction from massive computational data via statistical software. Design of Experiments (DOE) was used for statistical multivariate analysis and prediction of mixtures through a multiple regression model, as well as for the optimization of a five-component sol-gel precursor subjected to complex constraints. This approach significantly reduces the number of experiments to be realized, from 162 in the full factorial (L=3) and 72 in the extreme vertices (D=2) approach down to only 30 runs, while still maintaining a high accuracy of the analysis. By carrying out a finite number of experiments, the empirical modeling in this study shows reasonably good prediction ability in terms of the overall EC performance. An optimized ink formulation was employed in a prototype of a passive EC matrix fabricated to test this optically active material system together with a solid-state electrolyte for prospective application in EC displays. Coupling DOE with chromogenic material formulation shows the potential to maximize the capabilities of these systems and ensures increased productivity in many potential solution-processed electrochemical applications.
Feng, Yan Wen; Ooishi, Ayako; Honda, Shinya
2012-01-05
A simple systematic approach using Fourier transform infrared (FTIR) spectroscopy, size exclusion chromatography (SEC) and design of experiments (DOE) techniques was applied to the analysis of aggregation factors for protein formulations in stress and accelerated testing. FTIR and SEC were used to evaluate protein conformational and storage stabilities, respectively. DOE was used to determine the suitable formulation and to analyze both the main effects of single factors and the interaction effects of combined factors on aggregation. Our results indicated that (i) analysis at a low protein concentration is not always applicable to high-concentration formulations; (ii) investigating the interaction effects of combined factors as well as the main effects of single factors is effective for improving the conformational stability of proteins; (iii) with the exception of pH, stress-testing results regarding aggregation factors can inform a suitable formulation in place of time-consuming accelerated testing; (iv) a suitable pH condition should be determined not in stress testing but in accelerated testing, because of the inconsistent effects of pH on conformational and storage stabilities. In summary, we propose a three-step strategy, using FTIR, SEC and DOE techniques, to effectively analyze aggregation factors and perform rapid screening for suitable protein formulation conditions. Copyright © 2011 Elsevier B.V. All rights reserved.
Measurement of Ligand–Target Residence Times by 1H Relaxation Dispersion NMR Spectroscopy
2016-01-01
A ligand-observed 1H NMR relaxation experiment is introduced for measuring the binding kinetics of low-molecular-weight compounds to their biomolecular targets. We show that this approach, which does not require any isotope labeling, is applicable to ligand–target systems involving proteins and nucleic acids of variable molecular size. The experiment is particularly useful for the systematic investigation of low affinity molecules with residence times in the micro- to millisecond time regime. PMID:27933946
ALFA: The new ALICE-FAIR software framework
NASA Astrophysics Data System (ADS)
Al-Turany, M.; Buncic, P.; Hristov, P.; Kollegger, T.; Kouzinopoulos, C.; Lebedev, A.; Lindenstruth, V.; Manafov, A.; Richter, M.; Rybalchenko, A.; Vande Vyvre, P.; Winckler, N.
2015-12-01
The commonalities between the ALICE and FAIR experiments and their computing requirements led to the development of large parts of a common software framework in an experiment-independent way. The FairRoot project has already shown the feasibility of such an approach for the FAIR experiments and of extending it beyond FAIR to experiments at other facilities [1, 2]. The ALFA framework is a joint development between the ALICE Online-Offline (O2) and FairRoot teams. ALFA is designed as a flexible, elastic system, which balances reliability and ease of development with performance using multi-processing and multi-threading. A message-based approach has been adopted; such an approach will support the use of the software on different hardware platforms, including heterogeneous systems. Each process in ALFA assumes limited communication with and reliance on other processes. Such a design adds horizontal scaling (multiple processes) to the vertical scaling provided by multiple threads to meet computing and throughput demands. ALFA does not dictate any application protocols: potentially, any content-based processor or any source can change the application protocol. The framework supports different serialization standards for data exchange between different hardware and software languages.
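ALFA itself is built on dedicated inter-process message-queue transports; the following stdlib sketch (threads standing in for processes, a toy transform standing in for real processing) only illustrates the share-nothing, message-passing pattern described above:

```python
from queue import Queue
from threading import Thread

def source(out_q, n):
    """Source device: emits raw messages, then an end-of-stream marker."""
    for i in range(n):
        out_q.put(i)
    out_q.put(None)

def processor(in_q, out_q):
    """Processing device: owns no shared state; it only consumes messages,
    transforms them, and forwards the results."""
    while (msg := in_q.get()) is not None:
        out_q.put(msg * 2)
    out_q.put(None)

# Wire two devices into a pipeline; more processors could be added for
# horizontal scaling without touching either endpoint.
q1, q2 = Queue(), Queue()
Thread(target=source, args=(q1, 5)).start()
Thread(target=processor, args=(q1, q2)).start()

results = []
while (msg := q2.get()) is not None:
    results.append(msg)
# results == [0, 2, 4, 6, 8]
```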
Laboratory animals and the art of empathy
Thomas, D
2005-01-01
Consistency is the hallmark of a coherent ethical philosophy. When considering the morality of particular behaviour, one should look to identify comparable situations and test one's approach to the former against one's approach to the latter. The obvious comparator for animal experiments is non-consensual experiments on people. In both cases, suffering and perhaps death are knowingly caused to the victim, the intended beneficiary is someone else, and the victim does not consent. Animals suffer just as people do. As we condemn non-consensual experiments on people, we should, if we are to be consistent, condemn non-consensual experiments on animals. The alleged differences between the two practices often put forward do not stand up to scrutiny. The best guide to ethical behaviour is empathy—putting oneself in the potential victim's shoes. Again to be consistent, we should empathise with all who may be adversely affected by our behaviour. By this yardstick, too, animal experiments fail the ethical test. PMID:15800357
Anderson-Cook, Christine M.; Burke, Sarah E.
2016-10-18
First, we would like to commend Dr. Woods on his thought-provoking paper and insightful presentation at the 4th Annual Stu Hunter conference. We think that the material presented highlights some important needs in the area of design of experiments for generalized linear models (GLMs). In addition, we agree with Dr. Woods that design of experiments for GLMs does implicitly require expert judgement about model parameters, and hence using a Bayesian approach to capture this knowledge is a natural strategy to summarize what is known, with the opportunity to incorporate associated uncertainty about that information.
A sensorimotor account of vision and visual consciousness.
O'Regan, J K; Noë, A
2001-10-01
Many current neurophysiological, psychophysical, and psychological approaches to vision rest on the idea that when we see, the brain produces an internal representation of the world. The activation of this internal representation is assumed to give rise to the experience of seeing. The problem with this kind of approach is that it leaves unexplained how the existence of such a detailed internal representation might produce visual consciousness. An alternative proposal is made here. We propose that seeing is a way of acting. It is a particular way of exploring the environment. Activity in internal representations does not generate the experience of seeing. The outside world serves as its own, external, representation. The experience of seeing occurs when the organism masters what we call the governing laws of sensorimotor contingency. The advantage of this approach is that it provides a natural and principled way of accounting for visual consciousness, and for the differences in the perceived quality of sensory experience in the different sensory modalities. Several lines of empirical evidence are brought forward in support of the theory, in particular: evidence from experiments in sensorimotor adaptation, visual "filling in," visual stability despite eye movements, change blindness, sensory substitution, and color perception.
ERIC Educational Resources Information Center
Stritikus, Tom; Nguyen, Diem
2010-01-01
Drawing on data from a 3-year qualitative study on the transition of immigrant youth, we seek to understand the connections among district programming and policy, teacher perspectives, and the ways in which students experience initial schooling. Specifically, this article examines the following research questions: How does district leadership…
A Typological Approach to the Relationship Between the Motivation to Work and Job Satisfaction.
ERIC Educational Resources Information Center
Landy, Frank J.
Does knowledge of motivational type increase the efficiency of predicting job satisfaction patterns? Do changes in motivational type occur as a function of job experience? Answers to the above questions were sought in a study of 175 professional engineers employed in six organizations. Data were collected by the author with a satisfaction…
Where Does It Come From? Developmental Aspects of Art Appreciation
ERIC Educational Resources Information Center
Schabmann, Alfred; Gerger, Gernot; Schmidt, Barbara M.; Wögerer, Eva; Osipov, Igor; Leder, Helmut
2016-01-01
Art is a unique feature of human experience. It involves a complex interplay among stimuli, persons and contexts. Little is known about how the various features deemed important in art appreciation depend on development, and whether they are already present at a young age. As in our previous approach with adults of differing levels of expertise, the…
ERIC Educational Resources Information Center
Owen, Hazel; Dunham, Nicola
2015-01-01
E-learning experiences are widely becoming common practice in many schools, tertiary institutions and other organisations. However, despite this increased use of technology to enhance learning, and the associated investment involved, the result does not always equate to more engaged, knowledgeable and skilled learners. We have observed two key…
ERIC Educational Resources Information Center
Anunson, Paige N.; Winkler, Gates R.; Winkler, Jay R.; Parkinson, Bruce A.; Christus, Jennifer D. Schuttlefield
2013-01-01
In general, laboratory experiments focus on traditional chemical disciplines. While this approach allows students to learn and explore fundamental concepts in a specific area, it does not always encourage students to explore interdisciplinary science. Often little transfer of knowledge from one area to another is observed, as students…
ERIC Educational Resources Information Center
Choo, Dawn; Dettman, Shani J.
2016-01-01
During the pre- and post-implant habilitation process, mothers of children using cochlear implants may be coached by clinicians to use appropriate communicative strategies during play according to the family's choice of communication approach. The present study compared observations made by experienced and inexperienced individuals in the analysis…
The Hundred Languages of Children: The Reggio Emilia Experience in Transformation. Third Edition
ERIC Educational Resources Information Center
Edwards, Carolyn, Ed.; Gandini, Lella, Ed.; Forman, George, Ed.
2011-01-01
Why does the city of Reggio Emilia in northern Italy feature one of the best public systems of early education in the world? This book documents the comprehensive and innovative approach that utilizes the "hundred languages of children" to support their well-being and foster their intellectual development. Reggio Emilia is a fast-growing…
A Teacher's Approach: Integrating Technology Appropriately into a First Grade Classroom
ERIC Educational Resources Information Center
Phalen, Loretta Jean
2004-01-01
How are first grade classrooms using technology? How are children using technology at home? Does the use of technology really improve academic achievement? An experiment was conducted to determine the effectiveness of using technology to teach a unit in Social Studies to first grade students. The study occurred in a Christian school in Lancaster,…
Art, science, and immersion: data-driven experiences
NASA Astrophysics Data System (ADS)
West, Ruth G.; Monroe, Laura; Ford Morie, Jacquelyn; Aguilera, Julieta
2013-03-01
This panel and dialog-paper explores the potentials at the intersection of art, science, immersion and highly dimensional, "big" data to create new forms of engagement, insight and cultural forms. We will address questions such as: "What kinds of research questions can be identified at the intersection of art + science + immersive environments that can't be expressed otherwise?" "How is art+science+immersion distinct from state-of-the art visualization?" "What does working with immersive environments and visualization offer that other approaches don't or can't?" "Where does immersion fall short?" We will also explore current trends in the application of immersion for gaming, scientific data, entertainment, simulation, social media and other new forms of big data. We ask what expressive, arts-based approaches can contribute to these forms in the broad cultural landscape of immersive technologies.
[Psychosomatics is "expensive"].
Hnízdil, J; Savlík, J
2005-01-01
Experience shows that the number of diseases whose recognition and treatment lie beyond the limits of classical medicine, yet fall fully within the competence of the psychosomatic approach, is rising. The issue is not the classical division into organic and functional defects, but the possibility of a complex approach. The theorem of "diagnosis per exclusionem" remains valid, as does the observation that the means of medicine end at its biological limitations. We stress in this article that psychosomatic diseases, or psychosomatic patients, do not exist as a separate category, and that psychosomatics is not an independent specialization. Psychosomatics, as the inseparable unity of psychic and somatic activities, is each human being. The complex biopsychosocial (psychosomatic) approach is a way of thinking and working that considers the human being in the unrepeatable oneness and context of his life. This does not mean underestimating objective biological findings and the results of instrumental investigations, but rather embedding them in the complex network of circumstances of the patient's life in order to choose the most appropriate methods of care for the individual in health and disease.
Verification of Space Station Secondary Power System Stability Using Design of Experiment
NASA Technical Reports Server (NTRS)
Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce
1998-01-01
This paper describes analytical methods used in the verification of large DC power systems, with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative resistor characteristics. The ISS power system presents numerous challenges with respect to system stability, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large, complex distributed power systems is not practical due to the size and complexity of the system. Computer modeling has been used extensively to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for the verification of these large systems. DoE reduces the number of computer runs necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system, as well as about various operating scenarios and the identification of those with potential for instability. In this paper we describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given. Examples of the application of DoE to the analysis and verification of the ISS power system are provided.
Culture, personal experience and agency.
McCarthy, John; Sullivan, Paul; Wright, Peter
2006-06-01
In this article, we explore what we perceive to be a gap between agency as articulated in practice theories and agency as personally experienced. The gap is not created by a turn to practice in theorizing, but by the tendency to produce theoretical representations that silence the particularity of experience and the diversity of voices in experience. In exploring the gap, we identify aspects of practice theories that explicitly commit to theoretical representation over personal experience and describe Bakhtin's commitment to action and personal experience as an alternative. In order to exemplify Bakhtin's approach in practice, we then present an analysis of one artist-teacher's experience of her own agency in making art and in teaching. Finally, we comment on what a commitment to representational theorizing does to accounts of an artist's activities and personal experience.
Zheng, Hong; Clausen, Morten Rahr; Dalsgaard, Trine Kastrup; Mortensen, Grith; Bertram, Hanne Christine
2013-08-06
We describe a time-saving protocol for the processing of LC-MS-based metabolomics data by optimizing parameter settings in XCMS and threshold settings for removing noisy and low-intensity peaks, using design of experiments (DoE) approaches including Plackett-Burman design (PBD) for screening and central composite design (CCD) for optimization. A reliability index, based on evaluation of the linear response to a dilution series, was used as a parameter for the assessment of data quality. After identifying the significant parameters in the XCMS software by PBD, CCD was applied to determine their values by maximizing the reliability and group indexes. Optimal settings by DoE resulted in improvements of 19.4% and 54.7% in the reliability index for a standard mixture and human urine, respectively, as compared with the default settings, and a total of 38 h was required to complete the optimization. Moreover, threshold settings were optimized using CCD for further improvement. The approach combining optimal parameter settings and the threshold method improved the reliability index about 9.5 times for the standard mixture and 14.5 times for the human urine data, requiring a total of 41 h. Validation results also showed improvements in the reliability index of about 5-7 times, even for urine samples from different subjects. It is concluded that the proposed methodology can be used as a time-saving approach for improving the processing of LC-MS-based metabolomics data.
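The screening stage described in this abstract relies on a Plackett-Burman design, which estimates many main effects from very few runs. A minimal sketch of the idea in Python (the 12-run design and effect calculation are standard; the response values and the single active factor are hypothetical, not the XCMS parameters from the study):

```python
import numpy as np

def plackett_burman_12():
    """The classic 12-run Plackett-Burman design for up to 11 factors:
    11 cyclic rotations of the standard generator row plus an all-low row."""
    gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])
    rows = [np.roll(gen, i) for i in range(11)]
    rows.append(-np.ones(11, dtype=int))
    return np.array(rows)

def main_effects(design, y):
    """Main effect of factor j: mean response at +1 minus mean at -1."""
    return np.array([y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
                     for j in range(design.shape[1])])

design = plackett_burman_12()
# Orthogonality: distinct columns are uncorrelated, so effects separate.
assert np.allclose(design.T @ design, 12 * np.eye(11))

# Hypothetical screening responses: only factor 0 truly matters.
rng = np.random.default_rng(0)
y = 5.0 * design[:, 0] + rng.normal(0.0, 0.1, 12)
print(main_effects(design, y).round(2))  # factor 0's effect is ~10, the rest ~0
```

Each factor's effect is the difference between the mean response at its high and low levels; orthogonality of the columns keeps these estimates independent, which is what lets 11 factors be screened in only 12 runs.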
A Parametric Geometry Computational Fluid Dynamics (CFD) Study Utilizing Design of Experiments (DOE)
NASA Technical Reports Server (NTRS)
Rhew, Ray D.; Parker, Peter A.
2007-01-01
Design of Experiments (DOE) techniques were applied to the Launch Abort System (LAS) of the NASA Crew Exploration Vehicle (CEV) parametric geometry Computational Fluid Dynamics (CFD) study to efficiently identify and rank the primary contributors to the integrated drag over the vehicle's ascent trajectory. Typical approaches to these types of activities involve developing all possible combinations of geometries by changing one variable at a time, analyzing them with CFD, and predicting the main effects on an aerodynamic parameter, which in this application is integrated drag. The original plan for the LAS study team was to generate and analyze more than 1000 geometry configurations to study 7 geometric parameters. By utilizing DOE techniques, the number of geometries was strategically reduced to 84. In addition, critical information on interaction effects among the geometric factors was identified that would not have been possible with the traditional technique. Therefore, the study was performed in less time and provided more information on the geometric main effects and interactions impacting drag generated by the LAS. This paper discusses the methods utilized to develop the experimental design, execution, and data analysis.
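The run-count reduction described above is characteristic of fractional factorial designs. As an illustration (a standard textbook fraction, not the study's actual 84-run plan), a 2^(7-3) resolution-IV design screens 7 factors in 16 runs instead of the 128 runs of a full factorial while keeping all main-effect columns orthogonal:

```python
import itertools
import numpy as np

# Base 2^4 full factorial in factors A, B, C, D (16 runs).
base = np.array(list(itertools.product([-1, 1], repeat=4)))
A, B, C, D = base.T

# Generators for the three extra factors (one standard resolution-IV
# choice; not necessarily the confounding pattern used in the LAS study).
E, F, G = A * B * C, B * C * D, A * C * D
design = np.column_stack([A, B, C, D, E, F, G])

print(design.shape)  # (16, 7): 7 factors in 16 runs instead of 2^7 = 128
# All main-effect columns remain mutually orthogonal.
print(np.allclose(design.T @ design, 16 * np.eye(7)))  # True
```

The price of the reduction is confounding: with a resolution-IV design, main effects are clear of two-factor interactions, but some two-factor interactions are aliased with each other, which is why generator choice matters.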
NASA Technical Reports Server (NTRS)
Wallace, Terryl A.; Bey, Kim S.; Taminger, Karen M. B.; Hafley, Robert A.
2004-01-01
A study was conducted to evaluate the relative significance of input parameters on Ti-6Al-4V deposits produced by an electron beam free form fabrication process under development at the NASA Langley Research Center. Five input parameters were chosen (beam voltage, beam current, translation speed, wire feed rate, and beam focus), and a design of experiments (DOE) approach was used to develop a set of 16 experiments to evaluate the relative importance of these parameters on the resulting deposits. Both single-bead and multi-bead stacks were fabricated using the 16 combinations, and the resulting heights and widths of the stack deposits were measured. The resulting microstructures were also characterized to determine the impact of these parameters on the size of the melt pool and heat-affected zone. The relative importance of each input parameter on the height and width of the multi-bead stacks is discussed.
An Easy Way to Show Memory Color Effects.
Witzel, Christoph
2016-01-01
This study proposes and evaluates a simple stimulus display that allows one to measure memory color effects (the effect of object knowledge and memory on color perception). The proposed approach is fast and easy and does not require running an extensive experiment. It shows that memory color effects are robust to minor variations due to a lack of color calibration.
Lessons from an Experiential Learning Process: The Case of Cowpea Farmer Field Schools in Ghana
ERIC Educational Resources Information Center
Nederlof, E. Suzanne; Odonkor, Ezekiehl N.
2006-01-01
The Farmer Field School (FFS) is a form of adult education using experiential learning methods, aimed at building farmers' decision-making capacity and expertise. The National Research Institute in West Africa conducted FFS in cowpea cultivation and we use this experience to analyse the implementation of the FFS approach. How does it work in…
Does Student Philanthropy Work? A Study of Long-Term Effects of the "Learning by Giving" Approach
ERIC Educational Resources Information Center
Olberding, Julie Cencula
2012-01-01
Student philanthropy is a teaching strategy designed to engage students actively in the curriculum, increase awareness of social needs and nonprofit organizations, and teach grant-writing and grant-making skills. This is the first study to examine long-term effects of student philanthropy by surveying alumni years after their experience with this…
ERIC Educational Resources Information Center
Parr, Brian A.; Edwards, M. Craig; Leising, James G.
2008-01-01
The purpose of this study was to empirically test the hypothesis that students who participated in a contextualized, mathematics-enhanced high school agricultural power and technology curriculum and aligned instructional approach would not experience significant diminishment in acquisition of technical skills related to agricultural power and…
To Flip or Not to Flip? Analysis of a Flipped Classroom Pedagogy in a General Biology Course
ERIC Educational Resources Information Center
Heyborne, William H.; Perrett, Jamis J.
2016-01-01
In an attempt to better understand the flipped technique and evaluate its purported superiority in terms of student learning gains, the authors conducted an experiment comparing a flipped classroom to a traditional lecture classroom. Although the outcomes were mixed, regarding the superiority of either pedagogical approach, there does seem to be a…
Three Ways of Looking at a Blackboard: A "Trivial" Approach to Writing and Speaking
ERIC Educational Resources Information Center
McDonald, Hal
2006-01-01
The author writes that his experience in teaching has taught him that the perfect text simply does not exist, however the closest approximation to perfection lies in the direction of the classical world. Hal McDonald says that he cannot see how one can teach rhetoric without passing through pedagogical territory first cleared by Aristotle,…
Tuning Parameters in Heuristics by Using Design of Experiments Methods
NASA Technical Reports Server (NTRS)
Arin, Arif; Rabadi, Ghaith; Unal, Resit
2010-01-01
With the growing complexity of today's large-scale problems, it has become more difficult to find optimal solutions by using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, most heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then turns into finding the best parameter setting for the heuristics to solve the problems efficiently and in a timely manner. The One-Factor-At-a-Time (OFAT) approach to parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can instead be employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem, in which n jobs must be scheduled on a single machine without preemption and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine-tune the GA parameters in the most efficient way, we compare multiple DOE models including 2-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design, and signal-to-noise (S/N) ratios. In each DOE method, a mathematical model is created using regression analysis and solved to obtain the best parameter setting. After verification runs using the tuned parameter setting, preliminary optimal solutions for multiple instances were found efficiently.
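As a sketch of the simplest of the designs compared above, the code below builds a 2-level full factorial for three hypothetical GA parameters, fits a first-order regression model, and reads the best coded setting off the coefficient signs. The parameter names, level ranges, and response values are invented for the illustration, not taken from the paper:

```python
import itertools
import numpy as np

# 2^3 full factorial over three GA parameters, coded to -1/+1.
# Parameter names and level ranges are hypothetical.
levels = {"pop_size": (50, 200), "crossover": (0.6, 0.9), "mutation": (0.01, 0.1)}
X = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

# Hypothetical total weighted tardiness for the 8 runs (lower is better).
y = np.array([310, 305, 280, 270, 250, 240, 215, 205], dtype=float)

# First-order regression model: y = b0 + b1*x1 + b2*x2 + b3*x3.
Xd = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
print(beta.round(2))  # intercept, then one coefficient per parameter

# To minimize y, push each factor opposite the sign of its coefficient.
best_coded = -np.sign(beta[1:])
print(dict(zip(levels, best_coded)))  # here: all factors at their high (+1) levels
```

For a minimization objective such as total weighted tardiness, larger-magnitude coefficients flag the parameters most worth tuning; a full factorial also allows interaction terms (x1*x2, etc.) to be added to the model, which is precisely what OFAT tuning cannot estimate.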
(The feeling of) meaning-as-information.
Heintzelman, Samantha J; King, Laura A
2014-05-01
The desire for meaning is recognized as a central human motive. Yet, knowing that people want meaning does not explain its function. What adaptive problem does this experience solve? Drawing on the feelings-as-information hypothesis, we propose that the feeling of meaning provides information about the presence of reliable patterns and coherence in the environment, information that is not provided by affect. We review research demonstrating that manipulations of stimulus coherence influence subjective reports of meaning in life but not affect. We demonstrate that manipulations that foster an associative mindset enhance meaning. The meaning-as-information perspective embeds meaning in a network of foundational functions including associative learning, perception, cognition, and neural processing. This approach challenges assumptions about meaning, including its motivational appeal, the roles of expectancies and novelty in this experience, and the notion that meaning is inherently constructed. Implications for constructed meaning and existential meanings are discussed.
Escuder-Gilabert, L; Martín-Biosca, Y; Sagrado, S; Medina-Hernández, M J
2014-10-10
The design of experiments (DOE) is a good option for rationally limiting the number of experiments required to achieve the enantioresolution (Rs) of a chiral compound in capillary electrophoresis. In some cases, the modeled Rs after DOE analysis can be unsatisfactory, perhaps because the range of the explored factors (the DOE domain) was not adequate. In such cases, anticipative strategies can be an alternative to repeating the process (e.g. with a new DOE), saving time and money. In this work, multiple linear regression (MLR)-steepest ascent and a new anticipative strategy based on a multiple response-partial least squares model (called PLS2-prediction) are examined as post-DOE strategies to anticipate new experimental conditions providing satisfactory Rs values. The new anticipative strategy allows the analysis time (At) and uncertainty limits to be included in the decision-making process. To demonstrate their efficiency, the chiral separation of hexaconazole and penconazole, as model compounds, is studied using highly sulfated-β-cyclodextrin (HS-β-CD) in electrokinetic chromatography (EKC). A Box-Behnken DOE for three factors (background electrolyte pH, separation temperature and HS-β-CD concentration) and two responses (Rs and At) is used. Using commercially available software, the whole modeling and anticipative process is automatic, simple and requires minimal skills from the researcher. Both strategies studied have proven to successfully anticipate Rs values close to the experimental ones for EKC conditions outside the DOE domain for the two model compounds. The results of this work suggest that the PLS2-prediction approach could be the strategy of choice for obtaining reliable anticipations in EKC. Copyright © 2014 Elsevier B.V. All rights reserved.
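The MLR-steepest ascent strategy mentioned above fits a first-order model to the coded DOE factors and then steps along the fitted gradient to propose conditions beyond the explored domain. A minimal sketch (the layout is the standard three-factor Box-Behnken design, but the Rs values and step size are hypothetical, not the paper's data):

```python
import numpy as np

# Coded three-factor Box-Behnken design: 12 edge midpoints + 1 center run.
X = np.array([[-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
              [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
              [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
              [0, 0, 0]], dtype=float)
# Hypothetical enantioresolution (Rs) values for the 13 runs.
Rs = np.array([0.8, 1.4, 0.7, 1.3, 0.9, 1.5, 1.0, 1.6,
               0.9, 0.8, 1.1, 1.0, 1.1])

# First-order MLR fit: Rs = b0 + b1*x1 + b2*x2 + b3*x3.
Xd = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(Xd, Rs, rcond=None)

# Steepest-ascent direction: unit vector along the fitted gradient.
direction = beta[1:] / np.linalg.norm(beta[1:])

# Propose three equally spaced points along the path, moving past the
# coded [-1, 1] boundary of the original DOE domain.
path = np.arange(1, 4)[:, None] * 0.5 * direction
print(beta.round(3))
print(path.round(2))
```

Each proposed point would then be run experimentally; the ascent stops once Rs no longer improves, at which point a new local design can be centered there. This is the classical response-surface move that the PLS2-prediction strategy in the abstract is positioned against.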
Does the continuum theory of dynamic fracture work?
NASA Astrophysics Data System (ADS)
Kessler, David A.; Levine, Herbert
2003-09-01
We investigate the validity of the linear elastic fracture mechanics approach to dynamic fracture. We first test the predictions in a lattice simulation, using a formula of Eshelby for the time-dependent stress intensity factor. Excellent agreement with the theory is found. We then use the same method to analyze the experiment of Sharon and Fineberg. The data here are not consistent with the theoretical expectation.
Developing Language and Writing Skills of Deaf and Hard of Hearing Students: A Simultaneous Approach
ERIC Educational Resources Information Center
Dostal, Hannah M.; Wolbers, Kimberly A.
2014-01-01
In school, deaf and hard of hearing students (d/hh) are often exposed to American Sign Language (ASL) while also developing literacy skills in English. ASL does not have a written form, but is a fully accessible language to the d/hh through which it is possible to mediate understanding, draw on prior experiences, and engage critical thinking and…
Exterior LED Lighting Projects at Princeton University
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, Robert G.; Evans, William; Murphy, Arthur T.
For this report, Pacific Northwest National Laboratory (PNNL) and the U.S. Department of Energy (DOE) studied a series of past exterior lighting projects at Princeton in order to document Princeton's experiences with solid-state lighting (SSL) and the lessons learned along the way, and to show how its approach to SSL projects evolved as its own learning expanded and as the available products improved in performance and sophistication.
A new approach to electrophoresis in space
NASA Technical Reports Server (NTRS)
Snyder, Robert S.; Rhodes, Percy H.
1990-01-01
Previous electrophoresis experiments performed in space are reviewed. Sufficient data are available from the results of these experiments to show that they were designed with incomplete knowledge of the fluid dynamics of the process, including electrohydrodynamics. Redesigning laboratory chambers and operating procedures developed on Earth for space, without understanding both the advantages and disadvantages of the microgravity environment, has yielded poor separations of both cells and proteins. However, electrophoresis is still an important separation tool in the laboratory, and thermal convection does limit its performance. Thus, there is justification for electrophoresis in space, but the emphasis of future space experiments must be directed toward basic research, with model experiments to understand the microgravity environment and fluid analyses to test the basic principles of the process.
Limited access atrial septal defect closure and the evolution of minimally invasive surgery.
Izzat, M B; Yim, A P; El-Zufari, M H
1998-04-01
While minimizing "invasiveness" in general surgery has been equated with minimizing "access", what constitutes minimally invasive intra-cardiac surgery remains controversial. Many surgeons doubt the benefits of minimizing access when the need for cardiopulmonary bypass cannot be waived. Recognizing that median sternotomy itself does entail significant morbidity, we investigated the value of alternative approaches to median sternotomy, using atrial septal defect closure as our investigative model. We believe that some, but not all, minimal access approaches are associated with reduced postoperative morbidity and enhanced recovery. Our current strategy is to use a mini-sternotomy approach in adult patients, whereas conventional median sternotomy remains our standard approach in the pediatric population. Considerable clinical experience coupled with documented clinical benefit is fundamental before a given approach is adopted in routine practice.
Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods
NASA Technical Reports Server (NTRS)
Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan
2016-01-01
The goal of this work is, through computational simulations, to provide statistically based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-optimal class of DOE is used to plan optimally efficient single- and multi-factor tests. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
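The D-optimal planning step can be sketched with a simple greedy stand-in: pick, one run at a time, the candidate that most increases det(X'X) for the assumed model (here a one-factor quadratic). This is only an illustration of the criterion; a production D-optimal planner would use a proper exchange algorithm such as Fedorov's:

```python
import numpy as np

def model_matrix(x):
    """Quadratic model in one coded factor: columns 1, x, x^2."""
    return np.column_stack([np.ones_like(x), x, x**2])

def greedy_d_optimal(candidates, n_runs):
    """Greedily add the run that maximizes det(X'X) at each step.

    A small ridge term keeps the determinant defined before the
    design can support the full model.
    """
    chosen = []
    for _ in range(n_runs):
        best, best_det = None, -1.0
        for c in candidates:
            X = model_matrix(np.array(chosen + [c]))
            d = np.linalg.det(X.T @ X + 1e-9 * np.eye(3))
            if d > best_det:
                best, best_det = c, d
        chosen.append(best)
    return np.array(chosen)

candidates = np.linspace(-1.0, 1.0, 21)
design = greedy_d_optimal(candidates, 6)
# For a quadratic model on [-1, 1], runs concentrate at -1, 0, and +1.
print(np.sort(design))
```

Maximizing det(X'X) minimizes the volume of the confidence region for the model coefficients, which is why D-optimal plans spend runs where the assumed model is most informative rather than replicating a handful of arbitrary conditions.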
Dialogical argumentation in elementary science classrooms
NASA Astrophysics Data System (ADS)
Kim, Mijung; Roth, Wolff-Michael
2018-02-01
To understand students' argumentation abilities, prior work has focused on counting and analyzing argumentation schemes such as claim, evidence, warrant, backing, and rebuttal. This analytic approach does not address the dynamics of the epistemic criteria of children's reasoning and decision-making in dialogical situations. The common approach also does not address the practice of argumentation in the lower elementary grades (K-3), because these children have not yet mastered the structure of argumentation and are therefore considered not ready for processing argumentative discourse. There is thus little research focusing on lower elementary school students' argumentation in school science. This study, drawing on the societal-historical approach of L. S. Vygotsky, explored children's argumentation as social relations by investigating the genesis of evidence-related practices (especially burden of proof) in second- and third-grade children. The findings show (a) that students' capacity for connecting claim and evidence and responding to the burden of proof and critical moves varies, and (b) that teachers play a significant role in emphasizing the importance of evidence but experience difficulties dislodging children's favored ideas during the turn-taking of argumentative dialogue. The findings on the nature of dialogical reasoning and the teacher's role provide further insight into discussions of pedagogical approaches to children's reasoning and argumentation.
A network approach for distinguishing ethical issues in research and development.
Zwart, Sjoerd D; van de Poel, Ibo; van Mil, Harald; Brumsen, Michiel
2006-10-01
In this paper we report on our experiences with using network analysis to discern and analyse ethical issues in research into, and the development of, a new wastewater treatment technology. Using network analysis, we preliminarily interpreted some of our observations in a Group Decision Room (GDR) session where we invited important stakeholders to think about the risks of this new technology. We show how a network approach is useful for understanding the observations, and suggests some relevant ethical issues. We argue that a network approach is also useful for ethical analysis of issues in other fields of research and development. The abandoning of the overarching rationality assumption, which is central to network approaches, does not have to lead to ethical relativism.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Training programs at DOE facilities should prepare personnel to safely and efficiently operate and maintain the facilities in accordance with DOE requirements. This guide presents good practices for a systematic approach to on-the-job training (OJT) and OJT programs and should be used in conjunction with DOE Training Program Handbook: A Systematic Approach to Training, and with the DOE Handbook entitled Alternative Systematic Approaches to Training to develop performance-based OJT programs. DOE contractors may also use this guide to modify existing OJT programs that do not meet the systematic approach to training (SAT) objectives.
Dong, Jia; Lübberstedt, Marc; Urbaniak, Thomas; Nüssler, Andreas K.N.; Knobeloch, Daniel; Gerlach, Jörg C.; Zeilinger, Katrin
2008-01-01
Optimization of cell culture media based on statistical experimental design methodology is a widely used approach for improving cultivation conditions. We applied this methodology to refine the composition of an established culture medium for growth of a human hepatoma cell line, C3A. A selection of growth factors and nutrient supplements were systematically screened according to standard design of experiments (DoE) procedures. The results of the screening indicated that the medium additives hepatocyte growth factor, oncostatin M, and fibroblast growth factor 4 significantly influenced the metabolic activities of the C3A cell line. Surface response methodology revealed that the optimum levels for these factors were 30 ng/ml for hepatocyte growth factor and 35 ng/ml for oncostatin M. Additional experiments on primary human hepatocyte cultures showed high variance in metabolic activities between cells from different individuals, making determination of optimal levels of factors more difficult. Still, it was possible to conclude that hepatocyte growth factor, epidermal growth factor, and oncostatin M had decisive effects on the metabolic functions of primary human hepatocytes. PMID:19003182
NASA Astrophysics Data System (ADS)
Jin, Q.; Zheng, Z.; Zhu, C.
2006-12-01
Microorganisms in nature conserve energy by catalyzing various geochemical reactions. To build a quantitative relationship between geochemical conditions and metabolic rates, we propose a coupled bioenergetics-kinetics modeling approach. This approach describes the microbial community as a metabolic network, i.e., fermenting microbes degrade organic substrates while aerobic respirers, nitrate reducers, metal reducers, sulfate reducers, and methanogens consume the fermentation products. It quantifies the control of substrate availability and biological energy conservation on the metabolic rates using thermodynamically consistent rate laws. We applied this simulation approach to study the progress of microbial metabolism during a field biostimulation experiment conducted in Oak Ridge, Tennessee. In the experiment, ethanol was injected into a monitoring well and groundwater was sampled to monitor changes in the chemistry. With time, concentrations of ethanol and SO42- decreased while those of NH4+, Fe2+, and Mn2+ increased. The simulation results fit the observations well, indicating simultaneous ethanol degradation and terminal electron accepting processes. The rates of aerobic respiration and denitrification were mainly controlled by substrate concentrations, while those of ethanol degradation, sulfate reduction, and methanogenesis were controlled dominantly by energy availability. The simulation results suggested two different microbial growth statuses in the subsurface. For the functional groups with significant growth, variations with time in substrate concentrations demonstrated a typical S curve. For the groups without significant growth, initial decreases in substrate concentrations were linear with time. Injecting substrates followed by monitoring environmental chemistry therefore provides a convenient approach to characterizing microbial growth in the subsurface, where methods for direct observation are currently unavailable.
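The thermodynamically consistent rate laws mentioned above couple a Monod-type kinetic term to a driving-force term that vanishes as the reaction approaches the energy needed for ATP synthesis. A sketch of this general form (the function shape is the commonly used kinetic-times-thermodynamic factorization; all parameter values are illustrative, not those calibrated for the Oak Ridge experiment):

```python
import math

R = 8.314e-3  # gas constant, kJ/(mol*K)

def rate(k, biomass, S, Ks, dG_redox, m_dG_atp, chi, T=298.15):
    """Rate = kinetic term x thermodynamic term.

    F_K = S/(Ks + S) is Monod-type substrate limitation; F_T discounts
    the rate by the energy left after ATP-synthesis costs, dropping to
    zero when the reaction no longer yields usable energy.
    """
    F_K = S / (Ks + S)
    F_T = max(0.0, 1.0 - math.exp((dG_redox + m_dG_atp) / (chi * R * T)))
    return k * biomass * F_K * F_T

common = dict(k=1.0, biomass=0.1, S=1.0, Ks=0.1, chi=2.0, m_dG_atp=45.0)
# Far from equilibrium, the kinetic term dominates...
print(rate(dG_redox=-120.0, **common))
# ...near equilibrium, the thermodynamic factor throttles the rate...
print(rate(dG_redox=-46.0, **common))
# ...and the reaction stops once it cannot cover the ATP cost.
print(rate(dG_redox=-40.0, **common))
```

This factorization reproduces the two regimes the abstract describes: substrate-controlled processes (F_K limiting, F_T near 1) and energy-controlled processes such as sulfate reduction and methanogenesis (F_T well below 1).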
This research was funded by the NABIR program, DOE, under grant No. DE-FG02-04ER63740 to CZ. We thank J. Istok, David Watson, and Philip Jardine for their help. The views and opinions of authors expressed herein do not necessarily state or reflect those of the DOE.
Inkpen, S Andrew
2016-06-01
Experimental ecologists often invoke trade-offs to describe the constraints they encounter when choosing between alternative experimental designs, such as between laboratory, field, and natural experiments. In making these claims, they tend to rely on Richard Levins' analysis of trade-offs in theoretical model-building. But does Levins' framework apply to experiments? In this paper, I focus this question on one desideratum widely invoked in the modelling literature: generality. Using the case of generality, I assess whether Levins-style treatments of modelling provide workable resources for assessing trade-offs in experimental design. I argue that, of four strategies modellers employ to increase generality, only one may be unproblematically applied to experimental design. Furthermore, modelling desiderata do not have obvious correlates in experimental design, and when we define these desiderata in a way that seems consistent with ecologists' usage, the trade-off framework falls apart. I conclude that a Levins-inspired framework for modelling does not provide the content for a similar approach to experimental practice; this does not, however, mean that it cannot provide the form. Copyright © 2016 Elsevier Ltd. All rights reserved.
When one size does not fit all: a commentary.
Jones, Sarah E
2014-01-01
This commentary explores a few of the common threads in a symposium of obesity narratives in light of the American Medical Association's classification of obesity as a disease. While the narratives illustrate the breadth of experiences, they each highlight the absence of a clear approach for the treatment of obesity, as well as the lack of conversation and compassion in the most basic of interactions with medical professionals. This could be cause for despair, yet we learn through these shared experiences that we can take control of our care and plot a course for real and lasting assistance with this condition.
2015-03-01
Defense DODAF Department of Defense Architecture Framework DOE design of experiment EMMI energy, mass, material wealth, information FNF fire and... energy or blast power (depending on the type of projectile). Tank munitions have significant penetrative ability and can cause serious damage to... survivability of ground combat vehicles during ground force maneuver operations. The simulation results indicated that the presence of air defense
Repealing Don’t Ask, Don’t Tell: Addressing the Ripple Effects
2010-02-25
and lesbians as a minority group who are fighting against discrimination. Transgender individuals are considered a “group,” and although there does... study stated transgender people in the military experience discrimination and that the Veteran’s Administration denied assistance when approached by a... integrated blacks, there was initial resistance based on discrimination.24 Prior to the integration, whites opposed the policy with vehement hostility
Learning to read aloud: A neural network approach using sparse distributed memory
NASA Technical Reports Server (NTRS)
Joglekar, Umesh Dwarkanath
1989-01-01
An attempt is described to solve the problem of text-to-phoneme mapping, which does not appear amenable to solution by standard algorithmic procedures. Experiments based on a model of distributed processing are also described. This model, sparse distributed memory (SDM), can be used in an iterative supervised learning mode to solve the problem. Additional improvements aimed at obtaining better performance are suggested.
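A minimal sparse distributed memory can be sketched in a few lines. The number of hard locations, the vector dimension, and the activation radius below are illustrative choices for demonstration, not Kanerva's canonical parameters or the configuration used in this work.

```python
import numpy as np

rng = np.random.default_rng(0)

class SDM:
    """Minimal sparse distributed memory over binary vectors."""

    def __init__(self, n_locations=200, dim=64, radius=29):
        # fixed random hard-location addresses; counters accumulate writes
        self.addresses = rng.integers(0, 2, size=(n_locations, dim))
        self.counters = np.zeros((n_locations, dim), dtype=int)
        self.radius = radius

    def _active(self, addr):
        # hard locations within Hamming radius of the cue address
        return (self.addresses != addr).sum(axis=1) <= self.radius

    def write(self, addr, data):
        # increment counters for 1-bits, decrement for 0-bits, at active locations
        act = self._active(addr)
        self.counters[act] += np.where(data == 1, 1, -1)

    def read(self, addr):
        # majority vote over the counters of all active locations
        return (self.counters[self._active(addr)].sum(axis=0) > 0).astype(int)
```

Because a write distributes the pattern over many locations and a read pools them back, recall degrades gracefully with cue noise, which is what makes the memory usable in an iterative supervised learning loop.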
A practical approach for the scale-up of roller compaction process.
Shi, Weixian; Sprockel, Omar L
2016-09-01
An alternative approach for the scale-up of ribbon formation during roller compaction was investigated, which required only one batch at the commercial scale to set the operational conditions. The scale-up of ribbon formation was based on a probability method, which was sufficient to describe the mechanism of ribbon formation at both scales. In this method, a statistical relationship between roller compaction parameters and ribbon attributes (thickness and density) was first defined with a DoE using a pilot Alexanderwerk WP120 roller compactor. While the milling speed was included in the design, it had no practical effect on granule properties within the study range despite its statistical significance. The statistical relationship was then adapted to a commercial Alexanderwerk WP200 roller compactor with one experimental run. The experimental run served as a calibration of the statistical model parameters. The proposed transfer method was then confirmed by conducting a mapping study on the Alexanderwerk WP200 using a factorial DoE, which showed a match between the predictions and the verification experiments. The study demonstrates the applicability of the roller compaction transfer method using the statistical model from the development scale calibrated with one experiment point at the commercial scale. Copyright © 2016 Elsevier B.V. All rights reserved.
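The one-run calibration idea above — keep the pilot-scale model's factor effects and re-anchor the model with a single commercial-scale batch — can be sketched as follows. The linear model form, the factor names, and all coefficients are invented for illustration; the study's actual DoE model is not reproduced here.

```python
def predict(intercept, slopes, settings):
    """Linear DoE model: response = b0 + sum(bi * xi)."""
    return intercept + sum(b * x for b, x in zip(slopes, settings))

def recalibrate_intercept(slopes, settings, observed):
    """One commercial-scale run re-anchors the intercept; the pilot-scale
    slopes (factor effects) are assumed to transfer unchanged."""
    return observed - sum(b * x for b, x in zip(slopes, settings))

# hypothetical pilot-scale effects for (roll force, roll gap):
slopes = [0.02, -0.5]
# one commercial-scale run at settings (60, 2.0) yields ribbon density 1.15:
b0_commercial = recalibrate_intercept(slopes, (60.0, 2.0), 1.15)
```

By construction the recalibrated model reproduces the single commercial-scale point exactly, and predictions at other settings then rely on the transferred slopes.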
Högden, Fabia; Hütter, Mandy; Unkelbach, Christian
2018-02-26
The role of awareness in evaluative learning has been thoroughly investigated with a variety of theoretical and methodological approaches. We investigated evaluative conditioning (EC) without awareness with an approach that conceptually provides optimal conditions for unaware learning - the Continuous Flash Suppression paradigm (CFS). In CFS, a stimulus presented to one eye can be rendered invisible for a prolonged duration by presenting a high-contrast dynamic pattern to the other eye. The suppressed stimulus is nevertheless processed. First, Experiment 1 established EC effects in a pseudo-CFS setup without suppression. Experiment 2 then employed CFS to suppress conditioned stimuli (CSs) from awareness while the unconditioned stimuli (USs) were visible. While Experiments 1 and 2 used a between-participants manipulation of CS suppression, Experiments 3 and 4 both manipulated suppression within participants. We observed EC effects when CSs were not suppressed, but found no EC effects when the CS was suppressed from awareness. We relate our finding to previous research and discuss theoretical implications for EC. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Plasticity in animal personality traits: does prior experience alter the degree of boldness?
Frost, Ashley J; Winrow-Giffen, Alexandria; Ashley, Paul J; Sneddon, Lynne U
2007-02-07
Theoreticians predict that animal 'personality' traits may be maladaptive if fixed throughout different contexts, so the present study aimed to test whether these traits are fixed or plastic. Rainbow trout (Oncorhynchus mykiss) were given emboldening or negative experiences in the forms of watching bold or shy individuals responding to novelty or winning or losing fights to examine whether prior experience affected boldness. Bold individuals that lost fights or watched shy demonstrators became more shy by increasing their latency to approach a novel object, whereas shy observers that watched bold demonstrators remained cautious and did not modify their responses to novelty. Shy winners became bolder and decreased their latency to approach a novel object, but shy losers also displayed this shift. In comparison, control groups showed no change in behaviour. Bold fishes given negative experiences reduced their boldness, which may be an adaptive response; however, shy fishes may base their strategic decisions upon self-assessment of their relative competitive ability and increase their boldness in situations where getting to resources more quickly ensures they outcompete better competitors.
Hussein, Husnah; Williams, David J; Liu, Yang
2015-07-01
A systematic design of experiments (DOE) approach was used to optimize the perfusion process of a tri-axial bioreactor designed for translational tissue engineering exploiting mechanical stimuli and mechanotransduction. Four controllable design parameters affecting the perfusion process were identified in a cause-effect diagram as potential improvement opportunities. A screening process was used to separate out the factors that have the largest impact from the insignificant ones. DOE was employed to find the settings of the platen design, return tubing configuration and the elevation difference that minimise the load on the pump and variation in the perfusion process and improve the controllability of the perfusion pressures within the prescribed limits. DOE was very effective for gaining increased knowledge of the perfusion process and optimizing the process for improved functionality. It is hypothesized that the optimized perfusion system will result in improved biological performance and consistency.
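The screening step described above — separating the few influential design parameters from the insignificant ones — is typically done with a two-level factorial design. Below is a generic sketch of effect estimation from a 2^k design; the factor count and the toy response are illustrative, not the bioreactor's actual parameters.

```python
from itertools import product

def full_factorial(k):
    """All 2^k runs in coded units (-1 = low level, +1 = high level)."""
    return list(product((-1, 1), repeat=k))

def main_effects(design, responses):
    """Main effect of factor j = mean(y at +1) - mean(y at -1)."""
    n = len(design)
    k = len(design[0])
    return [sum(x[j] * y for x, y in zip(design, responses)) / (n / 2)
            for j in range(k)]

# toy screening: hypothetical true response 10 + 3*A - 2*B, with C inert
design = full_factorial(3)
y = [10 + 3 * a - 2 * b for a, b, c in design]
effects = main_effects(design, y)  # C's effect is zero: it is screened out
```

Factors with large absolute effects are carried forward into the optimization design; those with negligible effects, like C here, are dropped.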
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romanov, A.
Many modern and most future accelerators rely on precise configuration of lattice and trajectory. The Integrable Optics Test Accelerator (IOTA) at Fermilab, which is coming to the final stages of construction, will be used to test advanced approaches to the control of particle dynamics. Various experiments planned at IOTA require high flexibility of lattice configuration as well as high precision of lattice and closed orbit control. Dense element placement does not allow an ideal configuration of diagnostics and correctors for all planned experiments. To overcome these limitations, an advanced method of lattice analysis and correction was developed that may also be beneficial for other machines. The developed algorithm is based on the LOCO approach, extended with various sets of other experimental data, such as dispersion, BPM-to-BPM phase advances, beam shape information from synchrotron light monitors, and responses of closed orbit bumps to variations of focusing elements. Extensive modeling of corrections for a large number of random seed errors is used to illustrate the benefits of the developed approach.
Investigation of wing crack formation with a combined phase-field and experimental approach
NASA Astrophysics Data System (ADS)
Lee, Sanghyun; Reber, Jacqueline E.; Hayman, Nicholas W.; Wheeler, Mary F.
2016-08-01
Fractures that propagate off of weak slip planes are known as wing cracks and often play important roles in both tectonic deformation and fluid flow across reservoir seals. Previous numerical models have produced the basic kinematics of wing crack openings but generally have not been able to capture fracture geometries seen in nature. Here we present both a phase-field modeling approach and a physical experiment using gelatin for a wing crack formation. By treating the fracture surfaces as diffusive zones instead of as discontinuities, the phase-field model does not require consideration of unpredictable rock properties or stress inhomogeneities around crack tips. It is shown by benchmarking the models with physical experiments that the numerical assumptions in the phase-field approach do not affect the final model predictions of wing crack nucleation and growth. With this study, we demonstrate that it is feasible to implement the formation of wing cracks in large scale phase-field reservoir models.
Design-of-Experiments Approach to Improving Inferior Vena Cava Filter Retrieval Rates.
Makary, Mina S; Shah, Summit H; Warhadpande, Shantanu; Vargas, Ivan G; Sarbinoff, James; Dowell, Joshua D
2017-01-01
The association of retrievable inferior vena cava filters (IVCFs) with adverse events has led to increased interest in prompt retrieval, particularly in younger patients given the progressive nature of these complications over time. This study takes a design-of-experiments (DOE) approach to investigate methods to best improve filter retrieval rates, with a particular focus on younger (<60 years) patients. A DOE approach was executed in which combinations of variables were tested to best improve retrieval rates. The impact of a virtual IVCF clinic, primary care physician (PCP) letters, and discharge instructions was investigated. The decision for filter retrieval in group 1 was determined solely by the referring physician. Group 2 included those patients prospectively followed in an IVCF virtual clinic in which filter retrieval was coordinated by the interventional radiologist when clinically appropriate. In group 3, in addition to being followed through the IVCF clinic, each patient's PCP was faxed a follow-up letter, and information regarding IVCF retrieval was added to the patient's discharge instructions. A total of 10 IVCFs (8.4%) were retrieved among 119 retrievable IVCFs placed in group 1. Implementation of the IVCF clinic in group 2 significantly improved the retrieval rate to 25.3% (23 of 91 retrievable IVCFs placed, P < .05). The addition of discharge instructions and PCP letters to the virtual clinic (group 3) resulted in a retrieval rate of 33.3% (17 of 51). The retrieval rates demonstrated more pronounced improvement when examining only younger patients, with retrieval rates of 11.3% (7 of 62), 29.5% (13 of 44, P < .05), and 45.2% (14 of 31) for groups 1, 2, and 3, respectively. DOE methodology is not routinely executed in health care, but it is an effective approach to evaluating clinical practice behavior and patient quality measures. 
In this study, implementation of the combination of a virtual clinic, PCP letters, and discharge instructions improved retrieval rates compared with a virtual clinic alone. Quality improvement strategies such as these that augment patient and referring physician knowledge on interventional radiologic procedures may ultimately improve patient safety and personalized care. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
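The group-to-group improvements reported above can be checked with a standard two-proportion z-test. The counts are taken from the abstract; the test below is a generic sketch for illustration, not necessarily the statistical method the authors used.

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for H0: p1 == p2, using the pooled standard error."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# group 1 (referrer-driven): 10 of 119 retrieved; group 2 (virtual clinic): 23 of 91
z = two_proportion_z(10, 119, 23, 91)
# z exceeds the 1.96 two-sided critical value, consistent with the reported P < .05
```

The same calculation applied to groups 2 and 3 (or to the younger-patient subgroups) quantifies the incremental effect of adding PCP letters and discharge instructions.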
Symmetry Breaking and the B3LYP Functional
NASA Technical Reports Server (NTRS)
Bauschlicher, Charles W., Jr.; Hudgins, Douglas M.; Allamandola, Louis J.; Arnold, James O. (Technical Monitor)
1999-01-01
The infrared spectra of six molecules, each of which contains a five-membered ring, and their cations are determined using density functional theory (DFT); both the B3LYP and BP86 functionals are used. The computed results are compared with the experimental spectra. For the neutral molecules, both methods are in good agreement with experiment. Even the Hartree-Fock (HF) approach is qualitatively correct for the neutrals. For the cations, the HF approach fails, as found for other organic ring systems. The B3LYP and BP86 approaches are in good mutual agreement for five of the six cation spectra, and in good agreement with experiment for four of the five cations for which experimental spectra are available. Only for the fluoranthene cation do the BP86 and B3LYP functionals yield different results: BP86 yields the expected C2v symmetry, while the B3LYP approach breaks symmetry. The experimental spectra support the BP86 results over the B3LYP, but the quality of the experimental spectra does not allow a critical evaluation of the accuracy of the BP86 approach for this difficult system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
This document contains a listing, description, and selected references for documented human radiation experiments sponsored, supported, or performed by the US Department of Energy (DOE) or its predecessors, including the US Energy Research and Development Administration (ERDA), the US Atomic Energy Commission (AEC), the Manhattan Engineer District (MED), and the Office of Scientific Research and Development (OSRD). The list represents work completed by DOE's Office of Human Radiation Experiments (OHRE) through June 1995. The experiment list is available on the Internet via a Home Page on the World Wide Web (http://www.ohre.doe.gov). The Home Page also includes the full text of Human Radiation Experiments: The Department of Energy Roadmap to the Story and the Records (DOE/EH-0445), published in February 1995, to which this publication is a supplement. This list includes experiments released at Secretary O'Leary's June 1994 press conference, as well as additional studies identified during the 12 months that followed. Cross-references are provided for experiments originally released at the press conference; for experiments released as part of The DOE Roadmap; and for experiments published in the 1986 congressional report entitled American Nuclear Guinea Pigs: Three Decades of Radiation Experiments on US Citizens. An appendix of radiation terms is also provided.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carnes, S.A.
2001-02-15
Public participation in Office of Environmental Management (EM) activities throughout the DOE complex is a critical component of the overall success of remediation and waste management efforts. The challenges facing EM and its stakeholders over the next decade or more are daunting (Nuclear Waste News 1996). Achieving a mission composed of such challenges will require innovation, dedication, and a significant degree of good will among all stakeholders. EM's efforts to date, including obtaining and using inputs offered by EM stakeholders, have been notable. Public participation specialists have accepted and met challenges and have consistently tried to improve their performance. They have reported their experiences both formally and informally (e.g., at professional conferences and EM Public Participation Network Workshops, other internal meetings of DOE and contractor public participation specialists, and one-on-one consultations) in order to advance the state of their practice. Our research, and our field research in particular (including our interactions with many representatives of numerous stakeholder groups at nine DOE sites with diverse EM problems), have shown that it is possible to develop coherent results even in a problem domain as complex as that of EM. We conclude that performance-based evaluations of public participation appear possible, and we have recommended an approach, based on combined and integrated multi-stakeholder views on the attributes of successful public participation and associated performance indicators, that seems workable and should be acceptable to diverse stakeholders. Of course, as an untested recommendation, our approach needs the validation that can only be achieved by application (perhaps at a few DOE sites with ongoing EM activities). Such an application would serve to refine the proposed approach in terms of its clarity, its workability, and its potential for full-scale use by EM and, potentially, other government agencies and private sector concerns.
Does Branding Need Web Usability? A Value-Oriented Empirical Study
NASA Astrophysics Data System (ADS)
Bolchini, Davide; Garzotto, Franca; Sorce, Fabio
Does usability of a web-based communication artifact affect brand, i.e., the set of beliefs, emotions, attitudes, or qualities that people mentally associate with the entity behind that artifact? Intuitively, the answer is “yes”: usability is a fundamental aspect of the quality of the experience with a website, and a “good” experience with a “product” or its reifications tends to translate into “good” brand perception. To date, however, the existence of a connection between web usability and brand perception has been shown only through anecdotal arguments and is not supported by published systematic research. This paper discusses a study that empirically investigates this correlation in a more rigorous, analytical, and replicable way. Our main contribution is twofold: on the one hand, we provide empirical evidence for the heuristic principle that web usability influences branding, and we do that through four between-subjects controlled experiments that involved 120 subjects. On the other hand, we inform the study with a systematic value-oriented approach to the user experience, and thus provide a conceptual framework that can be reused in other experimental settings, either for replicating our study, or for designing similar studies focusing on the correlation of web branding vs. design factors other than usability.
Li, Kui; Wang, Lei; Lv, Yanhong; Gao, Pengyu; Song, Tianxiao
2015-01-01
Getting a land vehicle’s accurate position, azimuth and attitude rapidly is significant for vehicle based weapons’ combat effectiveness. In this paper, a new approach to acquire vehicle’s accurate position and orientation is proposed. It uses biaxial optical detection platform (BODP) to aim at and lock in no less than three pre-set cooperative targets, whose accurate positions are measured beforehand. Then, it calculates the vehicle’s accurate position, azimuth and attitudes by the rough position and orientation provided by vehicle based navigation systems and no less than three couples of azimuth and pitch angles measured by BODP. The proposed approach does not depend on Global Navigation Satellite System (GNSS), thus it is autonomous and difficult to interfere with. Meanwhile, it only needs a rough position and orientation as the algorithm’s iterative initial value; consequently, it does not place high performance requirements on the Inertial Navigation System (INS), odometer and other vehicle based navigation systems, even in high-precision applications. This paper described the system’s working procedure, presented the theoretical derivation of the algorithm, and then verified its effectiveness through simulation and vehicle experiments. The simulation and experimental results indicate that the proposed approach can achieve positioning and orientation accuracy of 0.2 m and 20″ respectively in less than 3 min. PMID:26492249
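The underlying geometry — recovering position from angle measurements to targets with known positions — can be illustrated in 2-D with just two targets and azimuth-only bearings. This is a simplification of the paper's method, which uses at least three targets with both azimuth and pitch angles in 3-D; all coordinates below are invented.

```python
import math

def bearing(p, t):
    """Compass-style bearing from observer p to target t: atan2(dx, dy)."""
    return math.atan2(t[0] - p[0], t[1] - p[1])

def resect_2d(t1, t2, b1, b2):
    """Observer position from two known targets and the bearings measured to them.
    Solves t1 - r1*(sin b1, cos b1) = t2 - r2*(sin b2, cos b2) for the ranges."""
    s1, c1 = math.sin(b1), math.cos(b1)
    s2, c2 = math.sin(b2), math.cos(b2)
    det = -s1 * c2 + s2 * c1  # sin(b2 - b1); degenerate if bearings coincide
    rhs_x, rhs_y = t2[0] - t1[0], t2[1] - t1[1]
    r1 = (rhs_x * c2 - s2 * rhs_y) / det  # range to target 1 (Cramer's rule)
    return (t1[0] - r1 * s1, t1[1] - r1 * c1)
```

With three or more targets the system becomes overdetermined and is solved by least squares seeded with the rough navigation fix, which is why only a coarse initial position and orientation are needed.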
Twemlow, S W
2000-10-01
This paper demonstrates that several psychoanalytic models taken together converge to collectively explain school violence and power struggles better than each does alone. Using my own experience in doing psychoanalytically informed community intervention, I approach the problem of school violence from a combination of Adlerian, Stollerian, dialectical social systems, and Klein-Bion perspectives. This integrated model is then applied to the Columbine High School massacre in Littleton, Colorado.
Targeting the Poor: Evidence from a Field Experiment in Indonesia
Alatas, Vivi; Banerjee, Abhijit; Hanna, Rema; Olken, Benjamin A.; Tobias, Julia
2014-01-01
This paper reports an experiment in 640 Indonesian villages on three approaches to target the poor: proxy-means tests (PMT), where assets are used to predict consumption; community targeting, where villagers rank everyone from richest to poorest; and a hybrid. Defining poverty based on PPP$2 per-capita consumption, community targeting and the hybrid perform somewhat worse in identifying the poor than PMT, though not by enough to significantly affect poverty outcomes for a typical program. Elite capture does not explain these results. Instead, communities appear to apply a different concept of poverty. Consistent with this finding, community targeting results in higher satisfaction. PMID:25197099
Fourier Magnitude-Based Privacy-Preserving Clustering on Time-Series Data
NASA Astrophysics Data System (ADS)
Kim, Hea-Suk; Moon, Yang-Sae
Privacy-preserving clustering (PPC for short) is important in publishing sensitive time-series data. Previous PPC solutions, however, either fail to preserve distance orders or incur privacy breaches. To solve this problem, we propose a new PPC approach that exploits Fourier magnitudes of time-series. Our magnitude-based method does not cause a privacy breach even though its techniques or related parameters are publicly revealed. Using magnitudes only, however, incurs the distance order problem, and we thus present magnitude selection strategies to preserve as many Euclidean distance orders as possible. Through extensive experiments, we showcase the superiority of our magnitude-based approach.
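The core idea — publish only DFT magnitudes so the raw series cannot be reconstructed, while distances on magnitudes never exceed true Euclidean distances — can be sketched as follows. This is a generic illustration of the property, not the authors' implementation or their magnitude selection strategies.

```python
import numpy as np

def magnitudes(x):
    """Normalized DFT magnitudes; phases (and thus the raw series) are discarded.
    Normalizing by sqrt(n) makes Parseval's relation give ||mags|| == ||x||."""
    return np.abs(np.fft.fft(x)) / np.sqrt(len(x))

def mag_distance(x, y):
    """Distance computed on the published magnitudes; by Parseval and the
    triangle inequality it never exceeds the true Euclidean distance."""
    return float(np.linalg.norm(magnitudes(x) - magnitudes(y)))
```

Because phase is dropped, for example any circular shift of a series publishes identical magnitudes, which is the source of both the privacy guarantee and the distance order problem the paper addresses.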
In Silico Labeling: Predicting Fluorescent Labels in Unlabeled Images.
Christiansen, Eric M; Yang, Samuel J; Ando, D Michael; Javaherian, Ashkan; Skibinski, Gaia; Lipnick, Scott; Mount, Elliot; O'Neil, Alison; Shah, Kevan; Lee, Alicia K; Goyal, Piyush; Fedus, William; Poplin, Ryan; Esteva, Andre; Berndl, Marc; Rubin, Lee L; Nelson, Philip; Finkbeiner, Steven
2018-04-19
Microscopy is a central method in life sciences. Many popular methods, such as antibody labeling, are used to add physical fluorescent labels to specific cellular constituents. However, these approaches have significant drawbacks, including inconsistency; limitations in the number of simultaneous labels because of spectral overlap; and necessary perturbations of the experiment, such as fixing the cells, to generate the measurement. Here, we show that a computational machine-learning approach, which we call "in silico labeling" (ISL), reliably predicts some fluorescent labels from transmitted-light images of unlabeled fixed or live biological samples. ISL predicts a range of labels, such as those for nuclei, cell type (e.g., neural), and cell state (e.g., cell death). Because prediction happens in silico, the method is consistent, is not limited by spectral overlap, and does not disturb the experiment. ISL generates biological measurements that would otherwise be problematic or impossible to acquire. Copyright © 2018 Elsevier Inc. All rights reserved.
Optical cuff for optogenetic control of the peripheral nervous system
NASA Astrophysics Data System (ADS)
Michoud, Frédéric; Sottas, Loïc; Browne, Liam E.; Asboth, Léonie; Latremoliere, Alban; Sakuma, Miyuki; Courtine, Grégoire; Woolf, Clifford J.; Lacour, Stéphanie P.
2018-02-01
Objective. Nerves in the peripheral nervous system (PNS) contain axons with specific motor, somatosensory and autonomic functions. Optogenetics offers an efficient approach to selectively activate axons within the nerve. However, the heterogeneous nature of nerves and their tortuous route through the body create a challenging environment in which to reliably implant a light delivery interface. Approach. Here, we propose an optical peripheral nerve interface, an optocuff, so that optogenetic modulation of peripheral nerves becomes possible in freely behaving mice. Main results. Using this optocuff, we demonstrate orderly recruitment of motor units with epineural optical stimulation of genetically targeted sciatic nerve axons, both in anaesthetized and in awake, freely behaving animals. Behavioural experiments and histology show that the optocuff does not damage the nerve and is thus suitable for long-term experiments. Significance. These results suggest that the soft optocuff might be a straightforward and efficient tool to support more extensive study of the PNS using optogenetics.
Mirage is an image in a flat ground surface.
Tavassoly, M Taghi; Osanloo, Soghra; Salehpour, Ali
2015-04-01
Mirage is a fascinating phenomenon that has attracted many scientists to report their observations and descriptions of it. There are two different approaches to mirage formation. The more popular one attributes it to total internal reflection occurring in the near-ground air layers on hot sunny days. According to the other approach, a mirage is an image in a rough surface that is observed at grazing angles of incidence. Most of the existing descriptions are qualitative, and some include calculations based on guessed temperature profiles with no concrete experiments. In this report, we first show that Fermat's principle also concerns the wave nature of light and covers the constructive and destructive interference that is essential for image formation. Then, we provide a brief review of the theory of image formation in a rough plane and demonstrate by experiments in the lab and in deserts that the temperature gradient in the near-ground air layers does not lead to mirage formation.
Low-cost Active Structural Control Space Experiment (LASC)
NASA Technical Reports Server (NTRS)
Robinett, Rush; Bukley, Angelia P.
1992-01-01
The DOE Lab Director's Conference identified the need for the DOE National Laboratories to actively and aggressively pursue ways to apply DOE technology to problems of national need. Space structures are key elements of DOD and NASA space systems and a space technology area in which DOE can have a significant impact. LASC is a joint agency space technology experiment (DOD Phillips, NASA Marshall, and DOE Sandia). The topics are presented in viewgraph form and include the following: phase 4 investigator testbed; control of large flexible structures in orbit; INFLEX; Controls, Astrophysics, and Structures Experiments in Space; SARSAT; and LASC mission objectives.
Hu, Meng; Krauss, Martin; Brack, Werner; Schulze, Tobias
2016-11-01
Liquid chromatography-high resolution mass spectrometry (LC-HRMS) is a well-established technique for nontarget screening of contaminants in complex environmental samples. Automatic peak detection is essential, but its performance has only rarely been assessed and optimized so far. With the aim to fill this gap, we used pristine water extracts spiked with 78 contaminants as a test case to evaluate and optimize chromatogram and spectral data processing. To assess whether data acquisition strategies have a significant impact on peak detection, three values of MS cycle time (CT) of an LTQ Orbitrap instrument were tested. Furthermore, the key parameter settings of the data processing software MZmine 2 were optimized to detect the maximum number of target peaks from the samples by the design of experiments (DoE) approach and compared to a manual evaluation. The results indicate that short CT significantly improves the quality of automatic peak detection, which means that full scan acquisition without additional MS2 experiments is suggested for nontarget screening. MZmine 2 detected 75-100% of the peaks compared to manual peak detection at an intensity level of 10^5 in a validation dataset on both spiked and real water samples under optimal parameter settings. Finally, we provide an optimization workflow of MZmine 2 for LC-HRMS data processing that is applicable to environmental samples for nontarget screening. The results also show that the DoE approach is useful and effort-saving for optimizing data processing parameters.
NASA Astrophysics Data System (ADS)
Parfenov, D. I.; Bolodurina, I. P.
2018-05-01
The article presents the results of developing an approach to detecting and protecting against network attacks on corporate infrastructure deployed on a multi-cloud platform. The proposed approach is based on the combination of two technologies: a software-configurable network and virtualization of network functions. The approach to searching for anomalous traffic is to use a hybrid neural network consisting of a self-organizing Kohonen network and a multilayer perceptron. The study of the work of the prototype attack-detection system, the method of forming a learning sample, and the course of the experiments are described. The study showed that the proposed approach makes it possible to increase the effectiveness of detecting various types of attacks while not reducing the performance of the network.
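The anomaly-scoring stage of such a hybrid can be illustrated with a tiny self-organizing (Kohonen) map: prototypes learn the structure of normal traffic, and a large distance to the nearest prototype flags a flow for the downstream classifier. Everything below (map size, learning schedule, 2-D toy features) is an invented sketch, not the system described in the article.

```python
import numpy as np

rng = np.random.default_rng(1)

def train_som(data, n_units=4, epochs=60, lr=0.5):
    """1-D Kohonen map trained on 'normal' feature vectors."""
    w = data[rng.choice(len(data), n_units, replace=False)].astype(float)
    for e in range(epochs):
        frac = 1.0 - e / epochs
        sigma = max(0.5, (n_units / 2.0) * frac)  # shrinking neighborhood width
        alpha = lr * frac                         # decaying learning rate
        for x in data:
            winner = np.linalg.norm(w - x, axis=1).argmin()
            # neighborhood function pulls units near the winner toward the sample
            h = np.exp(-((np.arange(n_units) - winner) ** 2) / (2 * sigma ** 2))
            w += alpha * h[:, None] * (x - w)
    return w

def anomaly_score(w, x):
    """Distance to the nearest prototype; large values flag suspect traffic."""
    return float(np.linalg.norm(w - x, axis=1).min())
```

In the full hybrid, flows with high scores would be passed (together with the map activations) to a multilayer perceptron for classification into attack types.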
Adaptive zooming in X-ray computed tomography.
Dabravolski, Andrei; Batenburg, Kees Joost; Sijbers, Jan
2014-01-01
In computed tomography (CT), the source-detector system commonly rotates around the object in a circular trajectory. Such a trajectory does not allow the detector to be exploited fully when scanning elongated objects. The aim is to increase the spatial resolution of the reconstructed image through optimal zooming during scanning. A new approach is proposed in which the full width of the detector is exploited for every projection angle. This approach uses prior information about the object's convex hull to move the source as close as possible to the object while avoiding truncation of the projections. Experiments show that the proposed approach can significantly improve reconstruction quality, producing reconstructions with smaller errors and revealing more details in the object. The proposed approach can lead to more accurate reconstructions and increased spatial resolution compared to the conventional circular trajectory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, C.T.
1994-03-01
This paper presents a comparison of several qualitatively different approaches to Total Quality Management (TQM). The continuum ranges from management approaches that are primarily standards -- with specific guidelines but few theoretical concepts -- to approaches that are primarily philosophical, with few specific guidelines. The approaches to TQM discussed in this paper include the International Organization for Standardization (ISO) 9000 standard, the Malcolm Baldrige National Quality Award, Senge's concept of the learning organization, Watkins and Marsick's approach to organizational learning, Covey's Seven Habits of Highly Effective People, and Deming's Fourteen Points for Management. Some of these approaches (Deming and ISO 9000) are then compared to the DOE's official position on quality management and conduct of operations (DOE Orders 5700.6C and 5480.19). Using a tabular format, it is shown that while Order 5700.6C (Quality Assurance) maps well to many of the current approaches to TQM, DOE's principal guide to management, Order 5480.19 (Conduct of Operations), has many significant conflicts with some of the modern approaches to continuous quality improvement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Copeland, Alex; Brown, C. Titus
2011-10-13
DOE JGI's Alex Copeland on "DOE JGI Quality Metrics" and Michigan State University's C. Titus Brown on "Approaches to Scaling and Improving Metagenome Assembly" at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.
Analysis on Flexural Strength of A36 Mild Steel by Design of Experiment (DOE)
NASA Astrophysics Data System (ADS)
Nurulhuda, A.; Hafizzal, Y.; Izzuddin, MZM; Sulawati, MRN; Rafidah, A.; Suhaila, Y.; Fauziah, AR
2017-08-01
Demand for high-quality, reliable components and materials is increasing, so flexural tests have become a vital test method in both research and manufacturing process development, describing in detail a material's ability to withstand deformation under load. There has been little research on the effects of thickness, welding type, and joint design on flexural behaviour using the DOE approach, and the flexural strength of mild steel is not well documented; this research addresses that gap. Using Design of Experiments (DOE), a full factorial design with two replications was used to study the effects of the important parameters: welding type, thickness, and joint design. The output response was flexural strength. Randomized experiments were conducted based on a run table generated with Minitab software. A normality test using the Anderson-Darling statistic gave a P-value < 0.005, so the data are not normal: there is a significant difference between the actual data and the ideal data. According to the ANOVA, only the joint design factor is significant, with a P-value below 0.05. From the main-effects and interaction plots, the recommended settings are the high level for welding type, the high level for thickness, and the low level for joint design. A prediction model was developed through regression to quantify the effect on the output response of any change in parameter settings. In the future, the experiments can be extended using Taguchi methods to verify the results.
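As a rough sketch of the full factorial analysis described above, the following computes main effects from a hypothetical 2^3 design with two replications. The strength values are invented to mirror the reported outcome (joint design dominant, low level preferable); they are not the study's data.

```python
import itertools
import numpy as np

# Hypothetical 2^3 full factorial with two replications (coded -1/+1 levels),
# factors: welding type (A), thickness (B), joint design (C); response = flexural strength (MPa).
runs = list(itertools.product([-1, 1], repeat=3))
rep1 = {(-1, -1, -1): 402, (-1, -1, 1): 355, (-1, 1, -1): 410, (-1, 1, 1): 348,
        (1, -1, -1): 405, (1, -1, 1): 352, (1, 1, -1): 412, (1, 1, 1): 350}
rep2 = {k: v + 4 for k, v in rep1.items()}  # small replication offset

def main_effect(factor_idx):
    """Average response at the +1 level minus average response at the -1 level."""
    hi = [r[run] for run in runs for r in (rep1, rep2) if run[factor_idx] == 1]
    lo = [r[run] for run in runs for r in (rep1, rep2) if run[factor_idx] == -1]
    return float(np.mean(hi) - np.mean(lo))

effects = {name: main_effect(i) for i, name in enumerate("ABC")}
print({k: round(v, 1) for k, v in effects.items()})
```

The large negative effect of C means the low level of joint design gives higher strength, matching the recommended setting in the abstract; an ANOVA would then test each effect against the replication error.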
Reed Johnson, F; Lancsar, Emily; Marshall, Deborah; Kilambi, Vikram; Mühlbacher, Axel; Regier, Dean A; Bresnahan, Brian W; Kanninen, Barbara; Bridges, John F P
2013-01-01
Stated-preference methods are a class of evaluation techniques for studying the preferences of patients and other stakeholders. While these methods span a variety of techniques, conjoint-analysis methods, and particularly discrete-choice experiments (DCEs), have become the most frequently applied approach in health care in recent years. Experimental design is an important stage in the development of such methods, but establishing a consensus on standards is hampered by a lack of understanding of available techniques and software. This report builds on the previous ISPOR Conjoint Analysis Task Force Report: Conjoint Analysis Applications in Health-A Checklist: A Report of the ISPOR Good Research Practices for Conjoint Analysis Task Force. This report aims to assist researchers specifically in evaluating alternative approaches to experimental design, a difficult and important element of successful DCEs. While this report does not endorse any specific approach, it does provide a guide for choosing an approach that is appropriate for a particular study. In particular, it provides an overview of the role of experimental designs for the successful implementation of the DCE approach in health care studies, and it provides researchers with an introduction to constructing experimental designs on the basis of study objectives and the statistical model researchers have selected for the study. The report outlines the theoretical requirements for designs that identify choice-model preference parameters and summarizes and compares a number of available approaches for constructing experimental designs. The task-force leadership group met via bimonthly teleconferences and in person at ISPOR meetings in the United States and Europe. An international group of experimental-design experts was consulted during this process to discuss existing approaches for experimental design and to review the task force's draft reports.
In addition, ISPOR members contributed to developing a consensus report by submitting written comments during the review process and oral comments during two forum presentations at the ISPOR 16th and 17th Annual International Meetings held in Baltimore (2011) and Washington, DC (2012). Copyright © 2013 International Society for Pharmacoeconomics and Outcomes Research (ISPOR). Published by Elsevier Inc. All rights reserved.
Modeling the Test-Retest Statistics of a Localization Experiment in the Full Horizontal Plane.
Morsnowski, André; Maune, Steffen
2016-10-01
Two approaches to modeling the test-retest statistics of a localization experiment, one based on a Gaussian distribution and one on surrogate data, are introduced. Their efficiency is investigated using different measures describing directional hearing ability. A localization experiment in the full horizontal plane is a challenging task for hearing-impaired patients. In clinical routine, we use this experiment to evaluate the progress of our cochlear implant (CI) recipients. Listening and time effort limit the reproducibility. The localization experiment consists of a circle of 12 loudspeakers placed in an anechoic room, a "camera silens". In darkness, HSM sentences are presented at 65 dB pseudo-randomly from all 12 directions with five repetitions. This experiment is modeled by a set of Gaussian distributions with different standard deviations added to a perfect estimator, as well as by surrogate data. Five repetitions per direction are used to produce surrogate-data distributions for the sensation directions. To investigate the statistics, we retrospectively use the data of 33 CI patients with 92 pairs of test-retest measurements from the same day. The first model does not take inversions into account (i.e., permutations of the direction from back to front and vice versa), although they are common for hearing-impaired persons, particularly in the rear hemisphere. The second model considers these inversions but does not work with all measures. The introduced models successfully describe the test-retest statistics of directional hearing. However, since their applications on the investigated measures perform differently, no general recommendation can be provided. The presented test-retest statistics enable pair-test comparisons for localization experiments.
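The Gaussian model described above can be sketched as follows, assuming 12 loudspeakers at 30° spacing and ignoring inversions, as in the paper's first model. The sigma value is an illustrative stand-in for a listener's localization accuracy, not a fitted parameter.

```python
import math
import random

random.seed(1)
SPEAKERS = [i * 30 for i in range(12)]  # azimuths in degrees, full horizontal plane

def angular_error(a, b):
    """Smallest absolute angle between two azimuths (0..180 degrees)."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def simulate_run(sigma, repetitions=5):
    """Gaussian model: sensation = presentation + N(0, sigma), snapped to the nearest speaker."""
    errors = []
    for target in SPEAKERS:
        for _ in range(repetitions):
            heard = target + random.gauss(0, sigma)
            nearest = min(SPEAKERS, key=lambda s: angular_error(s, heard % 360))
            errors.append(angular_error(nearest, target))
    return sum(e * e for e in errors) / len(errors)  # mean squared localization error

# Test-retest: two simulated sessions with the same assumed listener accuracy.
mse_test, mse_retest = simulate_run(sigma=20), simulate_run(sigma=20)
print(round(math.sqrt(mse_test), 1), round(math.sqrt(mse_retest), 1))
```

Repeating such simulations many times yields a distribution of test-retest differences under the model, against which a patient's observed retest change can be compared.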
From leader to leadership: clinician managers and where to next?
Fulop, Liz; Day, Gary E
2010-08-01
Individual clinician leadership is at the forefront of health reforms in Australia as well as overseas, with many programs run by health departments (and hospitals) generally focusing on the development of individual leaders. This paper argues, along with others, that leadership in the clinician management context cannot be understood from an individualistic approach alone. Clinician managers, especially in the ranks of doctors, are usually described as 'hybrid-professional managers' as well as reluctant leaders for whom most leadership theories do not easily apply. Their experiences of leadership development programs run by health departments both in Australia and internationally are likely to be based on the individual leader-focussed approach that is driving health care reforms. These approaches work from three key assumptions: (1) study and fix the person; (2) give them a position or title; and (3) make them responsible for results. Some would argue that the combination of these three approaches equates to heroic and transformational leadership. Several alternative approaches to leadership development are presented to illustrate how reforms in healthcare, and notably in hospitals, must incorporate alternatives such as those based on collective and relational forms of leadership. This does not mean eschewing individual approaches to leadership but rather thinking of them differently and making them more relevant to the daily experiences of clinician managers. We conclude by highlighting several significant challenges facing leadership development for clinician managers that arise from these considerations.
What you do every day matters: A new direction for health promotion.
Gewurtz, Rebecca E; Moll, Sandra E; Letts, Lori J; Larivière, Nadine; Levasseur, Mélanie; Krupa, Terry M
2016-08-15
Canadian health promotion campaigns directed towards healthy living have traditionally emphasized discrete behaviours that influence health and wellbeing, such as diet, physical activity and smoking. Although this traditional approach is important and supported by evidence, it does not account for broader determinants of health. The purpose of this commentary is to propose an innovative health promotion approach that expands the healthy living discourse through a focus on patterns of daily activity. We highlight four key public health messages derived from a synthesis of existing research evidence. The messages are based on the premise that what you do every day has an important impact on health and well-being. Rather than being prescriptive or outlining minimum requirements, this approach invites reflection on various experiences and activity patterns that shape the health and well-being of individuals and communities. This broader and more inclusive approach to healthy living reflects diverse needs and experiences, making it relevant and attainable for people of all ages and abilities. Future efforts directed at operationalizing the key messages for individuals and communities hold much promise for populations that may be at risk of activity patterns believed to contribute to poor health and well-being.
NASA Technical Reports Server (NTRS)
Hoge, F. E.; Swift, R. N.
1983-01-01
Airborne lidar oil spill experiments carried out to determine the practicability of the AOFSCE (absolute oil fluorescence spectral conversion efficiency) computational model are described. The results reveal that the model is suitable over a considerable range of oil film thicknesses provided the fluorescence efficiency of the oil does not approach the minimum detection sensitivity limitations of the lidar system. Separate airborne lidar experiments to demonstrate measurement of the water column Raman conversion efficiency are also conducted to ascertain the ultimate feasibility of converting such relative oil fluorescence to absolute values. Whereas the AOFSCE model is seen as highly promising, further airborne water column Raman conversion efficiency experiments with improved temporal or depth-resolved waveform calibration and software deconvolution techniques are thought necessary for a final determination of suitability.
Makrakis, Vassilios; Kostoulas-Makrakis, Nelly
2016-02-01
Quantitative and qualitative approaches to planning and evaluation in education for sustainable development have often been treated by practitioners from a single research paradigm. This paper discusses the utility of mixed method evaluation designs which integrate qualitative and quantitative data through a sequential transformative process. Sequential mixed method data collection strategies involve collecting data in an iterative process whereby data collected in one phase contribute to data collected in the next. This is done through examples from a programme addressing the 'Reorientation of University Curricula to Address Sustainability (RUCAS): A European Commission Tempus-funded Programme'. It is argued that the two approaches are complementary and that there are significant gains from combining both. Using methods from both research paradigms does not, however, mean that the inherent differences among epistemologies and methodologies should be neglected. Based on this experience, it is recommended that using a sequential transformative mixed method evaluation can produce more robust results than could be accomplished using a single approach in programme planning and evaluation focussed on education for sustainable development. Copyright © 2015 Elsevier Ltd. All rights reserved.
Noonan, Robert J; Fairclough, Stuart J; Knowles, Zoe R; Boddy, Lynne M
2017-07-14
Understanding family physical activity (PA) behaviour is essential for designing effective family-based PA interventions. However, effective approaches to capture the perceptions and "lived experiences" of families are not yet well established. The aims of the study were to: (1) demonstrate how a "write, draw, show and tell" (WDST) methodological approach can be appropriate to family-based PA research, and (2) present two distinct family case studies to provide insights into the habitual PA behaviour and experiences of a nuclear and single-parent family. Six participants (including two "target" children aged 9-11 years, two mothers and two siblings aged 6-8 years) from two families were purposefully selected to take part in the study, based on their family structure. Participants completed a paper-based PA diary and wore an ActiGraph GT9X accelerometer on their left wrist for up to 10 weekdays and 16 weekend days. A range of WDST tasks were then undertaken by each family to offer contextual insight into their family-based PA. The selected families participated in different levels and modes of PA, and reported contrasting leisure opportunities and experiences. These novel findings encourage researchers to tailor family-based PA intervention programmes to the characteristics of the family.
Maeda, Jin; Suzuki, Tatsuya; Takayama, Kozo
2012-01-01
Design spaces for multiple dose strengths of tablets were constructed using a Bayesian estimation method with one set of design of experiments (DoE) of only the highest dose-strength tablet. The lubricant blending process for theophylline tablets with dose strengths of 100, 50, and 25 mg is used as a model manufacturing process in order to construct design spaces. The DoE was conducted using various Froude numbers (X(1)) and blending times (X(2)) for theophylline 100-mg tablet. The response surfaces, design space, and their reliability of the compression rate of the powder mixture (Y(1)), tablet hardness (Y(2)), and dissolution rate (Y(3)) of the 100-mg tablet were calculated using multivariate spline interpolation, a bootstrap resampling technique, and self-organizing map clustering. Three experiments under an optimal condition and two experiments under other conditions were performed using 50- and 25-mg tablets, respectively. The response surfaces of the highest-strength tablet were corrected to those of the lower-strength tablets by Bayesian estimation using the manufacturing data of the lower-strength tablets. Experiments under three additional sets of conditions of lower-strength tablets showed that the corrected design space made it possible to predict the quality of lower-strength tablets more precisely than the design space of the highest-strength tablet. This approach is useful for constructing design spaces of tablets with multiple strengths.
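The core idea above, correcting a prediction from the highest-dose model with a few lower-dose observations, can be illustrated with a simple conjugate-normal Bayesian update. This is a deliberate simplification of the paper's spline/bootstrap machinery, and all numbers are hypothetical.

```python
def bayes_update_normal(prior_mean, prior_var, observations, obs_var):
    """Conjugate normal update: posterior mean/variance for an unknown response mean."""
    n = len(observations)
    obs_mean = sum(observations) / n
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)
    post_mean = post_var * (prior_mean / prior_var + n * obs_mean / obs_var)
    return post_mean, post_var

# Prior: tablet hardness (N) predicted at this blending condition by the
# 100-mg response surface; observations: three runs with the 50-mg tablet.
prior_mean, prior_var = 60.0, 25.0
observations, obs_var = [52.0, 54.0, 53.0], 9.0
post_mean, post_var = bayes_update_normal(prior_mean, prior_var, observations, obs_var)
print(round(post_mean, 2), round(post_var, 2))
```

The posterior mean is pulled from the high-dose prediction toward the lower-dose data, and the posterior variance shrinks below both the prior and observation variances, which is why a handful of lower-strength runs suffices to correct the design space.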
Feeser, Melanie; Fan, Yan; Weigand, Anne; Hahn, Adam; Gärtner, Matti; Aust, Sabine; Böker, Heinz; Bajbouj, Malek; Grimm, Simone
2014-12-01
Previous studies have shown that oxytocin (OXT) enhances social cognitive processes. It has also been demonstrated that OXT does not uniformly facilitate social cognition. The effects of OXT administration strongly depend on the exposure to stressful experiences in early life. Emotional facial recognition is crucial for social cognition. However, no study has yet examined how the effects of OXT on the ability to identify emotional faces are altered by early life stress (ELS) experiences. Given the role of OXT in modulating social motivational processes, we specifically aimed to investigate its effects on the recognition of approach- and avoidance-related facial emotions. In a double-blind, between-subjects, placebo-controlled design, 82 male participants performed an emotion recognition task with faces taken from the "Karolinska Directed Emotional Faces" set. We clustered the six basic emotions along the dimensions approach (happy, surprise, anger) and avoidance (fear, sadness, disgust). ELS was assessed with the Childhood Trauma Questionnaire (CTQ). Our results showed that OXT improved the ability to recognize avoidance-related emotional faces as compared to approach-related emotional faces. Whereas the performance for avoidance-related emotions in participants with higher ELS scores was comparable in both OXT and placebo condition, OXT enhanced emotion recognition in participants with lower ELS scores. Independent of OXT administration, we observed increased emotion recognition for avoidance-related faces in participants with high ELS scores. Our findings suggest that the investigation of OXT on social recognition requires a broad approach that takes ELS experiences as well as motivational processes into account.
Bowman, Margo; Treiman, Rebecca
2002-08-01
From an early age, children can go beyond rote memorization to form links between print and speech that are based on letter names in the initial positions of words (Treiman & Rodriguez, 1999; Treiman, Sotak, & Bowman, 2001). For example, children's knowledge of the name of the letter t helps them learn that the novel word TM is pronounced as team. Four experiments were carried out to determine whether letter names at the ends of words are equally useful. Four- and five-year-olds derived little benefit from such information in reading (Experiments 1 and 3) or spelling (Experiment 2), although adults did (Experiment 4). For young children, word-final information appears to have less influence on reading and spelling performance than does word-initial information. The results help delineate the circumstances under which children can go beyond a logographic approach in learning about print.
Rapp, David E; Lyon, Mark B; Orvieto, Marcelo A; Zagaja, Gregory P
2005-10-01
The classical approach to the undergraduate medical clerkship has several limitations, including variability of clinical exposure and method of examination. As a result, the clerkship experience does not ensure exposure to and reinforcement of the fundamental concepts of a given specialty. This article reviews the classic approach to clerkship education within the undergraduate medical education. Specific attention is placed on clinical exposure and clerkship examination. We describe the introduction of the Core Learning Objective (CLO) educational model at the University of Chicago Section of Urology. This model is designed to provide an efficient exposure to and evaluation of core clerkship learning objectives. The CLO model has been successfully initiated, focusing on both technical and clinical skill sets. The proposed model has been introduced with positive initial results and should allow for an efficient approach to the teaching and evaluation of core objectives in clerkship education.
A functional architecture of the human brain: Emerging insights from the science of emotion
Lindquist, Kristen A.; Barrett, Lisa Feldman
2012-01-01
The ‘faculty psychology’ approach to the mind, which attempts to explain mental function in terms of categories that reflect modular ‘faculties’, such as emotions, cognitions, and perceptions, has dominated research into the mind and its physical correlates. In this paper, we argue that brain organization does not respect the commonsense categories belonging to the faculty psychology approach. We review recent research from the science of emotion demonstrating that the human brain contains broadly distributed functional networks that can each be re-described as basic psychological operations that interact to produce a range of mental states, including, but not limited to, anger, sadness, fear, disgust, and so on. When compared to the faculty psychology approach, this ‘constructionist’ approach provides an alternative functional architecture to guide the design and interpretation of experiments in cognitive neuroscience. PMID:23036719
Carvalho, Rimenys J; Cruz, Thayana A
2018-01-01
High-throughput screening (HTS) systems have emerged as important tools for fast, low-cost evaluation of several conditions at once, since they require small quantities of material and small sample volumes. These characteristics are extremely valuable for experiments with a large number of variables, enabling the application of design of experiments (DoE) strategies or simpler experimental planning approaches. Once the capacity of HTS systems to mimic chromatographic purification steps was established, several studies were performed successfully, including scaled-down purification. Here, we propose a method for studying different purification conditions that can be used for any recombinant protein, including complex and glycosylated proteins, using low-binding filter microplates.
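A minimal sketch of the experimental-planning side of such a screen: generate a full factorial plan and map it onto 96-well plate coordinates. The factor names and levels are invented for illustration, not taken from the protocol.

```python
import itertools

# Hypothetical screening factors for a filter-microplate purification study.
factors = {
    "pH": [5.0, 6.0, 7.0, 8.0],
    "NaCl_mM": [0, 150, 500],
    "resin": ["A", "B"],
}

# Full factorial plan: every combination of levels, one well per run.
plan = [dict(zip(factors, combo)) for combo in itertools.product(*factors.values())]

# Map runs to 96-well coordinates (A1..H12, row-major).
rows, cols = "ABCDEFGH", range(1, 13)
wells = [f"{r}{c}" for r in rows for c in cols]
layout = dict(zip(wells, plan))
print(len(plan), "runs; first well A1:", layout["A1"])
```

With 24 runs the screen fits in a quarter of one plate, which is the economy the abstract highlights; a fractional design would shrink it further when more factors are added.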
Tailoring Systems Engineering Projects for Small Satellite Missions
NASA Technical Reports Server (NTRS)
Horan, Stephen; Belvin, Keith
2013-01-01
NASA maintains excellence in its spaceflight systems by utilizing rigorous engineering processes based on over 50 years of experience. The NASA systems engineering process for flight projects described in NPR 7120.5E was initially developed for major flight projects. The design and development of low-cost small satellite systems does not entail the financial and risk consequences traditionally associated with spaceflight projects. Consequently, an approach is offered for tailoring the processes such that small satellite missions will benefit from the engineering rigor without overly burdensome overhead. In this paper we outline the approaches to tailoring the standard processes for these small missions and describe how they would be applied in a proposed small satellite mission.
Elmer-Dixon, Margaret M; Bowler, Bruce E
2018-05-19
A novel approach to quantifying mixed lipid systems is described. Traditional approaches to lipid vesicle quantification are time consuming, require large amounts of material and are destructive. We extend our recently described method for quantification of pure lipid systems to mixed lipid systems. The method only requires a UV-Vis spectrometer and does not destroy the sample. Mie scattering data from absorbance measurements are used as input to a Matlab program that calculates the total vesicle concentration and the concentration of each lipid in the mixed lipid system. The technique is fast and accurate, which is essential for analytical lipid-binding experiments. Copyright © 2018. Published by Elsevier Inc.
Willecke, N; Szepes, A; Wunderlich, M; Remon, J P; Vervaet, C; De Beer, T
2018-04-21
The overall objective of this work is to understand how excipient characteristics influence the drug product quality attributes and process performance of a continuous twin-screw wet granulation process. The knowledge gained in this study is intended to be used for Quality by Design (QbD)-based formulation design and formulation optimization. Three principal components representing the overarching properties of 8 selected pharmaceutical fillers were used as factors, whereas factors 4 and 5 represented binder type and binder concentration in a design of experiments (DoE). The majority of process parameters were kept constant to minimize their influence on granule and drug product quality. 27 DoE batches consisting of binary filler/binder mixtures were processed via continuous twin-screw wet granulation followed by tablet compression. Multiple linear regression models were built, providing understanding of the impact of filler and binder properties on granule and tablet quality attributes (i.e. 16 DoE responses). The impact of fillers on the granule and tablet responses was more dominant than the impact of binder type and concentration. The filler properties had a relevant effect on granule characteristics, such as particle size, friability and specific surface area. Binder type and concentration had a relevant influence on granule flowability and friability as well as on compactability (the compression force required during tableting to obtain target hardness). To evaluate the validity of the DoE models, they were verified with new formulations (i.e. new combinations of filler, binder type and binder concentration) that were not included in the dataset used to build the models. The combined PCA (principal component analysis)/DoE approach made it possible to link the excipient properties with the drug product quality attributes. Copyright © 2018 Elsevier B.V. All rights reserved.
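The combined PCA/DoE idea can be sketched in a few lines: reduce an excipient-property matrix to principal components, then regress a response on the leading components. The property matrix and response here are synthetic stand-ins, not the study's measurements.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical property matrix: 8 fillers x 5 measured properties
# (e.g. particle size, density, surface area, moisture, flow index).
props = rng.normal(size=(8, 5))

# PCA via SVD on the column-centered matrix.
centered = props - props.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
scores = U * S                      # principal-component scores per filler
explained = S**2 / np.sum(S**2)     # variance ratio per component

# Use the first three PCs as DoE factors and fit a linear model to a
# synthetic granule response, mimicking the multiple-linear-regression step.
X = np.column_stack([np.ones(8), scores[:, :3]])
response = scores[:, 0] * 2.0 + rng.normal(scale=0.1, size=8)  # toy response
coef, *_ = np.linalg.lstsq(X, response, rcond=None)
print("variance explained by PC1-3:", round(float(explained[:3].sum()), 2))
```

Because the PC scores are orthogonal, the regression cleanly attributes the response to PC1 here; in the study, binder type and concentration would enter as two additional DoE factors alongside the filler PCs.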
Dos Santos Mesquita, Cristina; da Costa Maia, Ângela
2016-12-01
Psychiatric patients report higher levels of victimisation and are at risk of further victimisation in different contexts, such as psychiatric institutions. Studies in this field tend to focus on hospital staff as victims experiencing classic forms of victimisation (e.g. physical assault, threats, verbal abuse), through qualitative studies. This quantitative retrospective study aims to determine the occurrence of psychiatric victimisation and other adverse experiences among Portuguese psychiatric patients. Ninety-five psychiatric patients, between 20 and 79 years old (M = 45.18, SD = 13.06), with a history of psychiatric hospitalisation answered the Experiences in Psychiatric Institution Inventory. Participants were recruited in four psychiatric hospitals. Inpatients were approached during their hospitalisation; outpatients were approached on scheduled appointment days. Only 23 (24.2%) participants reported no victimisation. Total Experiences of Self varied from 0 to 7 (M = 1.75, SD = 1.72), Total Witnessed Experiences varied from 0 to 7 (M = 1.17, SD = 1.64), and Total Global Experiences varied from 0 to 14 (M = 2.92, SD = 3.01). These results show that victimisation and adverse experiences in psychiatric contexts are frequent and go beyond classic forms of victimisation. A deeper knowledge of these experiences and their impact on the mental health of psychiatric patients may improve the quality of care provided and lead to more effective treatments, thus reducing the number and length of hospitalisations and the financial burden on public health services. © 2016 Nordic College of Caring Science.
Grewe, Oliver; Nagel, Frederik; Kopiez, Reinhard; Altenmüller, Eckart
2007-11-01
Most people are able to identify basic emotions expressed in music and experience affective reactions to music. But does music generally induce emotion? Does it elicit subjective feelings, physiological arousal, and motor reactions reliably in different individuals? In this interdisciplinary study, measurement of skin conductance, facial muscle activity, and self-monitoring were synchronized with musical stimuli. A group of 38 participants listened to classical, rock, and pop music and reported their feelings in a two-dimensional emotion space during listening. The first entrance of a solo voice or choir and the beginning of new sections were found to elicit interindividual changes in subjective feelings and physiological arousal. Quincy Jones' "Bossa Nova" motivated movement and laughing in more than half of the participants. Bodily reactions such as "goose bumps" and "shivers" could be stimulated by the "Tuba Mirum" from Mozart's Requiem in 7 of 38 participants. In addition, the authors repeated the experiment seven times with one participant to examine intraindividual stability of effects. This exploratory combination of approaches throws a new light on the astonishing complexity of affective music listening.
An Inquiry-Based Approach to Study the Synapse: Student-Driven Experiments Using C. elegans
Lemons, Michele L.
2016-01-01
Inquiry-based instruction has been well demonstrated to enhance long-term retention and to improve application and synthesis of knowledge. Here we describe an inquiry-based teaching module that trains undergraduates as scientists who pose questions, design and execute hypothesis-driven experiments, analyze data and communicate their research findings. Before students design their research projects, they learn and practice several research techniques with the model organism, Caenorhabditis elegans. This nematode is an ideal choice for experimentation in an undergraduate lab due to its powerful genetics, ease and low cost of maintenance, and amenability for undergraduate training. Students are challenged to characterize an instructor-assigned “mystery mutant” C. elegans strain. The “mystery mutant” strain has a defect in cholinergic synaptic transmission. Students are well poised to experimentally test how the mutation impacts synaptic transmission. For example, students design experiments that address questions including: Does the affected gene influence acetylcholine neurotransmitter release? Does it inhibit postsynaptic cholinergic receptors? Students must apply their understanding of the synapse while using their recently acquired research skills (including aldicarb and levamisole assays) to successfully design, execute and analyze their experiments. Students prepare an experimental plan and a timeline for proposed experiments. Undergraduates work collaboratively in pairs and share their research findings in oral and written formats. Modifications to suit instructor-specific goals and courses with limited or no lab time are provided. Students have anonymously reported their surprise regarding how much can be learned from a worm and feelings of satisfaction from conducting research experiments of their own design. PMID:27980470
Lopes, Sidnei Antônio; Paulino, Mário Fonseca; Detmann, Edenio; Valente, Ériton Egídio Lisboa; de Barros, Lívia Vieira; Rennó, Luciana Navajas; de Campos Valadares Filho, Sebastião; Martins, Leandro Soares
2016-08-01
The aim of this study was to evaluate the effects of beef calves' supplementation in creep feeding systems on milk yield, body weight (BW), and body condition score (BCS) of their dams on tropical pastures using a meta-analytical approach. The database was obtained from 11 experiments conducted between 2009 and 2014 in Brazil, totaling 485 observations (cows). The database consisted of 273 Nellore and 212 crossbred (7/8 Nellore × 1/8 Holstein) cows. All experiments were carried out in the suckling phase (from 3 to 8 months of age of calves) during the transition phase between rainy and dry seasons from February to June of different years. The data were analyzed by a meta-analytical approach using mixed models and taking into account random variation among experiments. Calves' supplementation (P ≥ 0.59) and the calves' sex (P ≥ 0.48) did not affect milk yield of cows. The average fat-corrected milk (FCM) yield was 6.71 and 6.83 kg/day for cows that had their calves supplemented and not supplemented, respectively. Differences were observed (P < 0.0001) for milk yield due to the genetic group where crossbred cows presented greater FCM yield (7.37 kg/day) compared with Nellore cows (6.17 kg/day). There was no effect of the calves' supplementation on BW change (P ≥ 0.11) and BCS change (P ≥ 0.23) of the cows. Therefore, it is concluded that supplementation of beef calves using creep feeding systems in tropical pastures does not affect milk yield, body weight, or body condition of their dams.
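The idea behind pooling results across the 11 experiments can be illustrated with a minimal sketch. This is not the authors' mixed-model analysis; the per-experiment effects and variances below are hypothetical, and only simple inverse-variance (fixed-effect) pooling is shown, which weights each study by its precision.

```python
# Hedged sketch: inverse-variance pooling of per-experiment treatment
# effects (made-up numbers; the paper fits mixed models with random
# experiment effects, which this simpler pooling only approximates).
effects = [0.12, -0.05, 0.02]    # per-experiment mean differences (hypothetical)
variances = [0.04, 0.02, 0.05]   # their sampling variances (hypothetical)

weights = [1.0 / v for v in variances]          # precision weights
pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
```

More precise studies (smaller variance) pull the pooled estimate toward their own effect, which is the basic mechanism any meta-analytic model builds on.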
Holmes, William J; Darby, Richard AJ; Wilks, Martin DB; Smith, Rodney; Bill, Roslyn M
2009-01-01
Background: The optimisation and scale-up of process conditions leading to high yields of recombinant proteins is an enduring bottleneck in the post-genomic sciences. Typical experiments rely on varying selected parameters through repeated rounds of trial-and-error optimisation. To rationalise this, several groups have recently adopted the 'design of experiments' (DoE) approach frequently used in industry. Studies have focused on parameters such as medium composition, nutrient feed rates and induction of expression in shake flasks or bioreactors, as well as oxygen transfer rates in micro-well plates. In this study we wanted to generate a predictive model that described small-scale screens and to test its scalability to bioreactors. Results: Here we demonstrate how the use of a DoE approach in a multi-well mini-bioreactor permitted the rapid establishment of high yielding production phase conditions that could be transferred to a 7 L bioreactor. Using green fluorescent protein secreted from Pichia pastoris, we derived a predictive model of protein yield as a function of the three most commonly-varied process parameters: temperature, pH and the percentage of dissolved oxygen in the culture medium. Importantly, when yield was normalised to culture volume and density, the model was scalable from mL to L working volumes. By increasing pre-induction biomass accumulation, model-predicted yields were further improved. Yield improvement was most significant, however, on varying the fed-batch induction regime to minimise methanol accumulation so that the productivity of the culture increased throughout the whole induction period. These findings suggest the importance of matching the rate of protein production with the host metabolism. Conclusion: We demonstrate how a rational, stepwise approach to recombinant protein production screens can reduce process development time. PMID:19570229
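The kind of factorial screen underlying a DoE study of temperature, pH and dissolved oxygen can be sketched in miniature. The design and yield values below are illustrative assumptions, not the paper's data: a two-level full factorial is shown, from which each factor's main effect is the difference between mean responses at its high and low levels.

```python
from itertools import product

# Hypothetical 2^3 full-factorial screen: coded levels -1/+1 for
# temperature, pH and dissolved oxygen (factor names and yields
# are illustrative, not the paper's actual settings or results).
design = list(product([-1, 1], repeat=3))   # 8 runs
yields = [52, 60, 55, 64, 50, 59, 54, 66]   # one measured yield per run

def main_effect(j):
    """Mean response at factor j's +1 level minus mean at its -1 level."""
    hi = [y for run, y in zip(design, yields) if run[j] == 1]
    lo = [y for run, y in zip(design, yields) if run[j] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = [main_effect(j) for j in range(3)]
```

With these made-up numbers the third factor dominates, which is the sort of conclusion a screening design is built to reach in a handful of runs; a response-surface model would then refine the important factors.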
Elmiger, Marco P; Poetzsch, Michael; Steuer, Andrea E; Kraemer, Thomas
2018-03-06
High resolution mass spectrometry and modern data independent acquisition (DIA) methods enable the creation of general unknown screening (GUS) procedures. However, even when DIA is used, its potential is far from being exploited, because often, the untargeted acquisition is followed by a targeted search. Applying an actual GUS (including untargeted screening) produces an immense amount of data that must be dealt with. An optimization of the parameters regulating the feature detection and hit generation algorithms of the data processing software could significantly reduce the amount of unnecessary data and thereby the workload. Design of experiment (DoE) approaches allow a simultaneous optimization of multiple parameters. In a first step, parameters are evaluated (crucial or noncrucial). Second, crucial parameters are optimized. The aim in this study was to reduce the number of hits, without missing analytes. The obtained parameter settings from the optimization were compared to the standard settings by analyzing a test set of blood samples spiked with 22 relevant analytes as well as 62 authentic forensic cases. The optimization led to a marked reduction of workload (12.3 to 1.1% and 3.8 to 1.1% hits for the test set and the authentic cases, respectively) while simultaneously increasing the identification rate (68.2 to 86.4% and 68.8 to 88.1%, respectively). This proof-of-concept study emphasizes the great potential of DoE approaches to master the data overload resulting from modern data independent acquisition methods used for general unknown screening procedures by optimizing software parameters.
34 CFR 644.22 - How does the Secretary evaluate prior experience?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary evaluate prior experience? 644.22 Section 644.22 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION EDUCATIONAL OPPORTUNITY CENTERS How Does the...
Design of experiments (DOE) - history, concepts, and relevance to in vitro culture
USDA-ARS?s Scientific Manuscript database
Design of experiments (DOE) is a large and well-developed field for understanding and improving the performance of complex systems. Because in vitro culture systems are complex, but easily manipulated in controlled conditions, they are particularly well-suited for the application of DOE principle...
Muscat Galea, Charlene; Didion, David; Clicq, David; Mangelings, Debby; Vander Heyden, Yvan
2017-12-01
A supercritical chromatographic method for the separation of a drug and its impurities has been developed and optimized by applying an experimental design approach and chromatogram simulations. Stationary phase screening was followed by optimization of the modifier and injection solvent composition. A design-of-experiment (DoE) approach was then used to optimize column temperature, back-pressure and the gradient slope simultaneously. Regression models for the retention times and peak widths of all mixture components were built. The factor levels for different grid points were then used to predict the retention times and peak widths of the mixture components using the regression models, and the best separation for the worst separated peak pair in the experimental domain was identified. A plot of the minimal resolutions was used to help identify the factor levels leading to the highest resolution between consecutive peaks. The effects of the DoE factors were visualized in a way that is familiar to the analytical chemist, i.e. by simulating the resulting chromatogram. The mixture of an active ingredient and seven impurities was separated in less than eight minutes. The approach discussed in this paper demonstrates how SFC methods can be developed and optimized efficiently using simple concepts and tools. Copyright © 2017 Elsevier B.V. All rights reserved.
Al Qaroot, Bashar S; Sobuh, Mohammad
2016-06-01
Problem-based learning (where students seek out knowledge themselves rather than being fed it) has long been regarded as an ideal teaching approach because it encourages students to acquire knowledge from an undetermined medium of wrong and right answers. However, the effect of such an approach on the learning experience of prosthetics and orthotics students has never been investigated. This study explores the implications of integrating problem-based learning into teaching on the students' learning experience via implementing a research-informed clinical practice module into the curriculum of final-year prosthetics and orthotics undergraduate students at the University of Jordan (Amman, Jordan). Qualitative research pilot study. A grounded theory approach was used based on the data collected from interviewing a focus group of four students. Students identified a number of arguments from their experience in the research-informed clinical practice module; generally speaking, they described research-informed clinical practice as a very good method of education. Integrating problem-based learning into teaching has many positive implications. In particular, students pointed out that their learning experience and clinical practice improved considerably after the research-informed clinical practice. Findings from this investigation demonstrate that embedding problem-based learning into the prosthetics and orthotics curriculum has the potential to enhance students' learning experience, particularly students' evidence-based practice. This may lead to graduates who are more knowledgeable and thus able to offer optimal patient care (i.e. clinical practice). © The International Society for Prosthetics and Orthotics 2014.
Does syntax contribute to the function of duets in a parrot, Amazona auropalliata?
Dahlin, Christine R; Wright, Timothy F
2012-07-01
Complex acoustic signals in many animal species are characterized by a syntax that governs how different notes are combined, but the importance of syntax to the communicative function of signals is not well understood. Mated pairs of yellow-naped amazons, Amazona auropalliata, produce coordinated vocal duets that are used for territory maintenance and defense. These duets follow rules that specify the ordering of notes within duets, such as a strict alternation of sex-specific notes and a defined progression of note types through each duet. These syntactical rules may function to define sex-specific roles, improve coordination, and allow individuals to combine calls into meaningful sequences. As a first step toward understanding the functional significance of syntax, we conducted two separate audio playback experiments in which we presented nesting pairs with normal duets and duets with broken syntax (i.e., one of the syntactic rules was broken). In Experiment One, we reversed the order of female and male notes within note pairs while retaining the typical progression of note types through a duet. In Experiment Two we reversed the order of note types across a whole duet while retaining the typical female-male ordering within note pairs. We hypothesized that duets with broken syntax would be less-effective signals than duets with normal syntax and predicted that pairs would respond less to broken syntax than to normal duets. Contrary to predictions, we did not observe differences in response between treatments for any variables except latency to approach the speaker. After we combined data across experiments post hoc, we observed longer latencies to approach the speakers after playbacks of broken syntax duets, suggesting that pairs could differentiate between playbacks. These responses suggest that breaking one rule of duet syntax at a time does not result in detectable loss of signal efficacy in the context of territorial intrusions.
Andri, Bertyl; Dispas, Amandine; Marini, Roland Djang'Eing'a; Hubert, Philippe; Sassiat, Patrick; Al Bakain, Ramia; Thiébaut, Didier; Vial, Jérôme
2017-03-31
This work presents a first attempt to establish a model of the retention behaviour of pharmaceutical compounds in gradient mode SFC. For this purpose, multivariate statistics were applied on the basis of data gathered with the Design of Experiments (DoE) methodology. This methodology permitted optimal construction of the experiments needed and served as a basis for relevant physicochemical interpretation of the observed effects. Data gathered over a broad experimental domain enabled the establishment of well-fit linear models of the retention of the individual compounds in the presence of methanol as co-solvent. These models also allowed the appreciation of the impact of each experimental parameter and their factorial combinations. This approach was carried out with two organic modifiers (i.e. methanol and ethanol) and provided comparable results. Therefore, it demonstrates the feasibility of modelling retention in gradient mode SFC for individual compounds as a function of the experimental conditions. This approach also highlighted the predominant effect of some parameters (e.g. gradient slope and pressure) on the retention of compounds. Because individual models of retention could be built, the next step considered the establishment of a global model of retention to predict the behaviour of given compounds on the basis of both the physicochemical descriptors of the compounds (e.g. Linear Solvation Energy Relationship (LSER) descriptors) and the experimental conditions. This global model was established by means of partial least squares regression for the selected compounds, in an experimental domain defined by the Design of Experiments (DoE) methodology. Assessment of the model's predictive capabilities revealed satisfactory agreement between predicted and actual retention (i.e. R² = 0.942, slope = 1.004) of the assessed compounds, which is unprecedented in the field. Copyright © 2017 Elsevier B.V. All rights reserved.
All-optical atom trap as a target for MOTRIMS-like collision experiments
NASA Astrophysics Data System (ADS)
Sharma, S.; Acharya, B. P.; De Silva, A. H. N. C.; Parris, N. W.; Ramsey, B. J.; Romans, K. L.; Dorn, A.; de Jesus, V. L. B.; Fischer, D.
2018-04-01
Momentum-resolved scattering experiments with laser-cooled atomic targets have been performed for almost two decades with magneto-optical trap recoil ion momentum spectroscopy (MOTRIMS) setups. Compared to experiments with gas-jet targets, MOTRIMS features significantly lower target temperatures allowing for an excellent recoil ion momentum resolution. However, the coincident and momentum-resolved detection of electrons was long rendered impossible due to incompatible magnetic field requirements. Here we report on an experimental approach which is based on an all-optical ⁶Li atom trap that, in contrast to magneto-optical traps, does not require magnetic field gradients in the trapping region. Atom temperatures of about 2 mK and number densities up to 10⁹ cm⁻³ make this trap ideally suited for momentum-resolved electron-ion coincidence experiments. The overall configuration of the trap is very similar to that of conventional magneto-optical traps. It mainly requires small modifications of laser beam geometries and polarization, which makes it easily implementable in other existing MOTRIMS experiments.
Energy Efficient Approach in RFID Network
NASA Astrophysics Data System (ADS)
Mahdin, Hairulnizam; Abawajy, Jemal; Salwani Yaacob, Siti
2016-11-01
Radio Frequency Identification (RFID) technology is among the key technologies of the Internet of Things (IoT). It is a sensing technology that can monitor, identify, locate and track physical objects via their tags. Energy in RFID systems is commonly used unwisely because readers repeatedly read the same tag for as long as it resides in the reader's vicinity. Repeated readings are unnecessary because they only generate duplicate data that contain no new information. The reading process needs to be scheduled accordingly to minimize the chances of repeated readings and save energy. This reduces operational cost and can prolong the lifetime of tag batteries, which cannot be replaced. In this paper, we propose an approach named SELECT to minimize the energy spent during reading processes. Experiments show that the proposed algorithm contributes significant energy savings in RFID compared to other approaches.
A robust quantitative near infrared modeling approach for blend monitoring.
Mohan, Shikhar; Momose, Wataru; Katz, Jeffrey M; Hossain, Md Nayeem; Velez, Natasha; Drennen, James K; Anderson, Carl A
2018-01-30
This study demonstrates a material sparing Near-Infrared modeling approach for powder blend monitoring. In this new approach, gram scale powder mixtures are subjected to compression loads to simulate the effect of scale using an Instron universal testing system. Models prepared by the new method development approach (small-scale method) and by a traditional method development (blender-scale method) were compared by simultaneously monitoring a 1kg batch size blend run. Both models demonstrated similar model performance. The small-scale method strategy significantly reduces the total resources expended to develop Near-Infrared calibration models for on-line blend monitoring. Further, this development approach does not require the actual equipment (i.e., blender) to which the method will be applied, only a similar optical interface. Thus, a robust on-line blend monitoring method can be fully developed before any large-scale blending experiment is viable, allowing the blend method to be used during scale-up and blend development trials. Copyright © 2017. Published by Elsevier B.V.
Galaj, E; Shukur, A; Manuszak, M; Newman, K; Ranaldi, R
2017-05-01
Environmental enrichment (EE) produces differential effects on psychostimulant-related behaviors. Therefore, we investigated whether the timing of EE exposure (during rearing and before cocaine exposure versus in adulthood and after cocaine exposure) might be a determining factor. In Experiment 1, rats reared with EE or not (non-EE) were conditioned with cocaine (5, 10 or 20 mg/kg) in one compartment of a CPP apparatus and saline in the other, and later tested for cocaine CPP. In Experiment 2, locomotor activity in response to repeated injections of saline or cocaine was measured in rats raised with EE or non-EE. In Experiment 3 we measured the effects of EE or non-EE during rearing on food-based conditioned approach learning. In Experiment 4, rats were exposed to cocaine CPP conditioning then underwent 60 days of EE or non-EE treatment, after which they were tested for cocaine CPP. Our results show that rearing in EE did not reduce cocaine CPP or cocaine-induced locomotor activity (Experiments 1 and 2) but significantly facilitated conditioned approach learning (Experiment 3). On the other hand, EE treatment introduced after cocaine conditioning significantly reduced the expression of cocaine CPP (Experiment 4). These findings suggest that EE does not protect against cocaine's rewarding and stimulant effects but can reduce already established cocaine effects, suggesting that EE might be an effective treatment for cocaine addiction-related behaviors. Copyright © 2017 Elsevier Inc. All rights reserved.
Translating standards into practice - one Semantic Web API for Gene Expression.
Deus, Helena F; Prud'hommeaux, Eric; Miller, Michael; Zhao, Jun; Malone, James; Adamusiak, Tomasz; McCusker, Jim; Das, Sudeshna; Rocca Serra, Philippe; Fox, Ronan; Marshall, M Scott
2012-08-01
Sharing and describing experimental results unambiguously with sufficient detail to enable replication of results is a fundamental tenet of scientific research. In today's cluttered world of "-omics" sciences, data standards and standardized use of terminologies and ontologies for biomedical informatics play an important role in reporting high-throughput experiment results in formats that can be interpreted by both researchers and analytical tools. Increasing adoption of Semantic Web and Linked Data technologies for the integration of heterogeneous and distributed health care and life sciences (HCLS) datasets has made the reuse of standards even more pressing; dynamic semantic query federation can be used for integrative bioinformatics when ontologies and identifiers are reused across data instances. We present here a methodology to integrate the results and experimental context of three different representations of microarray-based transcriptomic experiments: the Gene Expression Atlas, the W3C BioRDF task force approach to reporting Provenance of Microarray Experiments, and the HSCI blood genomics project. Our approach does not attempt to improve the expressivity of existing standards for genomics but, instead, to enable integration of existing datasets published from microarray-based transcriptomic experiments. SPARQL Construct is used to create a posteriori mappings of concepts and properties and linking rules that match entities based on query constraints. We discuss how our integrative approach can encourage reuse of the Experimental Factor Ontology (EFO) and the Ontology for Biomedical Investigations (OBI) for the reporting of experimental context and results of gene expression studies. Copyright © 2012 Elsevier Inc. All rights reserved.
A statistical approach to selecting and confirming validation targets in -omics experiments
2012-01-01
Background: Genomic technologies are, by their very nature, designed for hypothesis generation. In some cases, the hypotheses that are generated require that genome scientists confirm findings about specific genes or proteins. But one major advantage of high-throughput technology is that global genetic, genomic, transcriptomic, and proteomic behaviors can be observed. Manual confirmation of every statistically significant genomic result is prohibitively expensive. This has led researchers in genomics to adopt the strategy of confirming only a handful of the most statistically significant results, a small subset chosen for biological interest, or a small random subset. But there is no standard approach for selecting and quantitatively evaluating validation targets. Results: Here we present a new statistical method and approach for statistically validating lists of significant results based on confirming only a small random sample. We apply our statistical method to show that the usual practice of confirming only the most statistically significant results does not statistically validate result lists. We analyze an extensively validated RNA-sequencing experiment to show that confirming a random subset can statistically validate entire lists of significant results. Finally, we analyze multiple publicly available microarray experiments to show that statistically validating random samples can both (i) provide evidence to confirm long gene lists and (ii) save thousands of dollars and hundreds of hours of labor over manual validation of each significant result. Conclusions: For high-throughput -omics studies, statistical validation is a cost-effective and statistically valid approach to confirming lists of significant results. PMID:22738145
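The core idea of confirming a random subset can be sketched as follows. This is a simplified illustration under assumed numbers, not the paper's actual statistical method: a normal-approximation confidence interval on the confirmation rate observed in a small random sample bounds the confirmation rate of the full result list.

```python
import math

# Hedged sketch: manually confirm a random sample of `sampled` hits
# from a long significant-result list; `confirmed` of them replicate.
# A Wald-style interval then bounds the list-wide confirmation rate.
# (The paper's method may differ; numbers here are hypothetical.)
def confirmation_interval(confirmed, sampled, z=1.96):
    p = confirmed / sampled
    half = z * math.sqrt(p * (1 - p) / sampled)  # normal-approx half-width
    return max(0.0, p - half), min(1.0, p + half)

lo, hi = confirmation_interval(confirmed=18, sampled=20)
```

With 18 of 20 randomly sampled hits confirmed, the interval already supports a high confirmation rate for the whole list, which is why a random sample can validate results that confirming only the top hits cannot.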
An Observing System Simulation Experiment Approach to Meteorological Network Assessment
NASA Astrophysics Data System (ADS)
Abbasnezhadi, K.; Rasmussen, P. F.; Stadnyk, T.; Boluwade, A.
2016-12-01
Proper knowledge of the spatiotemporal distribution of rainfall is important for a mindful investigation of water movement and storage throughout a catchment. Currently, the most accurate precipitation information available for the remote boreal ecozones of northern Manitoba comes from the Canadian Precipitation Analysis (CaPA) data assimilation system. Throughout the Churchill River Basin (CRB), CaPA still lacks adequate skill due to the limited number of weather stations. A new approach to experimental network design was investigated based on the concept of an Observing System Simulation Experiment (OSSE). The OSSE-based network assessment procedure, which simulates the CaPA system, provides a scientific and hydrologically meaningful tool to assess the sensitivity of the CaPA precipitation analysis to observation network density throughout the CRB. To simulate the CaPA system, synthetic background and station data were generated by adding spatially uncorrelated and correlated Gaussian noise, respectively, to an assumed-true daily weather field synthesized by a gridded precipitation generator that emulates CaPA data. Given the true reference field on one hand, and a set of pseudo-CaPA analyses associated with different network realizations on the other, the WATFLOOD hydrological model was employed to compare the modeled runoff. The simulations showed that as network density increases, the accuracy of CaPA precipitation products improves up to a certain limit, beyond which adding more stations to the network does not yield further accuracy.
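A toy version of the OSSE workflow described above might look like the sketch below. It is not the CaPA/WATFLOOD system: a known synthetic "truth" field is perturbed with spatially uncorrelated Gaussian noise to create pseudo-station observations for networks of different densities, which can then be scored against the truth (correlated background noise and the hydrological model are omitted).

```python
import random, math

# Illustrative OSSE-style sketch (hypothetical field and noise level):
# perturb a known "truth" precipitation field to create pseudo-station
# observations, then score each network realization by RMSE vs. truth.
random.seed(42)
truth = [5.0 + 0.1 * i for i in range(100)]  # synthetic 1-D daily field

def pseudo_obs(field, station_idx, sigma=0.5):
    """Pseudo-observations: truth plus uncorrelated Gaussian noise."""
    return {i: field[i] + random.gauss(0.0, sigma) for i in station_idx}

def rmse(field, obs):
    err = [(field[i] - v) ** 2 for i, v in obs.items()]
    return math.sqrt(sum(err) / len(err))

sparse = pseudo_obs(truth, range(0, 100, 20))  # 5-station network
dense = pseudo_obs(truth, range(0, 100, 5))    # 20-station network
```

Repeating this over many noise realizations and network densities, and feeding each pseudo-analysis through a hydrological model, is the essence of the OSSE assessment: the point at which denser networks stop improving the score identifies the useful network size.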
Optimization of laser butt welding parameters with multiple performance characteristics
NASA Astrophysics Data System (ADS)
Sathiya, P.; Abdul Jaleel, M. Y.; Katherasan, D.; Shanmugarajan, B.
2011-04-01
This paper presents a study carried out on 3.5 kW cooled slab laser welding of 904 L super austenitic stainless steel. Butt joints were welded with different shielding gases, namely argon, helium and nitrogen, at a constant flow rate. Super austenitic stainless steel (SASS) normally contains high amounts of Mo, Cr, Ni, N and Mn. The mechanical properties are controlled to obtain good welded joints. The quality of the joint is evaluated by studying the features of weld bead geometry, such as bead width (BW) and depth of penetration (DOP). In this paper, the tensile strength and bead profiles (BW and DOP) of laser-welded butt joints made of AISI 904 L SASS are investigated. The Taguchi approach is used as a statistical design of experiments (DOE) technique for optimizing the selected welding parameters. Grey relational analysis and the desirability approach are applied to optimize the input parameters by considering multiple output variables simultaneously. Confirmation experiments have also been conducted for both analyses to validate the optimized parameters.
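Grey relational analysis, used above to fold multiple weld responses into a single grade per run, can be sketched as follows. The run labels and response values are made up, not the paper's data; both responses are treated as larger-is-better, so after normalization to [0, 1] the minimum and maximum deviations from the ideal are 0 and 1.

```python
# Hedged sketch of grey relational analysis (illustrative values):
# each run has two larger-is-better responses, e.g. tensile strength
# and depth of penetration.
runs = {
    "A": (520.0, 3.2),
    "B": (560.0, 2.8),
    "C": (540.0, 3.6),
}

def normalize(vals):
    """Larger-is-better normalization of one response column to [0, 1]."""
    lo, hi = min(vals), max(vals)
    return [(v - lo) / (hi - lo) for v in vals]

cols = list(zip(*runs.values()))
norm = list(zip(*[normalize(c) for c in cols]))  # normalized rows per run

def grade(row, zeta=0.5):
    """Grey relational grade: mean of coefficients (0 + zeta*1)/(delta + zeta*1),
    using delta_min = 0 and delta_max = 1, valid after [0, 1] normalization."""
    deltas = [1.0 - x for x in row]  # deviation from the ideal value 1.0
    coeffs = [(zeta * 1.0) / (d + zeta * 1.0) for d in deltas]
    return sum(coeffs) / len(coeffs)

grades = {run: grade(row) for run, row in zip(runs, norm)}
best = max(grades, key=grades.get)
```

The run with the highest grade wins the multi-response trade-off; in a Taguchi study the grades would then be averaged per factor level to pick the optimal parameter setting.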
Personalized recommendation based on heat bidirectional transfer
NASA Astrophysics Data System (ADS)
Ma, Wenping; Feng, Xiang; Wang, Shanfeng; Gong, Maoguo
2016-02-01
Personalized recommendation has become an increasingly popular research topic, which aims to predict future likes and interests based on users' past preferences. Traditional recommendation algorithms pay more attention to forecast accuracy by calculating first-order relevance, while ignoring the importance of diversity and novelty, which provide comfortable experiences for customers. There are some levels of contradiction between these three metrics, so an algorithm based on bidirectional transfer is proposed in this paper to resolve this dilemma. In this paper, we hold that an object that is associated with history records or has been purchased by similar users should be introduced to the specified user, and a recommendation approach based on heat bidirectional transfer is proposed. Compared with state-of-the-art approaches based on bipartite networks, experiments on two benchmark data sets, Movielens and Netflix, demonstrate that our algorithm performs better on accuracy, diversity and novelty. Moreover, this method does better at exploiting long-tail commodities and addressing the cold-start problem.
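A heat-style diffusion on a user-item bipartite network, the general family to which this paper's bidirectional-transfer algorithm belongs, can be sketched in a few lines. The tiny ratings data and the simple three-step item→user→item averaging below are illustrative assumptions, not the authors' exact method.

```python
# Hedged sketch of heat diffusion on a toy user-item bipartite network
# (a HeatS-style transfer; the paper's bidirectional variant differs).
ratings = {  # user -> set of liked items (hypothetical data)
    "u1": {"a", "b"},
    "u2": {"b", "c"},
    "u3": {"c"},
}
items = sorted({i for s in ratings.values() for i in s})
k_item = {i: sum(i in s for s in ratings.values()) for i in items}  # item degrees

def heat_score(target_user):
    # 1) items liked by the target user hold one unit of "heat";
    # 2) each user receives the mean heat of the items they like;
    # 3) each item receives the mean heat of the users who like it.
    item_heat = {i: 1.0 if i in ratings[target_user] else 0.0 for i in items}
    user_heat = {u: sum(item_heat[i] for i in s) / len(s)
                 for u, s in ratings.items()}
    return {i: sum(user_heat[u] for u, s in ratings.items() if i in s) / k_item[i]
            for i in items}

scores = heat_score("u1")
```

Items the target user has not touched (here "c") receive nonzero scores through shared neighbors, and the degree normalization in step 3 is what tilts heat-based methods toward novel, long-tail items compared with popularity-weighted mass diffusion.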
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2010-06-01
The U.S. Department of Energy (DOE) Office of Legacy Management developed this report as a guide for discussions with the Colorado State regulators and other interested stakeholders in response to increased drilling for natural gas reserves near the underground nuclear explosion site at Rulison, Colorado. The Rulison site is located in the Piceance Basin of western Colorado, 40 miles northeast of Grand Junction. The Rulison test was the second natural gas reservoir stimulation experiment in the Plowshare Program, which was designed to develop peaceful uses for nuclear energy. On September 10, 1969, the U.S. Atomic Energy Commission, a predecessor agency of DOE, detonated a 40-kiloton nuclear device 8426 feet below the ground surface in an attempt to release commercially marketable quantities of natural gas. The blast vaporized surrounding rock and formed a cavity about 150 feet in diameter. Although the contaminated materials from drilling operations were subsequently removed from the surface of the blast site, no feasible technology exists to remove subsurface radioactive contamination in or around the test cavity. An increase in drilling for natural gas near the site has raised concern about the possibility of encountering residual radioactivity from the area of the detonation. DOE prohibits drilling in the 40-acre lot surrounding the blast site at a depth below 6000 feet. DOE has no evidence that indicates contamination from the Rulison site detonation has migrated or will ever migrate beyond the 40-acre institutional control boundary. The Colorado Oil and Gas Conservation Commission (COGCC) established two wider boundaries around the site. When a company applies for a permit to drill within a 3-mile radius of surface ground zero, COGCC notifies DOE and provides an opportunity to comment on the application. COGCC also established a half-mile radius around surface ground zero.
An application to drill within one-half mile requires a full hearing before the commission. This report outlines DOE's recommendation that gas developers adopt a conservative, staged drilling approach allowing gas reserves near the Rulison site to be recovered in a manner that minimizes the likelihood of encountering contamination. This staged approach calls for collecting data from wells outside the half-mile zone before drilling closer, and then drilling within the half-mile zone in a sequential manner, first at low contamination probability locations and then moving inward. DOE's recommended approach for drilling in this area will protect public safety while allowing collection of additional data to confirm that contamination is contained within the 40-acre institutional control boundary.
Decontamination & decommissioning focus area
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-08-01
In January 1994, the US Department of Energy Office of Environmental Management (DOE EM) formally introduced its new approach to managing DOE's environmental research and technology development activities. The goal of the new approach is to conduct research and development in critical areas of interest to DOE, utilizing the best talent in the Department and in the national science community. To facilitate this solutions-oriented approach, the Office of Science and Technology (EM-50, formerly the Office of Technology Development) formed five Focus Areas to stimulate the required basic research, development, and demonstration efforts to seek new, innovative cleanup methods. In February 1995, EM-50 selected the DOE Morgantown Energy Technology Center (METC) to lead implementation of one of these Focus Areas: the Decontamination and Decommissioning (D & D) Focus Area.
Ameringer, Suzanne; Erickson, Jeanne M; Macpherson, Catherine Fiona; Stegenga, Kristin; Linder, Lauri A
2015-12-01
Adolescents and young adults (AYAs) with cancer experience multiple distressing symptoms during treatment. Because the typical approach to symptom assessment does not easily reflect the symptom experience of individuals, alternative approaches to enhancing communication between the patient and provider are needed. We developed an iPad-based application that uses a heuristic approach to explore AYAs' cancer symptom experiences. In this mixed-methods descriptive study, 72 AYAs (13-29 years old) with cancer receiving myelosuppressive chemotherapy used the Computerized Symptom Capture Tool (C-SCAT) to create images of the symptoms and symptom clusters they experienced from a list of 30 symptoms. They answered open-ended questions within the C-SCAT about the causes of their symptoms and symptom clusters. The images generated through the C-SCAT and accompanying free-text data were analyzed using descriptive, content, and visual analyses. Most participants (n = 70) reported multiple symptoms (M = 8.14). The most frequently reported symptoms were nausea (65.3%), feeling drowsy (55.6%), lack of appetite (55.6%), and lack of energy (55.6%). Forty-six grouped their symptoms into one or more clusters. The most common symptom cluster was nausea/eating problems/appetite problems. Nausea was most frequently named as the priority symptom in a cluster and as a cause of other symptoms. Although common threads were present in the symptoms experienced by AYAs, the graphic images revealed unique perspectives and a range of complexity of symptom relationships, clusters, and causes. Results highlight the need for a tailored approach to symptom management based on how the AYA with cancer perceives his or her symptom experience. © 2015 Wiley Periodicals, Inc.
Toward the optimization of normalized graph Laplacian.
Xie, Bo; Wang, Meng; Tao, Dacheng
2011-04-01
Normalized graph Laplacian has been widely used in many practical machine learning algorithms, e.g., spectral clustering and semisupervised learning. However, all of them use the Euclidean distance to construct the graph Laplacian, which does not necessarily reflect the inherent distribution of the data. In this brief, we propose a method to directly optimize the normalized graph Laplacian by using pairwise constraints. The learned graph is consistent with equivalence and nonequivalence pairwise relationships, and thus it can better represent similarity between samples. Meanwhile, our approach, unlike metric learning, automatically determines the scale factor during the optimization. The learned normalized Laplacian matrix can be directly applied in spectral clustering and semisupervised learning algorithms. Comprehensive experiments demonstrate the effectiveness of the proposed approach.
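The object this brief optimizes, the normalized graph Laplacian, can be illustrated with a minimal sketch. The constraint-based optimization itself is not reproduced here; the function below only builds the standard symmetric normalization the method starts from, from a plain weighted adjacency matrix:

```python
import math

def normalized_laplacian(W):
    """Symmetric normalized graph Laplacian: L = I - D^(-1/2) W D^(-1/2),
    where W is a weighted adjacency matrix (list of lists) and D its
    diagonal degree matrix."""
    n = len(W)
    deg = [sum(row) for row in W]
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            identity = 1.0 if i == j else 0.0
            if deg[i] > 0 and deg[j] > 0:
                L[i][j] = identity - W[i][j] / math.sqrt(deg[i] * deg[j])
            else:
                L[i][j] = identity  # isolated node: row stays an identity row
    return L
```

For two connected nodes the result is [[1, -1], [-1, 1]]; spectral clustering and the semisupervised methods cited in the abstract consume the eigenvectors of this matrix.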
Social Networking Sites' Influence on Travelers' Authentic Experience: A Case Study of CouchSurfing
ERIC Educational Resources Information Center
Liu, Xiao
2013-01-01
This study explored travelers' experiences in the era of network hospitality 2.0 using CouchSurfing.org as a case study. The following research questions guided this study: 1) what experience does CouchSurfing create for travelers before, during and after their travel? 2) how does couch surfers' experience relate to authenticity in context of…
Prior experiences associated with residents' scores on a communication and interpersonal skill OSCE.
Yudkowsky, Rachel; Downing, Steven M; Ommert, Dennis
2006-09-01
This exploratory study investigated whether prior task experience and comfort correlate with scores on an assessment of patient-centered communication. A six-station standardized patient exam assessed patient-centered communication of 79 PGY2-3 residents in Internal Medicine and Family Medicine. A survey provided information on prior experiences. t-tests, correlations, and multi-factorial ANOVA explored relationships between scores and experiences. Experience with a task predicted comfort but did not predict communication scores. Comfort was moderately correlated with communication scores for some tasks; residents who were less comfortable were indeed less skilled, but greater comfort did not predict higher scores. Female gender and medical school experiences with standardized patients, along with training in patient-centered interviewing, were associated with higher scores. Residents without standardized patient experiences in medical school were almost five times more likely to be rejected by patients. Task experience alone does not guarantee better communication, and may instill a false sense of confidence. Experiences with standardized patients during medical school, especially in combination with interviewing courses, may provide an element of "deliberate practice" and have a long-term impact on communication skills. The combination of didactic courses and practice with standardized patients may promote a patient-centered approach.
Buyel, Johannes Felix; Fischer, Rainer
2014-01-31
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Changes in the representation of space and time while listening to music
Schäfer, Thomas; Fachner, Jörg; Smukalla, Mario
2013-01-01
Music is known to alter people's ordinary experience of space and time. Not only does this challenge the concept of invariant space and time tacitly assumed in psychology but it may also help us understand how music works and how music can be understood as an embodied experience. Yet research about these alterations is in its infancy. This review is intended to delineate a future research agenda. We review experimental evidence and subjective reports of the influence of music on the representation of space and time and present prominent approaches to explaining these effects. We discuss the role of absorption and altered states of consciousness and their associated changes in attention and neurophysiological processes, as well as prominent models of human time processing and time experience. After integrating the reviewed research, we conclude that research on the influence of music on the representation of space and time is still quite inconclusive but that integrating the different approaches could lead to a better understanding of the observed effects. We also provide a working model that integrates a large part of the evidence and theories. Several suggestions for further research in both music psychology and cognitive psychology are outlined. PMID:23964254
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohanty, Subhasish; Soppet, William K.; Majumdar, Saurindranath
Argonne National Laboratory (ANL), under the sponsorship of Department of Energy's Light Water Reactor Sustainability (LWRS) program, is trying to develop a mechanistic approach for more accurate life estimation of LWR components. In this context, ANL has conducted many fatigue experiments under different test and environment conditions on type 316 stainless steel (316SS) material which is widely used in the US reactors. Contrary to the conventional S~N curve based empirical fatigue life estimation approach, the aim of the present DOE sponsored work is to develop an understanding of the material ageing issues more mechanistically (e.g. time dependent hardening and softening) under different test and environmental conditions. Better mechanistic understanding will help develop computer-based advanced modeling tools to better extrapolate stress-strain evolution of reactor components under multi-axial stress states and hence help predict their fatigue life more accurately. In this paper (part-I) the fatigue experiments under different test and environment conditions and related stress-strain results for 316 SS are discussed. In a second paper (part-II) the related evolutionary cyclic plasticity material modeling techniques and results are discussed.
Bothersome tinnitus : Cognitive behavioral perspectives.
Cima, R F F
2018-05-01
Tinnitus is not traceable to a single disease or pathology, but is merely a symptom, which is distressing to some but not all individuals able to perceive it. The experience of tinnitus does not equate to tinnitus distress. Tinnitus suffering might be understood as a function of tinnitus-related distress in that bothersome tinnitus is an illness rather than a disease. In bothersome (distressing) tinnitus, the perception of the characteristic sound is a very disturbing and bothersome experience because of maladaptive psychological responses. Several cognitive and behavioral theoretical frameworks attempting to explain the nature and cause of tinnitus suffering have been introduced and are summarized here. Current treatment approaches are generally based on models that aim to: alleviate the perceptional experience by focusing on the tinnitus perception for habituation or even soothing purposes; decrease awareness of the sound by attentional training and cognitive interventions; decrease the maladaptive responses and the resulting distress by behavioral methods (i.e., exposure). The cognitive behavioral fear-avoidance model may offer an integrative cognitive behavioral approach that can lead to a new set of paradigms for studying the underlying mechanisms explaining chronic tinnitus suffering as well as for developing innovative strategies to treat bothersome tinnitus.
Wu, Jinlu
2013-01-01
Laboratory education can play a vital role in developing a learner's autonomy and scientific inquiry skills. In an innovative, mutation-based learning (MBL) approach, students were instructed to redesign a teacher-designed standard experimental protocol by a "mutation" method in a molecular genetics laboratory course. Students could choose to delete, add, reverse, or replace certain steps of the standard protocol to explore questions of interest to them in a given experimental scenario. They wrote experimental proposals to address their rationales and hypotheses for the "mutations"; conducted experiments in parallel, according to both standard and mutated protocols; and then compared and analyzed results to write individual lab reports. Various autonomy-supportive measures were provided in the entire experimental process. Analyses of student work and feedback suggest that students using the MBL approach 1) spend more time discussing experiments, 2) use more scientific inquiry skills, and 3) find the increased autonomy afforded by MBL more enjoyable than do students following regimented instructions in a conventional "cookbook"-style laboratory. Furthermore, the MBL approach does not incur an obvious increase in labor and financial costs, which makes it feasible for easy adaptation and implementation in a large class.
Accounting for Attribute-Level Non-Attendance in a Health Choice Experiment: Does it Matter?
Erdem, Seda; Campbell, Danny; Hole, Arne Risa
2015-07-01
An extensive literature has established that it is common for respondents to ignore attributes of the alternatives within choice experiments. In most of the studies on attribute non-attendance, it is assumed that respondents consciously (or unconsciously) ignore one or more attributes of the alternatives, regardless of their levels. In this paper, we present a new line of enquiry and approach for modelling non-attendance in the context of investigating preferences for health service innovations. This approach recognises that non-attendance may not just be associated with attributes but may also apply to the attribute's levels. Our results show that respondents process each level of an attribute differently: while attending to the attribute, they ignore a subset of the attribute's levels. In such cases, the usual approach of assuming that respondents either attend to the attribute or not, irrespective of its levels, is erroneous and could lead to misguided policy recommendations. Our results indicate that allowing for attribute-level non-attendance leads to substantial improvements in the model fit and has an impact on estimated marginal willingness to pay and choice predictions. Copyright © 2014 John Wiley & Sons, Ltd.
van Koperen, Tessa M; Renders, Carry M; Spierings, Eline J M; Hendriks, Anna-Marie; Westerman, Marjan J; Seidell, Jacob C; Schuit, Albertine J
2016-01-01
Background. Integrated community-wide intervention approaches (ICIAs) are implemented to prevent childhood obesity. Programme evaluation improves these ICIAs, but professionals involved often struggle with performance. Evaluation tools have been developed to support Dutch professionals involved in ICIAs. It is unclear how useful these tools are to intended users. We therefore researched the facilitators of and barriers to ICIA programme evaluation as perceived by professionals and their experiences of the evaluation tools. Methods. Focus groups and interviews with 33 public health professionals. Data were analysed using a thematic content approach. Findings. Evaluation is hampered by insufficient time, budget, and experience with ICIAs, lack of leadership, and limited advocacy for evaluation. Epidemiologists are regarded as responsible for evaluation but feel incompetent to perform evaluation or advocate its need in a political environment. Managers did not prioritise process evaluations, involvement of stakeholders, and capacity building. The evaluation tools are perceived as valuable but too comprehensive considering limited resources. Conclusion. Evaluating ICIAs is important but most professionals are unfamiliar with it and management does not prioritise process evaluation nor incentivize professionals to evaluate. To optimise programme evaluation, more resources and coaching are required to improve professionals' evaluation capabilities and specifically the use of evaluation.
ESTEST: A Framework for the Verification and Validation of Electronic Structure Codes
NASA Astrophysics Data System (ADS)
Yuan, Gary; Gygi, Francois
2011-03-01
ESTEST is a verification and validation (V&V) framework for electronic structure codes that supports Qbox, Quantum Espresso, ABINIT, the Exciting Code and plans support for many more. We discuss various approaches to the electronic structure V&V problem implemented in ESTEST, that are related to parsing, formats, data management, search, comparison and analyses. Additionally, an early experiment in the distribution of V&V ESTEST servers among the electronic structure community will be presented. Supported by NSF-OCI 0749217 and DOE FC02-06ER25777.
Menshikov, Ivan S; Shklover, Alexsandr V; Babkina, Tatiana S; Myagkov, Mikhail G
2017-01-01
In this research, the social behavior of the participants in a Prisoner's Dilemma laboratory game is explained on the basis of the quantal response equilibrium concept and the representation of the game in Markov strategies. In previous research, we demonstrated that social interaction during the experiment has a positive influence on cooperation, trust, and gratefulness. This research shows that the quantal response equilibrium concept agrees only with the results of experiments on cooperation in Prisoner's Dilemma prior to social interaction. However, quantal response equilibrium does not explain participants' behavior after social interaction. As an alternative theoretical approach, an examination was conducted of the iterated Prisoner's Dilemma game in Markov strategies. We built a totally mixed Nash equilibrium in this game; the equilibrium agrees with the results of the experiments both before and after social interaction.
Soccer Matches as Experiments - How Often Does the 'Best' Team Win?
NASA Technical Reports Server (NTRS)
Skinner, Gerald K.; Freeman, G. H.
2009-01-01
Models in which the number of goals scored by a team in a soccer match follows a Poisson distribution, or a closely related one, have been widely discussed. We here consider a soccer match as an experiment to assess which of two teams is superior and examine the probability that the outcome of the experiment (match) truly represents the relative abilities of the two teams. Given a final score it is possible by using a Bayesian approach to quantify the probability that it was or was not the case that the best team won. For typical scores, the probability of a misleading result is significant. Modifying the rules of the game to increase the typical number of goals scored would improve the situation, but a level of confidence that would normally be regarded as satisfactory could not be obtained unless the character of the game were radically changed.
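The kind of Bayesian calculation the abstract describes can be sketched as follows. This is not the authors' exact model: it assumes, for illustration, an arbitrary Gamma prior on each team's Poisson scoring rate and estimates by Monte Carlo the posterior probability that the winning side's underlying rate really is higher:

```python
import random

def p_superior(goals_a, goals_b, shape=1.0, rate=0.5, n=200_000, seed=1):
    """Estimate P(lambda_A > lambda_B | final score) under independent
    Poisson scoring with a Gamma(shape, rate) prior on each team's rate.
    Conjugacy gives the posterior Gamma(shape + goals, rate + 1) after
    one match; gammavariate takes (alpha, scale) with scale = 1/rate."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        lam_a = rng.gammavariate(shape + goals_a, 1.0 / (rate + 1.0))
        lam_b = rng.gammavariate(shape + goals_b, 1.0 / (rate + 1.0))
        if lam_a > lam_b:
            hits += 1
    return hits / n
```

Under these assumptions even a comfortable win leaves a non-trivial posterior probability that the losing side was the better team, which is the paper's central point.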
Does Islam play a role in anti-immigrant sentiment? An experimental approach.
Creighton, Mathew J; Jamal, Amaney
2015-09-01
Are Muslim immigrants subjected to targeted opposition (i.e., Islamophobia) on their pathway to US citizenship? Using a list experiment and a representative sample of the US population, we compare explicit and implicit opposition to Muslim and Christian immigrants. We find that Muslim immigrants, relative to Christian immigrants, experience greater explicit resistance. However, when social desirability bias is taken into account via the list experiment, we find that opposition to Christian and Muslim immigrants is the same. The explanation is that respondents conceal a significant amount of opposition to Christian immigrants. Muslim immigrants, on the other hand, are afforded no such protection. We find that religiosity or denomination do not play a significant role in determining implicit or explicit opposition. We conclude that Islamophobia, which is only explicitly expressed, is best understood as reflective of social desirability bias from which Muslim immigrants do not benefit. Copyright © 2015 Elsevier Inc. All rights reserved.
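The list-experiment logic used above rests on a standard difference-in-means estimator, sketched here as a generic illustration (not the authors' specific item wording or sample): control respondents report how many baseline items they endorse, treatment respondents see the same list plus the sensitive item, and the mean difference estimates the sensitive item's prevalence without any individual revealing their own answer.

```python
def list_experiment_prevalence(control_counts, treatment_counts):
    """Difference-in-means estimator for a list experiment.
    Each entry is one respondent's reported count of endorsed items;
    the treatment list contains one extra (sensitive) item, so the
    mean difference estimates the share endorsing that item."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(treatment_counts) - mean(control_counts)
```

Comparing this implicit estimate against a direct question is how studies of this kind quantify social desirability bias.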
Discovering the truth in attempted suicide.
Michel, Konrad; Maltsberger, John T; Jobes, David A; Leenaars, Antoon A; Orbach, Israel; Stadler, Kathrin; Dey, Pascal; Young, Richard A; Valach, Ladislav
2002-01-01
The findings of an international workshop on improving clinical interactions between mental health workers and suicidal patients are reported. Expert clinician-researchers identified common contemporary problems in interviews of suicide attempters. Various videotaped interviews of suicide attempters were critically discussed in relation to expert experience and the existing literature in this area. The working group agreed that current mental health practice often does not take into account the subjective experience of patients attempting suicide, and that contemporary clinical assessments of suicidal behavior are more clinician-centered than patient-centered. The group concluded that clinicians should strive for a shared understanding of the patient's suicidality; and that interviewers should be more aware of the suicidal patient's inner experience of mental pain and loss of self-respect. Collaborative and narrative approaches to the suicidal patient are more promising, enhancing the clinician's ability to empathize and help the patient begin to reestablish a sense of mastery, thereby strengthening the clinical alliance.
Hall, Mel; Sikes, Pat
2017-01-01
In the U.K. context where the emphasis is (quite rightly) on living well with dementia, on positivity and enabling approaches, it can be difficult for researchers to investigate and report negative experiences. Failing to re-present perceptions and experiences as they are lived, however, does a serious disservice to the research endeavor and can prevent policy and service development and positive change. In this article, we present some stories told by participants in an Alzheimer’s Society (United Kingdom) Funded project uniquely investigating the perceptions and experiences of children and young people who have a parent with dementia. Sometimes the stories were not easy to hear, especially when they challenged dominant master narratives around dementia. We discuss our view that when the young people we spoke with told us how things were for them, we were ethically bound to respect and disseminate their accounts. PMID:28682738
NASA Technical Reports Server (NTRS)
Buchner, S.; LaBel, K.; Barth, J.; Campbell, A.
2005-01-01
Space experiments are occasionally launched to study the effects of radiation on electronic and photonic devices. This begs the following questions: Are space experiments necessary? Do the costs justify the benefits? How does one judge the success of a space experiment? What have we learned from past space experiments? How does one design a space experiment? This viewgraph presentation provides information on the usefulness of space and ground tests for simulating radiation damage to spacecraft components.
Sexual Behavior Among Persons With Cognitive Impairments.
Thom, Robyn P; Grudzinskas, Albert J; Saleh, Fabian M
2017-05-01
Although the cognitively impaired are frequently included in heterogeneous studies of problematic sexual behavior, the epidemiology, etiology, and approach to assessment and treatment of persons with dementia and intellectual disability are distinct from those of the general population. The incidence of inappropriate sexual behavior among the intellectually disabled is 15-33%; however, the nature tends to be more socially inappropriate than with violative intent. Limited sociosexual education is a large contributor, and better addressing this area offers a target for prevention and treatment. A thorough clinical assessment of problematic sexual behaviors in the cognitively impaired requires understanding the patient's internal experience, which can be challenging. Assessment tools validated for the general population have not been validated for this population. Very few studies have assessed treatment approaches specifically among the cognitively impaired; however, research does suggest utility in habilitative, psychotherapeutic, and pharmacologic approaches which have been validated among the general population.
Nonhuman primate dermatology: a literature review
Bernstein, Joseph A.; Didier, Peter J.
2015-01-01
In general, veterinary dermatologists do not have extensive clinical experience of nonhuman primate (NHP) dermatoses. The bulk of the published literature does not provide an organized evidence-based approach to the NHP dermatologic case. The veterinary dermatologist is left to extract information from both human and veterinary dermatology, an approach that can be problematic as it forces the clinician to make diagnostic and therapeutic decisions based on two very disparate bodies of literature. A more cohesive approach to NHP dermatology – without relying on assumptions that NHP pathology most commonly behaves similarly to other veterinary and human disease – is required. This review of the dermatology of NHP species includes discussions of primary dermatoses, as well as diseases where dermatologic signs represent a significant secondary component, and provides a first step towards encouraging the veterinary community to study and report the dermatologic diseases of nonhuman primates. PMID:19490576
Bridging different perspectives of the physiological and mathematical disciplines.
Batzel, Jerry Joseph; Hinghofer-Szalkay, Helmut; Kappel, Franz; Schneditz, Daniel; Kenner, Thomas; Goswami, Nandu
2012-12-01
The goal of this report is to discuss educational approaches for bridging the different perspectives of the physiological and mathematical disciplines. These approaches can enhance the learning experience for physiology, medical, and mathematics students and simultaneously act to stimulate mathematical/physiological/clinical interdisciplinary research. While physiology education incorporates mathematics, via equations and formulas, it does not typically provide a foundation for interdisciplinary research linking mathematics and physiology. Here, we provide insights and ideas derived from interdisciplinary seminars involving mathematicians and physiologists that have been conducted over the last decade. The approaches described here can be used as templates for giving physiology and medical students insights into how sophisticated tools from mathematics can be applied and how the disciplines of mathematics and physiology can be integrated in research, thereby fostering a foundation for interdisciplinary collaboration. These templates are equally applicable to linking mathematical methods with other life and health sciences in the educational process.
NASA Astrophysics Data System (ADS)
Chen, Wen; Wang, Fajie
Based on the implicit calculus equation modeling approach, this paper proposes a speculative concept of the potential and wave operators on negative dimensionality. Unlike the standard partial differential equation (PDE) modeling, the implicit calculus modeling approach does not require the explicit expression of the PDE governing equation. Instead, the fundamental solution of the physical problem is used to implicitly define the differential operator and to implement simulation in conjunction with the appropriate boundary conditions. In this study, we conjecture an extension of the fundamental solution of the standard Laplace and Helmholtz equations to negative dimensionality. And then by using the singular boundary method, a recent boundary discretization technique, we investigate the potential and wave problems using the fundamental solution on negative dimensionality. Numerical experiments reveal that the physics behaviors on negative dimensionality may differ from those on positive dimensionality. This speculative study might open an unexplored territory in research.
Negotiating Parenthood: Experiences of Economic Hardship among Parents with Cognitive Difficulties
ERIC Educational Resources Information Center
Fernqvist, Stina
2015-01-01
People with cognitive difficulties often have scarce economic resources, and parents with cognitive difficulties are no exception. In this article, parents' experiences are put forth and discussed, for example, how does economic hardship affect family life? How do the parents experience support, what kind of strain does the scarce economy put on…
Optimization of a chondrogenic medium through the use of factorial design of experiments.
Enochson, Lars; Brittberg, Mats; Lindahl, Anders
2012-12-01
The standard culture system for in vitro cartilage research is based on cells in a three-dimensional micromass culture and a defined medium containing the chondrogenic key growth factor, transforming growth factor (TGF)-β1. The aim of this study was to optimize the medium for chondrocyte micromass culture. Human chondrocytes were cultured in different media formulations, designed with a factorial design of experiments (DoE) approach and based on the standard medium for redifferentiation. The significant factors for the redifferentiation of the chondrocytes were determined and optimized in a two-step process through the use of response surface methodology. TGF-β1, dexamethasone, and glucose were significant factors for differentiating the chondrocytes. Compared to the standard medium, TGF-β1 was increased 30%, dexamethasone reduced 50%, and glucose increased 22%. The potency of the optimized medium was validated in a comparative study against the standard medium. The optimized medium resulted in micromass cultures with increased expression of genes important for the articular chondrocyte phenotype and in cultures with increased glycosaminoglycan/DNA content. Optimizing the standard medium with the efficient DoE method, a new medium that gave better redifferentiation for articular chondrocytes was determined.
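A full factorial design of the kind used in such DoE screening can be enumerated in a few lines. The factor names and levels below are illustrative placeholders, not the study's actual screening ranges:

```python
from itertools import product

def full_factorial(factors):
    """All level combinations of a full factorial design.
    factors: dict mapping factor name to its list of levels."""
    names = list(factors)
    return [dict(zip(names, levels))
            for levels in product(*(factors[n] for n in names))]

# Three factors at two levels each -> 2^3 = 8 runs
runs = full_factorial({
    "TGF-b1 (ng/mL)": [5, 10],
    "dexamethasone (nM)": [50, 100],
    "glucose (g/L)": [1.0, 4.5],
})
```

Each run is one medium formulation to test; response surface methodology then fits a polynomial model over these runs to locate the optimum, rather than varying one factor at a time.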
Monitoring of the secondary drying in freeze-drying of pharmaceuticals.
Fissore, Davide; Pisano, Roberto; Barresi, Antonello A
2011-02-01
This paper is focused on the in-line monitoring of the secondary drying phase of a lyophilization process. An innovative software sensor is presented to estimate reliably the residual moisture in the product and the time required to complete secondary drying, that is, to reach the target value of the residual moisture or of the desorption rate. Such results are obtained by coupling a mathematical model of the process with the in-line measurement of the solvent desorption rate, by means of the pressure rise test or other sensors (e.g., windmills, laser sensors) that can measure the vapor flux in the drying chamber. The proposed method does not require extracting any vial during the operation or using expensive sensors to measure off-line the residual moisture. Moreover, it does not require any preliminary experiment to determine the relationship between the desorption rate and residual moisture in the product. The effectiveness of the proposed approach is demonstrated by means of experiments carried out in a pilot-scale apparatus: in this case, some vials were extracted from the drying chamber and the moisture content was measured to validate the estimations provided by the soft-sensor. Copyright © 2010 Wiley-Liss, Inc.
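The mass-balance idea behind such a soft sensor can be caricatured in a few lines: integrate the measured desorption rate to track how much water has left the product. This is only a sketch under crude assumptions (constant sampling interval, rate expressed per unit dry mass), not the authors' model:

```python
def moisture_trajectory(m0, desorption_rates, dt):
    """Residual moisture over time from an initial content m0
    (kg water per kg dry product) and a sequence of measured
    desorption rates (same units per hour), sampled every dt hours."""
    m, traj = m0, [m0]
    for rate in desorption_rates:
        m = max(m - rate * dt, 0.0)  # moisture cannot go negative
        traj.append(m)
    return traj

def time_to_target(m0, desorption_rates, dt, target):
    """First time (hours) at which the estimated moisture reaches the
    target, or None if it never does within the measured horizon."""
    for step, m in enumerate(moisture_trajectory(m0, desorption_rates, dt)):
        if m <= target:
            return step * dt
    return None
```

The appeal of this style of estimator, as the abstract stresses, is that it needs no vial extraction: everything is inferred from the measured vapor flux.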
DOE EiR at Oakridge National Lab 2008/09
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bauer, Michael
2012-11-30
This project placed an experienced technology entrepreneur at Oak Ridge National Laboratory, one of DOE's premier laboratories undertaking cutting-edge research in a variety of fields, including energy technologies. With the goal of accelerating the commercialization of advanced energy technologies, the task was to review available technologies at the lab and identify those that qualified for licensing and commercialization by a private startup company backed by private venture capital. During the project, more than 1,500 inventions filed at the lab were reviewed over a one-year period; a successively smaller number was selected for more detailed review, ultimately resulting in five, and then one, technology being reviewed for immediate commercialization. The chosen technology, a computational chemistry-based approach to the optimization of enzymes, was tested in lab experiments paid for by funds raised by ORNL for the purpose of proving out the effectiveness of the technology and its readiness for commercialization. The experiments showed that the technology worked; however, its performance proved not yet mature enough to qualify for venture-capital-funded commercialization in a high-tech startup. As a consequence, the project did not result in a new startup company being formed, as originally intended.
[Mothers Talk: The Untold Birth of Voluntary Abortion].
Hernández Garre, José Manuel; Aznar Mula, Isabel María; Echevarría Pérez, Paloma
2017-01-01
The aim of the study was to investigate the experiences linked to post-abortion syndrome in mothers who had undergone a voluntary abortion. A phenomenological qualitative approach was used to collect the experiences of mothers who had voluntarily interrupted their pregnancy. The research technique was semi-structured interviews with women who had contacted different support associations in the Murcia region after experiencing symptoms consistent with post-abortion syndrome. The testimonies show feminist or utilitarian arguments used to justify the decision to abort; the women speak of a system, to some extent mercantilist, that has no real intention of offering real life choices. Their experience shows that, far from being lived as an act of female freedom, the abortion is experienced traumatically, with symptoms developing from the sense of loss. In this context, spiritual resources become the best tool to expiate guilt. According to these accounts, the experience of abortion does not improve the lives of the women; far from it, it is a trauma that can be avoided with proper advice.
Yokoyama, Hiromi M; Nakayachi, Kazuya
2014-07-01
How does the public assess an appropriate financial allocation to science promotion? This article empirically examined the subadditivity effect in the judgment of budgetary allocation. Results of the first experiment showed that the ratio of the national budget allocated for science promotion by participants increased when science was decomposed into more specific categories compared to when it was presented as "science promotion" alone. Consistent with these findings, results of the second experiment showed that the allotment ratio to science promotion decreased when the number of other expenditure items increased. Meanwhile, the third experiment revealed that in the case of a budgetary cutback, the total amount taken from science promotion greatly increased when science was decomposed into subcategories. The subadditivity effect and the increase in the total allotment ratio from unpacking science promotion were confirmed by these three experiments, not only for budgetary allocation but also for budgetary cutback.
Brunier, Elisabeth; Le Chapellier, Michel; Dejean, Pierre Henri
2012-01-01
The aim of this paper is to present the concept and results of an innovative educational model based on ergonomics involvement in industrial projects. First, we present the Cross-disciplinary Problem-solving Workshop (CPW) by answering three questions: 1) What is a CPW? A partnership between universities and one or several companies, whose purposes are, first, to improve health, well-being, company teams' competencies, and competitiveness, and second, to train the "IPOD generation" to include risk prevention in design. 2) How does it work? The CPW allows cooperation between experience and new insight through inductive methods. This model follows the Piaget (1) philosophy, linking the concrete world to abstraction through a learning system that associates realization and abstraction. 3) Is it successful? To answer this third question, we show examples of studies and models produced during CPWs. It appears that CPWs produce visible results in companies, such as new process designs and new methods, as well as changes in lectures. However, some less visible results remain unclear: How do company personnel evolve during and after a CPW? Does the CPW motivate our future engineers enough to continuously improve their skills in risk prevention and innovative design?
Chaouiya, Claudine; Keating, Sarah M; Berenguier, Duncan; Naldi, Aurélien; Thieffry, Denis; van Iersel, Martijn P; Le Novère, Nicolas; Helikar, Tomáš
2015-09-04
Quantitative methods for modelling biological networks require an in-depth knowledge of the biochemical reactions and their stoichiometric and kinetic parameters. In many practical cases, this knowledge is missing. This has led to the development of several qualitative modelling methods using information such as, for example, gene expression data coming from functional genomic experiments. The SBML Level 3 Version 1 Core specification does not provide a mechanism for explicitly encoding qualitative models, but it does provide a mechanism for SBML packages to extend the Core specification and add additional syntactical constructs. The SBML Qualitative Models package for SBML Level 3 adds features so that qualitative models can be directly and explicitly encoded. The approach taken in this package is essentially based on the definition of regulatory or influence graphs. The SBML Qualitative Models package defines the structure and syntax necessary to describe qualitative models that associate discrete levels of activities with entity pools and the transitions between states that describe the processes involved. This is particularly suited to logical models (Boolean or multi-valued) and some classes of Petri net models can be encoded with the approach.
Major transitions in information technology
Valverde, Sergi
2016-01-01
When looking at the history of technology, we can see that not all inventions are of equal importance. Only a few technologies have the potential to start a new branching series (specifically, by increasing diversity), have a lasting impact on human life, and ultimately become turning points. Technological transitions correspond to times and places in the past when a large number of novel artefact forms or behaviours appeared together or in rapid succession. Why does that happen? Is technological change continuous and gradual, or does it occur in sudden leaps and bounds? The evolution of information technology (IT) allows for a quantitative and theoretical approach to technological transitions. The value of information systems experiences sudden changes (i) when we learn how to use this technology, (ii) when we accumulate a large amount of information, and (iii) when communities of practice create and exchange free information. The coexistence between gradual improvements and discontinuous technological change is a consequence of the asymmetric relationship between complexity and hardware and software. Using a cultural evolution approach, we suggest that sudden changes in the organization of ITs depend on the high costs of maintaining and transmitting reliable information. This article is part of the themed issue ‘The major synthetic evolutionary transitions’. PMID:27431527
Sigoillot, Frederic D; Huckins, Jeremy F; Li, Fuhai; Zhou, Xiaobo; Wong, Stephen T C; King, Randall W
2011-01-01
Automated time-lapse microscopy can visualize proliferation of large numbers of individual cells, enabling accurate measurement of the frequency of cell division and the duration of interphase and mitosis. However, extraction of quantitative information by manual inspection of time-lapse movies is too time-consuming to be useful for analysis of large experiments. Here we present an automated time-series approach that can measure changes in the duration of mitosis and interphase in individual cells expressing fluorescent histone 2B. The approach requires analysis of only 2 features, nuclear area and average intensity. Compared to supervised learning approaches, this method reduces processing time and does not require generation of training data sets. We demonstrate that this method is as sensitive as manual analysis in identifying small changes in interphase or mitotic duration induced by drug or siRNA treatment. This approach should facilitate automated analysis of high-throughput time-lapse data sets to identify small molecules or gene products that influence timing of cell division.
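The two-feature idea above can be sketched with a per-frame classifier. This is a hedged illustration, not the authors' algorithm: during mitosis chromatin condenses, so nuclear area drops while average H2B fluorescence intensity rises; the thresholds, frame interval, and the synthetic trace are all invented.

```python
# Estimate mitosis duration from a per-cell time series of two features:
# nuclear area and mean fluorescence intensity.
import numpy as np

area      = np.array([100, 98, 101,  55,  52,  50,  99, 102])  # px^2 per frame
intensity = np.array([1.0, 1.1, 1.0, 2.2, 2.3, 2.1, 1.0, 1.0])

mitotic = (area < 75) & (intensity > 1.8)     # simple per-frame classifier
frames_per_interval = 10.0                     # minutes between frames (assumed)
mitosis_duration = mitotic.sum() * frames_per_interval
print(mitosis_duration)   # 30.0 min for this toy trace
```

Applied across thousands of tracked nuclei, the same two features yield distributions of mitotic and interphase durations without any training data.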
Shipstead, Zach; Engle, Randall W
2013-01-01
One approach to understanding working memory (WM) holds that individual differences in WM capacity arise from the amount of information a person can store in WM over short periods of time. This view is especially prevalent in WM research conducted with the visual arrays task. Within this tradition, many researchers have concluded that the average person can maintain approximately 4 items in WM. The present study challenges this interpretation by demonstrating that performance on the visual arrays task is subject to time-related factors that are associated with retrieval from long-term memory. Experiment 1 demonstrates that memory for an array does not decay as a product of absolute time, which is consistent with both maintenance- and retrieval-based explanations of visual arrays performance. Experiment 2 introduced a manipulation of temporal discriminability by varying the relative spacing of trials in time. We found that memory for a target array was significantly influenced by its temporal compression with, or isolation from, a preceding trial. Subsequent experiments extend these effects to sub-capacity set sizes and demonstrate that changes in the size of k are meaningful to prediction of performance on other measures of WM capacity as well as general fluid intelligence. We conclude that performance on the visual arrays task does not reflect a multi-item storage system but instead measures a person's ability to accurately retrieve information in the face of proactive interference.
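The capacity estimate conventionally reported for the visual arrays task, the k the abstract refers to, is commonly computed with Cowan's formula. The helper below states that standard formula; the numbers plugged in are made up, not data from this study:

```python
# Cowan's k for a single-probe change-detection (visual arrays) task:
# k = set_size * (hit_rate - false_alarm_rate).
def cowans_k(set_size, hit_rate, false_alarm_rate):
    return set_size * (hit_rate - false_alarm_rate)

print(cowans_k(6, 0.80, 0.15))   # ~3.9 items
```

The study's point is that this k is not a pure storage estimate: it shifts with temporal spacing of trials, implicating retrieval and proactive interference.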
Günday Türeli, Nazende; Türeli, Akif Emre; Schneider, Marc
2016-12-30
Design of Experiments (DoE) is a powerful tool for the systematic evaluation of process parameters' effects on nanoparticle (NP) quality with a minimum number of experiments. DoE was employed for optimization of ciprofloxacin-loaded PLGA NPs for pulmonary delivery against Pseudomonas aeruginosa infections in cystic fibrosis (CF) lungs. Since the biofilm produced by the bacteria has been shown to be a complicated 3D barrier with heterogeneous meshes ranging from 100 nm to 500 nm, nanoformulations small enough to travel through those channels were assigned as the target quality. Nanoprecipitation was realized utilizing MicroJet Reactor (MJR) technology, based on the impinging-jets principle. The effect of the MJR parameters flow rate, temperature, and gas pressure on particle size and PDI was investigated using a Box-Behnken design. The relationship between process parameters and particle quality was demonstrated by the constructed fit functions (R² = 0.9934, p < 0.0001 and R² = 0.9983, p < 0.0001, for particle size and PDI, respectively). Prepared nanoformulations varied between 145.2 and 979.8 nm with PDI ranging from 0.050 to 1.00 and showed encapsulation efficiencies >65%. Response surface plots provided an experimental data-based understanding of the MJR parameters' effects, and thus of NP quality. The presented work enables ciprofloxacin-loaded PLGA nanoparticle preparations with pre-defined quality to fulfill the requirements of local drug delivery under CF disease conditions. Copyright © 2016 Elsevier B.V. All rights reserved.
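A three-factor Box-Behnken design like the one used here (flow rate, temperature, gas pressure) consists of the edge midpoints of the factor cube plus replicated center points. The sketch below constructs the design in coded units; the factor names are taken from the abstract, but the number of center points is an assumption:

```python
# Build a 3-factor Box-Behnken design: for each pair of factors, all four
# +/-1 combinations with the third factor held at its center level.
import itertools
import numpy as np

def box_behnken_3(center_points=3):
    runs = []
    for i, j in itertools.combinations(range(3), 2):
        for a, b in itertools.product([-1, 1], repeat=2):
            row = [0.0, 0.0, 0.0]
            row[i], row[j] = a, b
            runs.append(row)
    runs += [[0.0, 0.0, 0.0]] * center_points
    return np.array(runs)

design = box_behnken_3()
print(design.shape)   # (15, 3): 12 edge runs + 3 center runs
```

Each coded column is then mapped to physical units (e.g. -1/0/+1 to low/mid/high flow rate) before running the experiments, and the responses are fit with a second-order model.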
Periasamy, Rathinasamy; Palvannan, Thayumanavan
2010-12-01
Production of laccase using a submerged culture of Pleurotus ostreatus IMI 395545 was optimized by the Taguchi orthogonal array (OA) design of experiments (DOE) methodology. This approach facilitates the study of the interactions of a large number of variables spanned by factors and their settings, with a small number of experiments, leading to considerable savings in time and cost for process optimization. The methodology identifies the factors with the greatest impact and enables calculation of their interactions in the production of industrial enzymes. Eight factors, viz. glucose, yeast extract, malt extract, inoculum, mineral solution, inducer (1 mM CuSO₄) and amino acid (l-asparagine) at three levels and pH at two levels, with an OA layout of L18 (2¹ × 3⁷), were selected for the proposed experimental design. The laccase yield obtained from the 18 sets of fermentation experiments performed with the selected factors and levels was further processed with Qualitek-4 software. The optimized conditions showed an enhanced laccase expression of 86.8% (from 485.0 to 906.3 U). The combination of factors was further validated for laccase production and reactive blue 221 decolorization. The results revealed an enhanced laccase yield of 32.6% and dye decolorization up to 84.6%. This methodology allows the complete evaluation of main and interaction factors. © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim
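The analysis step in a Taguchi OA study, estimating each factor's main effect as the mean response at each of its levels, can be shown on a small array. For brevity this uses an L4 design for three 2-level factors rather than the study's L18 (2¹ × 3⁷) layout, and the yield values are invented:

```python
# Main-effects analysis on a Taguchi L4(2^3) orthogonal array.
import numpy as np

# L4 orthogonal array, levels coded 0/1; columns are factors.
L4 = np.array([[0, 0, 0],
               [0, 1, 1],
               [1, 0, 1],
               [1, 1, 0]])
yields = np.array([485.0, 520.0, 830.0, 906.0])   # toy laccase yields (U)

for f in range(3):
    means = [yields[L4[:, f] == lev].mean() for lev in (0, 1)]
    print(f"factor {f}: level means {means}")
```

Because the array is orthogonal, each factor's levels are balanced against the others, so these level means isolate the main effects; the level with the higher mean is carried into the optimized condition.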
Design an optimum safety policy for personnel safety management - A system dynamic approach
NASA Astrophysics Data System (ADS)
Balaji, P.
2014-10-01
Personnel safety management (PSM) ensures that employees' work conditions are healthy and safe through various proactive and reactive approaches. Nowadays it is a complex phenomenon because of the increasingly dynamic nature of organisations, which results in an increase in accidents. An important part of accident prevention is to understand the existing system properly and to devise safety strategies for that system. System dynamics modelling appears to be an appropriate methodology for exploring and making strategy for PSM. Many system dynamics models of industrial systems have been built entirely for specific host firms. This thesis illustrates an alternative approach. A generic system dynamics model of personnel safety management was developed and tested in a host firm. The model underwent various structural, behavioural, and policy tests. The utility and effectiveness of the model were further explored by modelling a safety scenario. In order to create an effective safety policy under resource constraints, DOE (design of experiments) was used. The DOE used classic designs, namely fractional factorials and central composite designs, to build a second-order regression equation that served as an objective function. That function was optimized under a budget constraint, and the optimum values were used for the safety policy that showed the greatest improvement in overall PSM. The outcome of this research indicates that the personnel safety management model has the capability to act as an instruction tool to improve understanding of safety management and also as an aid to policy making.
Lesson Study-Building Communities of Learning Among Pre-Service Science Teachers
NASA Astrophysics Data System (ADS)
Hamzeh, Fouada
Lesson Study is a pedagogical approach that has been widely used for decades in its country of origin, Japan. It is a teacher-led form of professional development that involves the collaborative efforts of teachers in co-planning and observing the teaching of a lesson within a unit for evidence that the teaching practices used help the learning process (Lewis, 2002a). The purpose of this research was to investigate whether Lesson Study enables pre-service teachers to improve their own teaching in the area of science inquiry-based approaches. Also explored are the self-efficacy beliefs of one group of science pre-service teachers related to their experiences in Lesson Study. The research investigated four questions: 1) Does Lesson Study influence teacher preparation for inquiry-based instruction? 2) Does Lesson Study improve teacher efficacy? 3) Does Lesson Study impact teachers' aspiration to collaborate with colleagues? 4) What are the attitudes and perceptions of pre-service teachers toward the Lesson Study idea in science? The 12 participants completed two pre- and post-study surveys: the STEBI-B, Science Teaching Efficacy Belief Instrument (Enochs & Riggs, 1990), and the ASTQ, Attitude towards Science Teaching. Data sources included student teaching lesson observations, lesson debriefing notes, and focus group interviews. Results from the STEBI-B indicate that all participants showed an increase in efficacy over the course of the study. This study added to the body of research on teaching learning communities, professional development programs, and teacher empowerment.
Development of a Response Surface Thermal Model for Orion Mated to the International Space Station
NASA Technical Reports Server (NTRS)
Miller, Stephen W.; Meier, Eric J.
2010-01-01
A study was performed to determine if a Design of Experiments (DOE)/Response Surface Methodology could be applied to on-orbit thermal analysis and produce a set of Response Surface Equations (RSE) that accurately predict vehicle temperatures. The study used an integrated thermal model of the International Space Station and the Orion outer mold line model. Five separate factors were identified for study: yaw, pitch, roll, beta angle, and the environmental parameters. Twenty external Orion temperatures were selected as the responses. A DOE case matrix of 110 runs was developed. The data from these cases were analyzed to produce an RSE for each of the temperature responses. The initial agreement between the engineering data and the RSE predictions was encouraging, although many RSEs had large uncertainties on their predictions. Fourteen verification cases were developed to test the predictive powers of the RSEs. The verification showed mixed results, with some RSEs predicting temperatures that matched the engineering data within the uncertainty bands, while others had very large errors. While this study does not irrefutably prove that the DOE/RSM approach can be applied to on-orbit thermal analysis, it does demonstrate that the technique has the potential to predict temperatures. Additional work is needed to better identify the cases needed to produce the RSEs.
Inference of missing data and chemical model parameters using experimental statistics
NASA Astrophysics Data System (ADS)
Casey, Tiernan; Najm, Habib
2017-11-01
A method for determining the joint parameter density of Arrhenius rate expressions through the inference of missing experimental data is presented. This approach proposes noisy hypothetical data sets from target experiments and accepts those which agree with the reported statistics, in the form of nominal parameter values and their associated uncertainties. The data exploration procedure is formalized using Bayesian inference, employing maximum entropy and approximate Bayesian computation methods to arrive at a joint density on data and parameters. The method is demonstrated in the context of reactions in the H2-O2 system for predictive modeling of combustion systems of interest. Work supported by the US DOE BES CSGB. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC, a wholly owned subsidiary of Honeywell International, for the US DOE's NNSA under contract DE-NA-0003525.
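The accept/reject core of the approach, propose hypothetical rate data and keep Arrhenius parameters that reproduce the reported statistics, can be sketched with a toy rejection-ABC loop. Everything numerical here (temperatures, "reported" values, priors, tolerance) is invented for illustration; the actual method uses maximum-entropy data proposals rather than this crude matching:

```python
# Toy rejection-ABC over Arrhenius parameters (ln A, Ea): keep proposals
# whose predicted ln k matches the "reported" values at three temperatures.
import numpy as np

rng = np.random.default_rng(0)
R = 8.314                              # J/(mol K)
T = np.array([800.0, 1000.0, 1200.0])  # K
lnA_true, Ea_true = 20.0, 1.2e5
reported_lnk = lnA_true - Ea_true / (R * T)   # stand-in for reported stats

accepted = []
for _ in range(20000):
    lnA = rng.uniform(15.0, 25.0)          # flat priors (assumed)
    Ea = rng.uniform(0.8e5, 1.6e5)
    lnk = lnA - Ea / (R * T)
    if np.max(np.abs(lnk - reported_lnk)) < 0.1:   # agreement tolerance
        accepted.append((lnA, Ea))

post = np.array(accepted)
print(len(post), post.mean(axis=0))
```

The accepted (ln A, Ea) pairs are strongly correlated, which is the point: the joint density, not independent error bars, is what a predictive combustion model needs.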
Artificial neural networks in evaluation and optimization of modified release solid dosage forms.
Ibrić, Svetlana; Djuriš, Jelena; Parojčić, Jelena; Djurić, Zorica
2012-10-18
Implementation of the Quality by Design (QbD) approach in pharmaceutical development has compelled researchers in the pharmaceutical industry to employ Design of Experiments (DoE) as a statistical tool, in product development. Among all DoE techniques, response surface methodology (RSM) is the one most frequently used. Progress of computer science has had an impact on pharmaceutical development as well. Simultaneous with the implementation of statistical methods, machine learning tools took an important place in drug formulation. Twenty years ago, the first papers describing application of artificial neural networks in optimization of modified release products appeared. Since then, a lot of work has been done towards implementation of new techniques, especially Artificial Neural Networks (ANN) in modeling of production, drug release and drug stability of modified release solid dosage forms. The aim of this paper is to review artificial neural networks in evaluation and optimization of modified release solid dosage forms.
NASA Astrophysics Data System (ADS)
Puligheddu, Marcello; Gygi, Francois; Galli, Giulia
The prediction of the thermal properties of solids and liquids is central to numerous problems in condensed matter physics and materials science, including the study of thermal management of opto-electronic and energy conversion devices. We present a method to compute the thermal conductivity of solids by performing ab initio molecular dynamics at non equilibrium conditions. Our formulation is based on a generalization of the approach to equilibrium technique, using sinusoidal temperature gradients, and it only requires calculations of first principles trajectories and atomic forces. We discuss results and computational requirements for a representative, simple oxide, MgO, and compare with experiments and data obtained with classical potentials. This work was supported by MICCoM as part of the Computational Materials Science Program funded by the U.S. Department of Energy (DOE), Office of Science , Basic Energy Sciences (BES), Materials Sciences and Engineering Division under Grant DOE/BES 5J-30.
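The heat-equation relation behind the sinusoidal-gradient technique can be stated in a few lines. This is a generic sketch, not the authors' ab initio protocol: a temperature profile T0 + A sin(qx) decays as exp(-α q² t), so fitting the decay of the sinusoidal amplitude yields the thermal diffusivity α (and κ = α ρ c). The period, diffusivity, and time window below are illustrative values:

```python
# Extract thermal diffusivity from the decay of a sinusoidal temperature
# perturbation: amplitude(t) = A0 * exp(-alpha * q^2 * t).
import numpy as np

L = 2.0e-9                      # period of the imposed profile, m (assumed)
q = 2 * np.pi / L
alpha_true = 1.0e-5             # m^2/s, illustrative diffusivity

t = np.linspace(0, 2e-14, 50)                        # s
amplitude = 10.0 * np.exp(-alpha_true * q * q * t)   # K, noiseless toy data

# Fit ln(amplitude) vs t: slope = -alpha * q^2.
slope = np.polyfit(t, np.log(amplitude), 1)[0]
alpha_fit = -slope / q**2
print(alpha_fit)   # recovers ~1.0e-5 m^2/s
```

In the ab initio setting, the amplitude trace comes from non-equilibrium molecular dynamics with first-principles forces rather than from an analytic formula, and finite-size/wavevector effects must be controlled.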
NASA Technical Reports Server (NTRS)
Allan, Brian G.; Owens, Lewis R.; Lin, John C.
2006-01-01
This research will investigate the use of Design-of-Experiments (DOE) in the development of an optimal passive flow control vane design for a boundary-layer-ingesting (BLI) offset inlet in transonic flow. This inlet flow control is designed to minimize the engine fan-face distortion levels and first five Fourier harmonic half amplitudes while maximizing the inlet pressure recovery. Numerical simulations of the BLI inlet are computed using the Reynolds-averaged Navier-Stokes (RANS) flow solver, OVERFLOW, developed at NASA. These simulations are used to generate the numerical experiments for the DOE response surface model. In this investigation, two DOE optimizations were performed using a D-Optimal Response Surface model. The first DOE optimization was performed using four design factors which were vane height and angles-of-attack for two groups of vanes. One group of vanes was placed at the bottom of the inlet and a second group symmetrically on the sides. The DOE design was performed for a BLI inlet with a free-stream Mach number of 0.85 and a Reynolds number of 2 million, based on the length of the fan-face diameter, matching an experimental wind tunnel BLI inlet test. The first DOE optimization required a fifth order model having 173 numerical simulation experiments and was able to reduce the DC60 baseline distortion from 64% down to 4.4%, while holding the pressure recovery constant. A second DOE optimization was performed holding the vanes heights at a constant value from the first DOE optimization with the two vane angles-of-attack as design factors. This DOE only required a second order model fit with 15 numerical simulation experiments and reduced DC60 to 3.5% with small decreases in the fourth and fifth harmonic amplitudes. The second optimal vane design was tested at the NASA Langley 0.3- Meter Transonic Cryogenic Tunnel in a BLI inlet experiment. 
The experimental results showed an 80% reduction in DPCPavg, the circumferential distortion level at the engine fan face.
Lapchuk, Anatoliy; Prygun, Olexandr; Fu, Minglei; Le, Zichun; Xiong, Qiyuan; Kryuchyn, Andriy
2017-06-26
We present the first general theoretical description of speckle suppression efficiency based on an active diffractive optical element (DOE). The approach is based on spectral analysis of the diffracted beams and a coherence matrix. Analytical formulae are obtained for the dispersion of speckle suppression efficiency using different DOE structures and different DOE activation methods. We show that a one-sided 2D DOE structure has a smaller speckle suppression range than a two-sided 1D DOE structure. Both DOE structures have sufficient speckle suppression range to suppress low-order speckles in the entire visible range, but only the two-sided 1D DOE can suppress higher-order speckles. We also show that a linearly shifted 2D DOE in a laser projector with a large numerical aperture has higher effective speckle suppression efficiency than methods using switched or step-wise-shifted DOE structures. The generalized theoretical models elucidate the mechanism and practical realization of speckle suppression.
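The statistical baseline behind DOE-based speckle suppression is that averaging N decorrelated speckle patterns reduces speckle contrast roughly as 1/√N. The simulation below is a generic illustration of that baseline, not this paper's coherence-matrix model:

```python
# Averaging N independent fully developed speckle patterns: the contrast
# (std/mean of intensity) drops from ~1 to ~1/sqrt(N).
import numpy as np

rng = np.random.default_rng(1)

def speckle_pattern(shape):
    # Fully developed speckle has exponentially distributed intensity.
    return rng.exponential(1.0, shape)

def contrast(img):
    return img.std() / img.mean()

N = 16
avg = np.mean([speckle_pattern((256, 256)) for _ in range(N)], axis=0)
print(contrast(avg))   # ~0.25, i.e. 1/sqrt(16)
```

An active DOE approximates this by cycling through N mutually decorrelated diffraction phases within one eye-integration time; the paper's contribution is quantifying how many effectively independent patterns each DOE structure and activation method actually delivers.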
Experimental Design of a UCAV-Based High-Energy Laser Weapon
2016-12-01
… propagation. The Design of Experiments (DOE) methodology is then applied to determine the significance of the UCAV-HEL design parameters and their effect on the …
ERIC Educational Resources Information Center
Hamid, Jamaliah Abdul; Krauss, Steven E.
2013-01-01
Do students' experiences on university campuses cultivate motivation to lead or a sense of readiness to lead that does not necessarily translate to active leadership? To address this question, a study was conducted with 369 undergraduates from Malaysia. Campus experience was more predictive of leadership readiness than motivation. Student…
Clements, James; Walker, Gavin; Pentlavalli, Sreekanth; Dunne, Nicholas
2014-10-01
The initial composition of acrylic bone cement, along with the mixing and delivery technique used, can influence its final properties and therefore its clinical success in vivo. The polymerisation of acrylic bone cement is complex, with a number of processes happening simultaneously. Acrylic bone cement mixing and delivery systems have undergone several design changes in their advancement, although the cement constituents themselves have remained unchanged since they were first used. This study was conducted to determine the factors that had the greatest effect on the final properties of acrylic bone cement using a pre-filled bone cement mixing and delivery system. A design of experiments (DoE) approach was used to determine the impact of the factors associated with this mixing and delivery method on the final properties of the cement produced. The DoE showed that all factors present within this study had a significant impact on the final properties of the cement. An optimum cement composition was hypothesised and tested. This optimum recipe produced cement with final mechanical and thermal properties within the clinical guidelines stated in ISO 5833 (International Standard Organisation (ISO), International standard 5833: implants for surgery-acrylic resin cements, 2002); however, the low setting times observed would not be clinically viable and could result in complications during the surgical technique. As a result, further development would be required to improve the setting time of the cement in order for it to be deemed suitable for use in total joint replacement surgery.
A Queueing Approach to Optimal Resource Replication in Wireless Sensor Networks
2009-04-29
… replication strategies in wireless sensor networks. The model can be used to minimize either the total transmission rate of the network (an energy-centric approach) or to ensure the proportion of query failures does not exceed a predetermined threshold (a failure-centric approach). The model explicitly …
Breakdown dynamics of electrically exploding thin metal wires in vacuum
NASA Astrophysics Data System (ADS)
Sarkisov, G. S.; Caplinger, J.; Parada, F.; Sotnikov, V. I.
2016-10-01
Using a two-frame intensified charge-coupled device (iCCD) imaging system with a 2 ns exposure time, we observed the dynamics of voltage breakdown and corona generation in fast, nanosecond-timescale explosions of fine Ni and stainless-steel (SS) wires in vacuum. These experiments show that corona generation along the wire surface exhibits temporal-spatial inhomogeneity. For both metals, we observed the initial generation of a bright cathode spot before ionization of the entire wire length. This cathode spot does not expand with time. For 25.4 μm diameter Ni and SS wire explosions with positive polarity, breakdown starts from the grounded anode and propagates to the high-voltage cathode with speeds approaching 3500 km/s, or approximately one percent of the speed of light.
Emilio Segrè, the Antiproton, Technetium, and Astatine
…of U238, DOE Technical Report, 1942; Spontaneous Fission, DOE Technical Report, November 1950; Observation of Antiprotons, DOE Technical Report, October 1955; Antiprotons, DOE Technical Report, November 1955; The Antiproton-Nucleon Annihilation Process (Antiproton Collaboration Experiment), DOE Technical…
Our On-Its-Head-and-In-Your-Dreams Approach Leads to Clean Energy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kazmerski, Lawrence; Gwinner, Don; Hicks, Al
Representing the Center for Inverse Design (CID), this document is one of the entries in the Ten Hundred and One Word Challenge. As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words and original paintings, but any descriptions or words could only use the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE: energy. The mission of the CID is to revolutionize the discovery of new materials by design with tailored properties through the development and application of a novel inverse design approach powered by theory guiding experiment, with an initial focus on solar energy conversion.
Treating career burnout: a psychodynamic existential perspective.
Pines, A M
2000-05-01
This article presents an approach for treating career burnout based on a psychodynamic existential perspective. Psychodynamic theory contributes the idea that people choose an occupation that enables them to replicate significant childhood experiences. Existential theory contributes the idea that people attempt to find existential significance through their work. It is suggested that when treating career burnout it is essential to address three questions: Why, psychodynamically, did this person choose this particular career, and how was it expected to provide existential significance? Why does this individual feel a sense of failure in the existential quest, and how is the sense of failure related to burnout? What changes need to take place for this individual to derive a sense of existential significance from work? A case illustration is presented that demonstrates the application of this approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCallen, David; Petrone, Floriana; Buckle, Ian
The U.S. Department of Energy (DOE) has ownership and operational responsibility for a large enterprise of nuclear facilities that provide essential functions to DOE missions ranging from national security to discovery science and energy research. These facilities support a number of DOE programs and offices including the National Nuclear Security Administration, Office of Science, and Office of Environmental Management. With many unique and “one of a kind” functions, these facilities represent a tremendous national investment, and assuring their safety and integrity is fundamental to the success of a breadth of DOE programs. Many DOE critical facilities are located in regions with significant natural phenomenon hazards including major earthquakes, and DOE has been a leader in developing standards for the seismic analysis of nuclear facilities. Attaining and sustaining excellence in nuclear facility design and management must be a core competency of the DOE. An important part of nuclear facility management is the ability to monitor facilities and rapidly assess their response and integrity after any major upset event. Experience in the western U.S. has shown that understanding facility integrity after a major earthquake is a significant challenge which, lacking key data, can require extensive effort and significant time. In the work described in the attached report, a transformational approach to earthquake monitoring of facilities is described and demonstrated. An entirely new type of optically-based sensor that can directly and accurately measure the earthquake-induced deformations of a critical facility has been developed and tested. This report summarizes large-scale shake table testing of the sensor concept on a representative steel frame building structure, and provides quantitative data on the accuracy of the sensor measurements.
Westfall, Jacob; Kenny, David A; Judd, Charles M
2014-10-01
Researchers designing experiments in which a sample of participants responds to a sample of stimuli are faced with difficult questions about optimal study design. The conventional procedures of statistical power analysis fail to provide appropriate answers to these questions because they are based on statistical models in which stimuli are not assumed to be a source of random variation in the data, models that are inappropriate for experiments involving crossed random factors of participants and stimuli. In this article, we present new methods of power analysis for designs with crossed random factors, and we give detailed, practical guidance to psychology researchers planning experiments in which a sample of participants responds to a sample of stimuli. We extensively examine 5 commonly used experimental designs, describe how to estimate statistical power in each, and provide power analysis results based on a reasonable set of default parameter values. We then develop general conclusions and formulate rules of thumb concerning the optimal design of experiments in which a sample of participants responds to a sample of stimuli. We show that in crossed designs, statistical power typically does not approach unity as the number of participants goes to infinity but instead approaches a maximum attainable power value that is possibly small, depending on the stimulus sample. We also consider the statistical merits of designs involving multiple stimulus blocks. Finally, we provide a simple and flexible Web-based power application to aid researchers in planning studies with samples of stimuli.
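The ceiling on power with a fixed stimulus sample can be illustrated with a small Monte Carlo sketch. This is not the authors' method or their default parameter values: the effect size, variance components, and the per-stimulus-means t-test are all assumptions chosen to make the plateau visible.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def power_sim(n_part, n_stim=20, d=0.5, sd_stim=1.0, sd_err=1.0, n_sims=500):
    """Monte Carlo power for a sketch design: half the stimuli are assigned
    to each condition and every participant responds to every stimulus.
    The test is run on per-stimulus means, treating stimuli as random."""
    half = n_stim // 2
    cond = np.r_[np.zeros(half), np.ones(half)]      # condition per stimulus
    hits = 0
    for _ in range(n_sims):
        stim_fx = rng.normal(0.0, sd_stim, n_stim)   # random stimulus effects
        # participant-by-stimulus noise collapses into the per-stimulus mean
        noise = rng.normal(0.0, sd_err, (n_part, n_stim)).mean(axis=0)
        stim_means = d * cond + stim_fx + noise
        _, p = stats.ttest_ind(stim_means[cond == 1], stim_means[cond == 0])
        hits += p < 0.05
    return hits / n_sims

# Power rises with participants but plateaus well below 1: the stimulus
# variance (sd_stim) never averages out; only the participant noise does.
for n in (10, 100, 1000):
    print(n, power_sim(n))
```

Increasing `n_stim` rather than `n_part` raises the attainable maximum, which is the article's central point about crossed designs.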
Technology: Catalyst for Enhancing Chemical Education for Pre-service Teachers
NASA Astrophysics Data System (ADS)
Kumar, Vinay; Bedell, Julia Yang; Seed, Allen H.
1999-05-01
A DOE/KYEPSCoR-funded project enabled us to introduce a new curricular initiative aimed at improving the chemical education of pre-service elementary teachers. The new curriculum was developed in collaboration with the School of Education faculty. A new course for the pre-service teachers, "Discovering Chemistry with Lab" (CHE 105), was developed. The integrated lecture and lab course covers basic principles of chemistry and their applications in daily life. The course promotes reasoning and problem-solving skills and utilizes hands-on, discovery/guided-inquiry, and cooperative learning approaches. This paper describes the implementation of technology (computer-interfacing and simulation experiments) in the lab. Results of two assessment surveys conducted in the laboratory are also discussed. The key features of the lab course are eight new experiments, including four computer-interfacing/simulation experiments involving the use of Macintosh Power PCs, temperature and pH probes, and a serial box interface, and use of household materials. Several experiments and the midterm and final lab practical exams emphasize the discovery/guided-inquiry approach. The results of pre- and post-surveys showed very significant positive changes in students' attitude toward the relevancy of chemistry, use of technology (computers) in elementary school classrooms, and designing and teaching discovery-based units. Most students indicated that they would be very interested (52%) or interested (36%) in using computers in their science teaching.
NASA Astrophysics Data System (ADS)
Yadav, B. K.; Tomar, J.; Harter, T.
2014-12-01
We investigate nitrate movement from non-point sources in deep, heterogeneous vadose zones, using multi-dimensional variably saturated flow and transport simulations. We hypothesize that porous media heterogeneity causes saturation variability that leads to preferential flow systems, such that a significant portion of the vadose zone does not contribute substantially to flow. We solve Richards' equation and the advection-dispersion equation to simulate soil moisture and nitrate transport regimes in plot-scale experiments conducted in the San Joaquin Valley, California. We compare equilibrium against non-equilibrium (dual-porosity) approaches. In the equilibrium approach we consider each soil layer to have unique hydraulic properties as a whole, while in the dual-porosity approach we assume that large fractions of the porous flow domain are immobile. However, we account for exchange of water and solute between the mobile and immobile zones using appropriate mass transfer terms. The results indicate that flow and transport in a nearly 16 m deep stratified vadose zone composed of eight layers of unconsolidated alluvium experience highly non-uniform, localized preferential flow and transport patterns leading to accelerated nitrate transfer. The equilibrium approach largely under-predicted the leaching of nitrate to groundwater, while the dual-porosity approach showed higher rates of nitrate leaching, consistent with field observations. The dual-porosity approach slightly over-predicted nitrogen storage in the vadose zone, which may be the result of limited matrix flow or denitrification not accounted for in the model. Results of this study may help to better predict fertilizer and pesticide retention times in deep vadose zones prior to recharge into the groundwater flow system. Keywords: Nitrate, Preferential flow, Heterogeneous vadose zone, Dual-porosity approach
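The dual-porosity (mobile-immobile) transport concept can be sketched in one dimension with an explicit finite-difference scheme. All parameter values below are illustrative placeholders, not the calibrated San Joaquin Valley properties, and the sketch omits Richards'-equation flow entirely (steady flux is assumed).

```python
import numpy as np

# 1-D mobile-immobile solute transport sketch:
#   th_m  * dCm/dt  = -q dCm/dz + th_m * D * d2Cm/dz2 - alpha * (Cm - Cim)
#   th_im * dCim/dt =                                   alpha * (Cm - Cim)
nz, dz, dt, nsteps = 100, 0.1, 0.01, 2000    # grid (m), step (d), 20 d total
q, D = 0.05, 0.01                            # flux (m/d), dispersion (m^2/d)
th_m, th_im, alpha = 0.25, 0.15, 0.02        # water contents, exchange rate (1/d)

c_m = np.zeros(nz)
c_im = np.zeros(nz)
c_m[0] = 1.0                                 # constant-concentration inlet

for _ in range(nsteps):
    cm = c_m.copy()
    grad = np.zeros(nz)
    grad[1:] = (cm[1:] - cm[:-1]) / dz                       # upwind gradient
    lap = np.zeros(nz)
    lap[1:-1] = (cm[2:] - 2.0 * cm[1:-1] + cm[:-2]) / dz**2  # dispersion term
    xfer = alpha * (cm - c_im)                               # first-order exchange
    c_m += dt * (-(q / th_m) * grad + D * lap - xfer / th_m)
    c_im += dt * xfer / th_im
    c_m[0] = 1.0                                             # re-impose inlet

# The immobile zone lags the mobile zone everywhere: solute races ahead in
# the mobile fraction, which is how dual-porosity models produce faster
# leaching at depth than the equilibrium model.
print(round(float(c_m[20]), 3), round(float(c_im[20]), 3))
```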
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fehrmann, Henning; Perdue, Robert
2012-07-01
Cementation of radioactive waste is a common technology. The waste is mixed with cement and water and forms a stable, solid block. Physical properties such as compressive strength and leachability depend strongly on the cement recipe. Because this waste-cement mixture has to fulfill special requirements, recipe development is necessary. The Six Sigma™ DMAIC methodology, together with the design of experiments (DoE) approach, was employed to optimize the process of recipe development for cementation at the Ling Ao nuclear power plant (NPP) in China. DMAIC offers a structured, systematic and traceable process for deriving test parameters. The DoE test plans and statistical analysis are efficient with respect to the number of test runs and the benefit gained by obtaining a transfer function. A transfer function enables simulation, which is useful for optimizing the later process and responding to changes. The DoE method was successfully applied to develop a cementation recipe for both evaporator concentrate and resin waste in the plant. The key input parameters were determined and evaluated, and control of these parameters was included in the design. The applied Six Sigma™ tools can help to organize the thinking during the engineering process. Data are organized and clearly presented. The many variables can be narrowed to the most important ones. The Six Sigma™ tools help to make the thinking and decision process traceable, and they support data-driven decisions (e.g., the C and E Matrix). But the tools are not the only golden way: results from scoring tools like the C and E Matrix need close review before use. The DoE is an effective tool for generating test plans. DoE can be used with a small number of test runs, yet gives a valuable result from an engineering perspective in the form of a transfer function. The DoE prediction results, however, are only valid within the tested region. So a careful selection of input parameters and their limits when setting up a DoE is very important. Extrapolation is not recommended because the results are not reliable outside the tested region. (authors)
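As a sketch of what such a DoE "transfer function" looks like, the snippet below fits a linear model with one interaction to a two-level full factorial in coded units. The factor names and response values are invented for illustration; they are not the Ling Ao recipe data.

```python
import numpy as np
from itertools import product

# 2^3 full factorial in coded units (-1/+1) for three hypothetical recipe
# factors A, B, C (e.g., water-cement ratio, waste loading, additive dose).
runs = np.array(list(product([-1.0, 1.0], repeat=3)))
A, B, C = runs.T

# Hypothetical measured response per run (e.g., compressive strength, MPa).
y = np.array([30.1, 33.9, 28.2, 36.0, 31.0, 35.2, 29.3, 37.1])

# Transfer function y = b0 + bA*A + bB*B + bC*C + bAB*A*B, fitted by least
# squares; the design is orthogonal, so each coefficient is a clean contrast.
X = np.column_stack([np.ones(len(y)), A, B, C, A * B])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(a, b, c):
    """Predicted response at a coded setting; valid only inside [-1, 1]."""
    return float(coef @ np.array([1.0, a, b, c, a * b]))

print(dict(zip(["b0", "A", "B", "C", "AB"], np.round(coef, 2))))
print(round(predict(0.5, -0.5, 1.0), 2))
```

Extending `X` with the remaining interactions follows the same pattern; as the abstract stresses, the fitted surface should not be extrapolated outside the coded [-1, 1] region.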
Lee, Yong Seung; Im, Young Jae; Shin, Sang Hee; Bascuna, Rosito T; Ha, Ji Yong; Han, Sang Won
2015-02-01
To report our experience of common sheath reimplantation (CSR) for ectopic ureterocele (EU) combined with ureteral duplication, describing success rates and postoperative complications, along with risk factors for developing postoperative incontinence. When the upper tract approach is not indicated in patients with EU, a bladder-level approach, involving either CSR or total reconstruction, is the remaining option. However, concerns exist about the high morbidity of bladder-level approaches. We retrospectively examined the postoperative results of 39 patients who underwent CSR between January 2001 and December 2012. Risk factors for the development of postoperative incontinence and decreases in differential renal function (DRF) were additionally analyzed. The median age at operation was 16.5 months. After CSR, upper urinary tract dilatation decreased in 36 patients (92.3%). During a median follow-up of 75.9 months, an additional operation was required in 7 patients (17.9%). Postoperative incontinence developed in 3 patients (7.7%). Median preoperative DRF was significantly lower in the postoperative incontinence group (P = .004). DRF decreased postoperatively in 5 of 36 patients (13.9%). No preoperative factors were related to the decrease in DRF. No patient developed hypertension or proteinuria. CSR decompressed the upper urinary tract effectively in our EU patients. Postoperative incontinence appears to be related not to operative factors but to preoperative DRF. When the upper tract approach is not indicated, CSR is a reasonable alternative. Total reconstruction is unnecessary, as the remnant upper-pole kidney after CSR does not lead to complications. Copyright © 2015 Elsevier Inc. All rights reserved.
An Analytical Calibration Approach for the Polarimetric Airborne C Band Radiometer
NASA Technical Reports Server (NTRS)
Pham, Hanh; Kim, Edward J.
2004-01-01
Passive microwave remote sensing is sensitive to the quantity and distribution of water in soil and vegetation. During summer 2000, the Microwave Geophysics Group at the University of Michigan conducted the seventh Radiobrightness Energy Balance Experiment (REBEX-7) over a corn canopy in Michigan. Long time series of brightness temperatures, soil moisture and micrometeorology on the plot were taken. This paper addresses the calibration of the NASA GSFC polarimetric airborne C band microwave radiometer (ACMR) that participated in REBEX-7. These passive polarimeters are typically calibrated using an end-to-end approach based upon a standard artificial target or a well-known geophysical target. Analyzing the major internal functional subsystems offers a different perspective. The primary goal of this approach is to provide a transfer function that not only describes the system in its entirety but also accounts for the contributions of each subsystem toward the final modified Stokes parameters. This approach does not assume that the radiometric system is linear, as it does not take polarization isolation for granted, and it also serves as a realistic instrument simulator, a useful tool for future designs. The ACMR architecture can be partitioned into functional subsystems. The characteristics of each subsystem were extensively measured, and the estimated parameters were imported into the overall closed-form system model. Inversion of the model yields a calibration for the modeled Stokes parameters with uncertainties of 0.2 K for the V and H polarizations and 2.4 K for the 3rd and 4th parameters. Application to the full Stokes parameters over a senescent cornfield is presented.
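The subsystem-model inversion can be illustrated with a toy linear forward model. The gain matrix, offsets, and scene values below are hypothetical, and the real ACMR model is explicitly not assumed to be linear; this sketch only shows the calibrate-by-inversion idea.

```python
import numpy as np

# Toy forward model of a polarimetric radiometer: raw channel outputs
# v = G @ s + o, where s = (Tv, Th, T3, T4) are modified Stokes parameters.
# Off-diagonal terms mimic imperfect polarization isolation between channels.
G = np.array([[1.02, 0.01, 0.00, 0.00],
              [0.02, 0.98, 0.00, 0.00],
              [0.01, 0.01, 0.90, 0.05],
              [0.00, 0.02, 0.04, 0.92]])
o = np.array([5.0, 4.0, 1.0, 1.0])            # channel offsets (K)

def calibrate(v):
    """Invert the subsystem model to recover the Stokes parameters."""
    return np.linalg.solve(G, v - o)

s_true = np.array([250.0, 230.0, 2.0, -1.0])  # hypothetical scene (K)
v_raw = G @ s_true + o                        # simulated instrument output
print(np.round(calibrate(v_raw), 2))          # recovers the scene values
```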
Ingvarsson, Pall Thor; Yang, Mingshi; Mulvad, Helle; Nielsen, Hanne Mørck; Rantanen, Jukka; Foged, Camilla
2013-11-01
The purpose of this study was to identify and optimize spray drying parameters of importance for the design of an inhalable powder formulation of a cationic liposomal adjuvant composed of dimethyldioctadecylammonium (DDA) bromide and trehalose-6,6'-dibehenate (TDB). A quality by design (QbD) approach was applied to identify and link critical process parameters (CPPs) of the spray drying process to critical quality attributes (CQAs) using risk assessment and design of experiments (DoE), followed by identification of an optimal operating space (OOS). A central composite face-centered design was carried out, followed by multiple linear regression analysis. Four CQAs were identified: the mass median aerodynamic diameter (MMAD), the liposome stability (size) during processing, the moisture content and the yield. Five CPPs (drying airflow, feed flow rate, feedstock concentration, atomizing airflow and outlet temperature) were identified and tested in a systematic way. The MMAD and the yield were successfully modeled. For the liposome size stability, the ratio between the size after and before spray drying was modeled successfully. The model for the residual moisture content was poor, although the moisture content was below 3% in the entire design space. Finally, the OOS was drafted from the constructed models for the spray drying of trehalose-stabilized DDA/TDB liposomes. The QbD approach for the spray drying process should include a careful consideration of the quality target product profile. This approach, implementing risk assessment and DoE, was successfully applied to optimize the spray drying of an inhalable DDA/TDB liposomal adjuvant designed for pulmonary vaccination.
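For reference, a face-centered central composite design of the kind named above (axial distance alpha = 1) is easy to enumerate in coded units. The run counts are standard for the design type; the study's actual run matrix, factor ranges, and center-point count are not given here, so `n_center = 3` is an assumption.

```python
import numpy as np
from itertools import product

def ccf_design(k: int, n_center: int = 3) -> np.ndarray:
    """Coded runs of a face-centered central composite (CCF) design:
    2**k factorial corners, 2*k axial points on the cube faces (alpha = 1),
    and n_center replicated center points."""
    corners = np.array(list(product([-1.0, 1.0], repeat=k)))
    axial = np.vstack([sign * np.eye(k)[i]
                       for i in range(k) for sign in (-1.0, 1.0)])
    center = np.zeros((n_center, k))
    return np.vstack([corners, axial, center])

design = ccf_design(5)      # five CPPs, as in the abstract
print(design.shape)         # 2**5 + 2*5 + 3 = 45 runs -> (45, 5)
```

A quadratic response-surface model (main effects, two-factor interactions, squared terms) can then be fitted to the measured CQAs by multiple linear regression, as in the study.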
Opoku-Boateng, Gloria A.
2015-01-01
User frustration research has been one way of looking into clinicians’ experience with health information technology use and interaction. In order to understand how clinician frustration with Health Information Technology (HIT) use occurs, there is a need to explore Human-Computer Interaction (HCI) literature that addresses both frustration and HIT use. In the past three decades, HCI frustration research has increased and expanded, and researchers have done considerable work to understand emotions, end-user frustration and affect. This paper uses a historical literature review approach to trace the origins of emotion and frustration research and explore the research question: does HCI research on frustration provide insights on clinicians’ frustration with HIT interfaces? The literature review shows that HCI research on emotion and frustration provides additional insights that can indeed help explain user frustration in HIT. Different approaches and HCI perspectives also help frame HIT user frustration research as well as inform HIT system design. The paper concludes with suggested directions for future design and research. PMID:26958238
Does the Addition of Inert Gases at Constant Volume and Temperature Affect Chemical Equilibrium?
ERIC Educational Resources Information Center
Paiva, Joao C. M.; Goncalves, Jorge; Fonseca, Susana
2008-01-01
In this article we examine three approaches, leading to different conclusions, for answering the question "Does the addition of inert gases at constant volume and temperature modify the state of equilibrium?" In the first approach, the answer is yes, as a result of a common alternative conception among students; the second approach, valid only for ideal…
Fricke, Jens; Pohlmann, Kristof; Jonescheit, Nils A; Ellert, Andree; Joksch, Burkhard; Luttmann, Reiner
2013-06-01
The identification of optimal expression conditions for state-of-the-art production of pharmaceutical proteins is a very time-consuming and expensive process. In this report a method for rapid and reproducible optimization of protein expression in an in-house designed small-scale BIOSTAT® multi-bioreactor plant is described. A newly developed BioPAT® MFCS/win Design of Experiments (DoE) module (Sartorius Stedim Systems, Germany) connects the process control system MFCS/win and the DoE software MODDE® (Umetrics AB, Sweden) and therefore enables the implementation of fully automated optimization procedures. As a proof of concept, a commercial Pichia pastoris strain KM71H was transformed for the expression of potential malaria vaccines. This approach allowed a doubling of intact protein secretion productivity through the DoE optimization procedure compared to initial cultivation results. In a next step, robustness with regard to sensitivity to process parameter variability was proven around the determined optimum. Thereby, a significantly improved pharmaceutical production process was established within seven 24-hour cultivation cycles. Specifically, regarding the regulatory demands pointed out in the process analytical technology (PAT) initiative of the United States Food and Drug Administration (FDA), the combination of a highly instrumented, fully automated multi-bioreactor platform with proper cultivation strategies and extended DoE software solutions opens up promising benefits and opportunities for pharmaceutical protein production. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
ERIC Educational Resources Information Center
Bradford, Gyndolyn
2017-01-01
The idea of mentoring in higher education is considered a good thing for students and faculty. What is missing in the research is how mentoring influences and shapes the student experience, whether mentoring helps retention, and how it contributes to student development (Crisp, Baker, Griffin, Lunsford, & Pifer, 2017). The mentoring relationship…
Robust audio-visual speech recognition under noisy audio-video conditions.
Stewart, Darryl; Seymour, Rowan; Pass, Adrian; Ming, Ji
2014-02-01
This paper presents the maximum weighted stream posterior (MWSP) model as a robust and efficient stream integration method for audio-visual speech recognition in environments where the audio or video streams may be subjected to unknown and time-varying corruption. A significant advantage of MWSP is that it does not require any specific measurements of the signal in either stream to calculate appropriate stream weights during recognition, and as such it is modality-independent. This also means that MWSP complements and can be used alongside many of the other approaches that have been proposed in the literature for this problem. For evaluation we used the large XM2VTS database for speaker-independent audio-visual speech recognition. The extensive tests include both clean and corrupted utterances with corruption added in either/both the video and audio streams using a variety of types (e.g., MPEG-4 video compression) and levels of noise. The experiments show that this approach gives excellent performance in comparison to another well-known dynamic stream weighting approach and also compared to any fixed-weighted integration approach, both in clean conditions and when noise is added to either stream. Furthermore, our experiments show that the MWSP approach dynamically selects suitable integration weights on a frame-by-frame basis according to the level of noise in the streams and also according to the naturally fluctuating relative reliability of the modalities even in clean conditions. The MWSP approach is shown to maintain robust recognition performance in all tested conditions, while requiring no prior knowledge about the type or level of noise.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Deeb, Rula A.; Hawley, Elisabeth L.
The goal of United States (U.S.) Department of Energy's (DOE)'s environmental remediation programs is to restore groundwater to beneficial use, similar to many other Federal and state environmental cleanup programs. Based on past experience, groundwater remediation to pre-contamination conditions (i.e., drinking water standards or non-detectable concentrations) can be successfully achieved at many sites. At a subset of the most complex sites, however, complete restoration is not likely achievable within the next 50 to 100 years using today's technology. This presentation describes several approaches used at complex sites in the face of these technical challenges. Many complex sites adopted a long-term management approach, whereby contamination was contained within a specified area using active or passive remediation techniques. Consistent with the requirements of their respective environmental cleanup programs, several complex sites selected land use restrictions and used risk management approaches to accordingly adopt alternative cleanup goals (alternative endpoints). Several sites used long-term management designations and approaches in conjunction with the alternative endpoints. Examples include various state designations for groundwater management zones, technical impracticability (TI) waivers or greater risk waivers at Superfund sites, and the use of Monitored Natural Attenuation (MNA) or other passive long-term management approaches over long time frames. This presentation will focus on findings, statistics, and case studies from a recently-completed report for the Department of Defense's Environmental Security Technology Certification Program (ESTCP) (Project ER-0832) on alternative endpoints and approaches for groundwater remediation at complex sites under a variety of Federal and state cleanup programs.
The primary objective of the project was to provide environmental managers and regulators with tools, metrics, and information needed to evaluate alternative endpoints for groundwater remediation at complex sites. A statistical analysis of Comprehensive Environmental Response, Compensation and Liability Act (CERCLA) sites receiving TI waivers will be presented, as well as case studies of other types of alternative endpoints and alternative remedial strategies that illustrate the variety of approaches used at complex sites and the technical analyses used to predict and document cost, time frame, and potential remedial effectiveness. This presentation is intended to inform DOE program managers, state regulators, practitioners and other stakeholders who are evaluating technical cleanup challenges within their own programs and establishing programmatic approaches to evaluating and implementing long-term management approaches. Case studies provide examples of long-term management designations and strategies to manage and remediate groundwater at complex sites. At least 13 states consider some designation for groundwater containment in their corrective action policies, such as groundwater management zones, containment zones, and groundwater classification exemption areas. Long-term management designations are not a way to 'do nothing' or walk away from a site. Instead, soil and groundwater within the zone are managed to be protective of human health and the environment. Understanding when and how to adopt a long-term management approach can lead to cost savings and the more efficient use of resources across DOE and at numerous other industrial and military sites across the U.S. This presentation provides context for assessing the use and appropriate role of alternative endpoints and supporting long-term management designations in final remedies. (authors)
Karakucuk, Alptug; Celebi, Nevin; Teksin, Zeynep Safak
2016-12-01
The objective of this study was to prepare ritonavir (RTV) nanosuspensions, an anti-HIV protease inhibitor, to address its poor water solubility. The microfluidization method with a pre-treatment step was used to obtain the nanosuspensions. A Design of Experiments (DoE) approach was applied to understand the effect of the critical formulation parameters, which were selected as polymer type (HPMC or PVP), RTV-to-polymer ratio, and number of passes. Interactions between the formulation variables were evaluated by univariate ANOVA. Particle size, particle size distribution and zeta potential were selected as dependent variables. Scanning electron microscopy, X-ray powder diffraction, and differential scanning calorimetry were performed for the in vitro characterization after lyophilization of the optimum nanosuspension formulation. The saturation solubility was examined in comparison with the coarse powder, physical mixture and nanosuspension. In vitro dissolution studies were conducted using polyoxyethylene 10 lauryl ether (POE10LE) and biorelevant media (FaSSIF and FeSSIF). The results showed the nanosuspensions were partially amorphous and spherically shaped, with particle sizes ranging from 400 to 600 nm. Moreover, particle size distribution values of 0.1-0.4 and zeta potential values of about -20 mV were obtained. The nanosuspension showed significantly increased solubility compared to the coarse powder (3.5-fold). Coarse powder, physical mixture, nanosuspension and the commercial product dissolved completely in POE10LE; however, cumulative dissolved values reached only ~20% in FaSSIF for the commercial product and nanosuspension. The nanosuspension showed more than 90% drug dissolved in FeSSIF, compared to ~50% for the commercial product in the same medium. RTV dissolution was increased by the nanosuspension formulation.
We concluded that the DoE approach is useful for developing nanosuspension formulations that improve the solubility and dissolution rate of RTV. Copyright © 2016 Elsevier B.V. All rights reserved.
Hubert, C; Houari, S; Rozet, E; Lebrun, P; Hubert, Ph
2015-05-22
When using an analytical method, defining an analytical target profile (ATP) focused on quantitative performance represents a key input, and this will drive the method development process. In this context, two case studies were selected in order to demonstrate the potential of a quality-by-design (QbD) strategy when applied to two specific phases of the method lifecycle: the pre-validation study and the validation step. The first case study focused on the improvement of a liquid chromatography (LC) coupled to mass spectrometry (MS) stability-indicating method by means of the QbD concept. The design of experiments (DoE) conducted during the optimization step (i.e., determination of the qualitative design space (DS)) was performed a posteriori. Additional experiments were performed in order to simultaneously conduct the pre-validation study to assist in defining the DoE to be conducted during the formal validation step. This predicted protocol was compared to the one used during the formal validation. A second case study, based on the LC/MS-MS determination of glucosamine and galactosamine in human plasma, was considered in order to illustrate an innovative strategy allowing the QbD methodology to be incorporated during the validation phase. An operational space, defined by the qualitative DS, was considered during the validation process rather than a specific set of working conditions, as conventionally done. Results for all the validation parameters conventionally studied were compared to those obtained with this innovative approach for glucosamine and galactosamine. Using this strategy, both qualitative and quantitative information was obtained. Consequently, an analyst using this approach would be able to select with great confidence several working conditions within the operational space, rather than a given condition, for the routine use of the method. This innovative strategy combines both a learning process and a thorough assessment of the risk involved.
Copyright © 2015 Elsevier B.V. All rights reserved.
Experimental study designs to improve the evaluation of road mitigation measures for wildlife.
Rytwinski, Trina; van der Ree, Rodney; Cunnington, Glenn M; Fahrig, Lenore; Findlay, C Scott; Houlahan, Jeff; Jaeger, Jochen A G; Soanes, Kylie; van der Grift, Edgar A
2015-05-01
An experimental approach to road mitigation that maximizes inferential power is essential to ensure that mitigation is both ecologically-effective and cost-effective. Here, we set out the need for and standards of using an experimental approach to road mitigation, in order to improve knowledge of the influence of mitigation measures on wildlife populations. We point out two key areas that need to be considered when conducting mitigation experiments. First, researchers need to get involved at the earliest stage of the road or mitigation project to ensure the necessary planning and funds are available for conducting a high quality experiment. Second, experimentation will generate new knowledge about the parameters that influence mitigation effectiveness, which ultimately allows better prediction for future road mitigation projects. We identify seven key questions about mitigation structures (i.e., wildlife crossing structures and fencing) that remain largely or entirely unanswered at the population-level: (1) Does a given crossing structure work? What type and size of crossing structures should we use? (2) How many crossing structures should we build? (3) Is it more effective to install a small number of large-sized crossing structures or a large number of small-sized crossing structures? (4) How much barrier fencing is needed for a given length of road? (5) Do we need funnel fencing to lead animals to crossing structures, and how long does such fencing have to be? (6) How should we manage/manipulate the environment in the area around the crossing structures and fencing? (7) Where should we place crossing structures and barrier fencing? 
We provide experimental approaches to answering each of them using example Before-After-Control-Impact (BACI) study designs for two stages in the road/mitigation project where researchers may become involved: (1) at the beginning of a road/mitigation project, and (2) after the mitigation has been constructed, highlighting real case studies when available. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Childs, D. W.
1983-01-01
An improved theory for the prediction of the rotordynamic coefficients of turbulent annular seals was developed. Predictions from the theory are compared to the experimental results, and an approach for the direct calculation of empirical turbulent coefficients from test data is introduced. An improved short seal solution is shown to do a better job of calculating effective stiffness and damping coefficients than either the original short seal solution or a finite length solution. However, the original short seal solution does a much better job of predicting the equivalent added mass coefficient.
Lunnon, R J
1989-07-01
Unfortunately, the training of medical photographers and artists does not include financial management, but in the 'cost effective' atmosphere of both the NHS and universities today we must learn to approach the subject in a manner likely to satisfy our financial masters. The experience gained in introducing a unit cost system and using it for 17 years will be reviewed but, more importantly, as the system has been adopted and adapted by so many, the benefits of using such a system and changing it to concur with current financial thinking will also be discussed. Are we in such a Korner as we might think?
Oratoria Online: The Use of Technology Enhanced Learning to Improve Students' Oral Skills
NASA Astrophysics Data System (ADS)
Dornaleteche, Jon
New ICTs have proven to be useful tools for implementing innovative didactic and pedagogical formulas oriented to enhancing students' and teachers' creativity. The up-and-coming massive e-learning and blended learning projects are clear examples of this phenomenon. The teaching of oral communication offers a perfect scenario in which to experiment with these formulas. Since the traditional face-to-face approach to teaching 'Speech techniques' does not keep up with the new digital environment that surrounds students, it is necessary to move towards an 'Online oratory' model focused on using TEL to improve oral skills.
No news without new scientific ideas.
Loonen, Anton J M
2014-04-01
In this editorial, it is strongly advocated that a change of policy is warranted in order to prevent neuroscience from becoming a waste of time and money in the 21st century. Repeating the same trick in different patient populations and perusing the scientific literature seems to currently be the backbone of medical science. However, this approach does not provide knowledge on how the brain works or how specific dysfunctions result in specific diseases. Therefore, earlier findings should, first, be combined to develop new theories on the mechanics of the mind, and, second, these new ideas should be tested in well-designed experiments.
Particle size reduction of propellants by cryocycling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whinnery, L.; Griffiths, S.; Lipkin, J.
1995-05-01
Repeated exposure of a propellant to liquid nitrogen causes thermal stress gradients within the material resulting in cracking and particle size reduction. This process is termed cryocycling. The authors conducted a feasibility study, combining experiments on both inert and live propellants with three modeling approaches. These models provided optimized cycle times, predicted ultimate particle size, and allowed crack behavior to be explored. Process safety evaluations conducted separately indicated that cryocycling does not increase the sensitivity of the propellants examined. The results of this study suggest that cryocycling is a promising technology for the demilitarization of tactical rocket motors.
Locality, reflection, and wave-particle duality
NASA Astrophysics Data System (ADS)
Mugur-Schächter, Mioara
1987-08-01
Bell's theorem is believed to establish that the quantum mechanical predictions do not generally admit a causal representation compatible with Einstein's principle of separability, thereby proving incompatibility between quantum mechanics and relativity. This interpretation is contested via two convergent approaches which lead to a sharp distinction between quantum nonseparability and violation of Einstein's theory of relativity. In a first approach we explicate from the quantum mechanical formalism a concept of “reflected dependence.” Founded on this concept, we produce a causal representation of the quantum mechanical probability measure involved in Bell's proof, which is clearly separable in Einstein's sense, i.e., it does not involve supraluminal velocities, and nevertheless is “nonlocal” in Bell's sense. So Bell locality and Einstein separability are distinct qualifications, and Bell nonlocality (or Bell nonseparability) and Einstein separability are not incompatible. It is then proved explicitly that with respect to the mentioned representation Bell's derivation does not hold. So Bell's derivation does not establish that any Einstein-separable representation is incompatible with quantum mechanics. This first—negative—conclusion is a syntactic fact. The characteristics of the representation and of the reasoning involved in the mentioned counterexample to the usual interpretation of Bell's theorem suggest that the representation used—notwithstanding its ability to bring forth the specified syntactic fact—is not factually true. Factual truth and syntactic properties also have to be radically distinguished in their turn. So, in a second approach, starting from de Broglie's initial relativistic model of a microsystem, a deeper, factually acceptable representation is constructed.
The analyses leading to this second representation show that quantum mechanics does indeed involve basically a certain sort of nonseparability, called here de Broglie-Bohr quantum nonseparability. But the de Broglie-Bohr quantum nonseparability is shown to stem directly from the relativistic character of the considerations which led Louis de Broglie to the fundamental relation p = h/λ, thereby being essentially consistent with relativity. As to Einstein separability, it appears to be a still insufficiently specified concept whose future, improved specification will probably be explicitly harmonizable with the de Broglie-Bohr quantum nonseparability. The ensemble of the conclusions obtained here brings forth a new concept of causality, a concept of folded, zigzag, reflexive causality, with respect to which the type of causality conceived of up to now appears as a particular case of outstretched, one-way causality. The reflexive causality is found compatible with the results of Aspect's experiment, and it suggests new experiments. Considered globally, the conclusions obtained in the present work might convert the conceptual situation created by Bell's proof into a process of unification of quantum mechanics and relativity.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kilpatrick, Laura E.; Cotter, Ed
The U.S. Department of Energy (DOE) Office of Legacy Management is responsible for administering the DOE Uranium Leasing Program (ULP) and its 31 uranium lease tracts located in the Uravan Mineral Belt of southwestern Colorado (see Figure 1). In addition to administering the ULP for the last six decades, DOE has also undertaken the significant task of reclaiming a large number of abandoned uranium (legacy) mine sites and associated features located throughout the Uravan Mineral Belt. In 1995, DOE initiated a 3-year reconnaissance program to locate and delineate (through extensive on-the-ground mapping) the legacy mine sites and associated features contained within the historically defined boundaries of its uranium lease tracts. During that same time frame, DOE recognized the lack of regulations pertaining to the reclamation of legacy mine sites and contacted the U.S. Bureau of Land Management (BLM) concerning the reclamation of legacy mine sites. In November 1995, the BLM Colorado State Office formally issued the United States Department of the Interior, Colorado Bureau of Land Management, Closure/Reclamation Guidelines, Abandoned Uranium Mine Sites as a supplement to its Solid Minerals Reclamation Handbook (H-3042-1). Over the next five-and-one-half years, DOE reclaimed the 161 legacy mine sites that had been identified on DOE withdrawn lands. By the late 1990s, the various BLM field offices in southwestern Colorado began to recognize DOE's experience and expertise in reclaiming legacy mine sites. During the ensuing 8 years, BLM funded DOE (through a series of task orders) to perform reclamation activities at 182 BLM mine sites. To date, DOE has reclaimed 372 separate and distinct legacy mine sites. During this process, DOE has learned many lessons and is willing to share those lessons with others in the reclamation industry because there are still many legacy mine sites not yet reclaimed.
DOE currently administers 31 lease tracts (11,017 ha) that collectively contain over 220 legacy (abandoned) uranium mine sites. This contrasts to the millions of hectares administered by the BLM, the U.S. Forest Service, and other federal, tribal, and state agencies that contain thousands of such sites. DOE believes that the processes it has used provide a practical and cost-effective approach to abandoned uranium mine-site reclamation. Although the Federal Acquisition Regulations preclude DOE from competing with private industry, DOE is available to assist other governmental and tribal agencies in their reclamation efforts. (authors)
Cancer patient experience with navigation service in an urban hospital setting: a qualitative study.
Gotlib Conn, L; Hammond Mobilio, M; Rotstein, O D; Blacker, S
2016-01-01
Cancer patient navigators are increasingly present on the oncology health care team. The positive impact of navigation on cancer care is recognised, yet a clear understanding of what the patient navigator does and how he/she executes the role continues to emerge. This study aimed to understand cancer patients' perceptions of, and experiences with, patient navigation, exploring how navigation may enhance the patient experience in an urban hospital setting where patients with varying needs are treated. A qualitative study using a constructionist approach was conducted. Fifteen colorectal cancer patients participated in semi-structured telephone interviews. Data were analyzed inductively and iteratively. Findings provide insight into two central aspects of cancer navigation: navigation as patient-centred coordination and explanation of clinical care, and navigation as individualised, holistic support. Within these themes, the key benefits of navigation from the patients' perspective were demystifying the system; ensuring comprehension; managing expectations; and delivering patient-centred care. The navigator provided individualised and extended family support, took a holistic approach, and addressed emotional and psychological needs. These findings provide a means to operationalise and validate an emerging role description and competency framework for the cancer navigator, who must identify and adapt to patients' varying needs throughout the cancer care continuum. © 2014 John Wiley & Sons Ltd.
Airfoil Shape Optimization based on Surrogate Model
NASA Astrophysics Data System (ADS)
Mukesh, R.; Lingadurai, K.; Selvakumar, U.
2018-02-01
Engineering design problems always require enormous amounts of real-time experiments and computational simulations in order to assess and ensure that the design objectives are met subject to various constraints. In most cases, the computational resources and time required per simulation are large. In cases such as sensitivity analysis and design optimisation, where thousands or millions of simulations have to be carried out, this places an enormous burden on designers. Nowadays approximation models, otherwise known as surrogate models (SM), are widely employed in order to reduce the requirement of computational resources and time in analysing various engineering systems. Various approaches such as Kriging, neural networks, polynomials, and Gaussian processes are used to construct the approximation models. The primary intention of this work is to employ the k-fold cross-validation approach to study and evaluate the influence of various theoretical variogram models on the accuracy of the surrogate model construction. Ordinary Kriging and design of experiments (DoE) approaches are used to construct the SMs by approximating panel and viscous solution algorithms, which are primarily used to solve the flow around airfoils and aircraft wings. The method of coupling the SMs with a suitable optimisation scheme to carry out an aerodynamic design optimisation process for airfoil shapes is also discussed.
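The surrogate idea above can be sketched in a few lines: fit a Gaussian-kernel (Kriging-style) interpolant to a handful of samples of an expensive solver, then score it with k-fold cross-validation. Here a cheap analytic function stands in for the flow solver, and the length scale, nugget, and test function are illustrative assumptions, not values from the study:

```python
import numpy as np

def rbf_surrogate(X, y, length_scale=0.2, nugget=1e-8):
    # Gaussian-kernel interpolant: solve (K + nugget*I) w = y
    K = np.exp(-((X[:, None] - X[None, :]) ** 2) / (2 * length_scale ** 2))
    w = np.linalg.solve(K + nugget * np.eye(len(X)), y)
    def predict(x):
        k = np.exp(-((x - X) ** 2) / (2 * length_scale ** 2))
        return float(k @ w)
    return predict

def kfold_rmse(X, y, k=5, **kw):
    # k-fold cross-validation of the surrogate's prediction error
    idx = np.arange(len(X))
    errs = []
    for fold in np.array_split(idx, k):
        train = np.setdiff1d(idx, fold)
        pred = rbf_surrogate(X[train], y[train], **kw)
        errs.extend((pred(X[i]) - y[i]) ** 2 for i in fold)
    return float(np.sqrt(np.mean(errs)))

# Cheap stand-in for an expensive solver: a lift-like response curve
X = np.linspace(0.0, 1.0, 21)
y = np.sin(2 * np.pi * X) + 0.5 * X
model = rbf_surrogate(X, y)
```

In the same spirit, the study's comparison of variogram models amounts to repeating `kfold_rmse` with different kernel choices and keeping the one with the lowest cross-validation error.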
Wu, Jinlu
2013-01-01
Laboratory education can play a vital role in developing a learner's autonomy and scientific inquiry skills. In an innovative, mutation-based learning (MBL) approach, students were instructed to redesign a teacher-designed standard experimental protocol by a “mutation” method in a molecular genetics laboratory course. Students could choose to delete, add, reverse, or replace certain steps of the standard protocol to explore questions of interest to them in a given experimental scenario. They wrote experimental proposals to address their rationales and hypotheses for the “mutations”; conducted experiments in parallel, according to both standard and mutated protocols; and then compared and analyzed results to write individual lab reports. Various autonomy-supportive measures were provided in the entire experimental process. Analyses of student work and feedback suggest that students using the MBL approach 1) spend more time discussing experiments, 2) use more scientific inquiry skills, and 3) find the increased autonomy afforded by MBL more enjoyable than do students following regimented instructions in a conventional “cookbook”-style laboratory. Furthermore, the MBL approach does not incur an obvious increase in labor and financial costs, which makes it feasible for easy adaptation and implementation in a large class. PMID:24006394
Fairclough, Stuart J.; Knowles, Zoe R.; Boddy, Lynne M.
2017-01-01
Understanding family physical activity (PA) behaviour is essential for designing effective family-based PA interventions. However, effective approaches to capture the perceptions and “lived experiences” of families are not yet well established. The aims of the study were to: (1) demonstrate how a “write, draw, show and tell” (WDST) methodological approach can be appropriate to family-based PA research, and (2) present two distinct family case studies to provide insights into the habitual PA behaviour and experiences of a nuclear and single-parent family. Six participants (including two “target” children aged 9–11 years, two mothers and two siblings aged 6–8 years) from two families were purposefully selected to take part in the study, based on their family structure. Participants completed a paper-based PA diary and wore an ActiGraph GT9X accelerometer on their left wrist for up to 10 weekdays and 16 weekend days. A range of WDST tasks were then undertaken by each family to offer contextual insight into their family-based PA. The selected families participated in different levels and modes of PA, and reported contrasting leisure opportunities and experiences. These novel findings encourage researchers to tailor family-based PA intervention programmes to the characteristics of the family. PMID:28708114
Efficient Monte Carlo Methods for Biomolecular Simulations.
NASA Astrophysics Data System (ADS)
Bouzida, Djamal
A new approach to efficient Monte Carlo simulations of biological molecules is presented. By relaxing the usual restriction to Markov processes, we are able to optimize performance while dealing directly with the inhomogeneity and anisotropy inherent in these systems. The advantage of this approach is that we can introduce a wide variety of Monte Carlo moves to deal with complicated motions of the molecule, while maintaining full optimization at every step. This enables the use of a variety of collective rotational moves that relax long-wavelength modes. We were able to show by explicit simulations that the resulting algorithms substantially increase the speed of the simulation while reproducing the correct equilibrium behavior. This approach is particularly intended for simulations of macromolecules, although we expect it to be useful in other situations. The dynamic optimization of the new Monte Carlo methods makes them very suitable for simulated annealing experiments on all systems whose state space is continuous in general, and to the protein folding problem in particular. We introduce an efficient annealing schedule using preferential bias moves. Our simulated annealing experiments yield structures whose free energies were lower than the equilibrated X-ray structure, which leads us to believe that the empirical energy function used does not fully represent the interatomic interactions. Furthermore, we believe that the largest discrepancies involve the solvent effects in particular.
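A toy illustration of the idea of tuning Monte Carlo moves on the fly: a 1D Metropolis sampler whose step size is adapted toward a target acceptance rate. This is a simplified stand-in for the dynamically optimized, non-Markovian moves described above, not the authors' algorithm; the harmonic potential and adaptation schedule are assumptions for illustration:

```python
import math
import random

def adaptive_metropolis(energy, x0=0.0, beta=1.0, steps=20000, target=0.5, seed=7):
    """Metropolis sampling with the trial step size dx adapted
    periodically toward a target acceptance rate."""
    rng = random.Random(seed)
    x, dx, accepted = x0, 1.0, 0
    samples = []
    for i in range(1, steps + 1):
        trial = x + rng.uniform(-dx, dx)
        d_e = energy(trial) - energy(x)
        # Standard Metropolis acceptance criterion
        if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
            x, accepted = trial, accepted + 1
        samples.append(x)
        if i % 200 == 0:  # widen/narrow moves toward the target rate
            dx *= 1.1 if accepted / i > target else 0.9
    return samples

# Harmonic potential: equilibrium should give <x^2> = 1/beta = 1
samples = adaptive_metropolis(lambda x: 0.5 * x * x)
var = sum(s * s for s in samples) / len(samples)
```

The adaptation mildly violates detailed balance; schemes like the one in the abstract are designed to retain correct equilibrium behavior while still optimizing each move.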
Incorporating Experience Curves in Appliance Standards Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garbesi, Karina; Chan, Peter; Greenblatt, Jeffery
2011-10-31
The technical analyses in support of U.S. energy conservation standards for residential appliances and commercial equipment have typically assumed that manufacturing costs and retail prices remain constant during the projected 30-year analysis period. There is, however, considerable evidence that this assumption does not reflect real market prices. Costs and prices generally fall in relation to cumulative production, a phenomenon known as the experience effect and modeled by a fairly robust empirical experience curve. Using price data from the Bureau of Labor Statistics, and shipment data obtained as part of the standards analysis process, we present U.S. experience curves for room air conditioners, clothes dryers, central air conditioners, furnaces, and refrigerators and freezers. These allow us to develop more representative appliance price projections than the assumption-based approach of constant prices. These experience curves were incorporated into recent energy conservation standards for these products. The impact on the national modeling can be significant, often increasing the net present value of potential standard levels in the analysis. In some cases a previously cost-negative potential standard level demonstrates a benefit when incorporating experience. These results imply that past energy conservation standards analyses may have undervalued the economic benefits of potential standard levels.
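The experience-curve relationship described above is a power law: price falls by a fixed factor (the "progress ratio") with each doubling of cumulative production. A minimal sketch; the starting price, cumulative production, and 80% progress ratio are invented for illustration, not fitted values from the report:

```python
import math

def experience_price(p0, q0, q, progress_ratio):
    """Projected price after cumulative production q, given price p0 at q0.
    Each doubling of cumulative production multiplies price by progress_ratio."""
    b = -math.log2(progress_ratio)  # experience-curve exponent
    return p0 * (q / q0) ** (-b)

# Hypothetical: a $500 appliance at 1M cumulative units, 80% progress ratio
print(experience_price(500.0, 1e6, 2e6, 0.80))  # one doubling -> 400.0
```

Plugging such a projection into the standards analysis replaces the constant-price assumption with prices that decline as shipments accumulate over the 30-year period.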
Experiences of stress among nurses in acute mental health settings.
Currid, Thomas
To explore occupational stressors, the lived experience of stress and the meaning of this experience for staff working in acute mental health care. The study adopted a hermeneutic phenomenological approach to ascertain the lived experience of stress among eight qualified staff working in a mental health NHS trust in London. A semi-structured interview format was used. Interviews were transcribed verbatim and analysed using an interpretative phenomenological analysis framework. The occupational experience of nurses in this study indicates that staff are frequently subjected to violent and aggressive behaviour from patients. Such experiences adversely affect patient outcomes in that staff may be reluctant to engage with such individuals because of anxiety about being hurt or experiencing further intimidation. Environmental pressures coupled with high activity levels mean that staff have little time to focus on the task at hand or to plan future activities. As a result they find that when they go home they are unable to switch off from work. Further investment is needed in acute mental health settings and in staff who work in this area. If this does not happen, it is likely that the quality of service provision will deteriorate and nurses' health and wellbeing will suffer.
NASA Technical Reports Server (NTRS)
Klein, Harold P.
1989-01-01
A brief review of the purposes and the results from the Viking Biology experiments is presented, in the expectation that the lessons learned from this mission will be useful in planning future approaches to the biological exploration of Mars. Since so little was then known about potential micro-environments on Mars, three different experiments were included in the Viking mission, each one based on different assumptions about what Martian organisms might be like. In addition to the Viking Biology Instrument (VBI), important corollary information was obtained from the Viking lander imaging system and from the molecular analysis experiments that were conducted using the gas chromatograph-mass spectrometer (GCMS) instrument. No biological objects were noted by the lander imaging instrument. The GCMS did not detect any organic compounds. A description of the tests conducted by the Gas Exchange Experiment, the Labeled Release experiment, and the Pyrolytic Release experiment is given. Results are discussed. Taken as a whole, the Viking data yielded no unequivocal evidence for a Martian biota at either landing site. The results also revealed the presence of one or more reactive oxidants in the surface material and these need to be further characterized, as does the range of micro-environments, before embarking upon future searches for extant life on Mars.
Fang, Yun; Wu, Hulin; Zhu, Li-Xing
2011-07-01
We propose a two-stage estimation method for random coefficient ordinary differential equation (ODE) models. A maximum pseudo-likelihood estimator (MPLE) is derived based on a mixed-effects modeling approach and its asymptotic properties for population parameters are established. The proposed method does not require repeatedly solving ODEs, and is computationally efficient although it does pay a price with the loss of some estimation efficiency. However, the method does offer an alternative approach when the exact likelihood approach fails due to model complexity and high-dimensional parameter space, and it can also serve as a method to obtain the starting estimates for more accurate estimation methods. In addition, the proposed method does not need to specify the initial values of state variables and preserves all the advantages of the mixed-effects modeling approach. The finite sample properties of the proposed estimator are studied via Monte Carlo simulations and the methodology is also illustrated with application to an AIDS clinical data set.
Flight Screening Program Effects on Attrition in Undergraduate Pilot Training
1987-08-01
the final five lesson grades (8-12), suggesting that a UPT screening decision could be made at an earlier stage of FSP than is the current practice...Does FSP Provide An Opportunity For SIE? ... Training/Experience Effects of FSP: Does the FSP Give a Training/Experience Benefit in UPT...effect. FSP Screening: Does FSP Provide an Opportunity for SIE? Some individuals who have had no previous flying experience (other than as passengers) may
NASA Technical Reports Server (NTRS)
Schulte, Peter Z.; Moore, James W.
2011-01-01
The Crew Exploration Vehicle Parachute Assembly System (CPAS) project conducts computer simulations to verify that flight performance requirements on parachute loads and terminal rate of descent are met. Design of Experiments (DoE) provides a systematic method for variation of simulation input parameters. When implemented and interpreted correctly, a DoE study of parachute simulation tools indicates values and combinations of parameters that may cause requirement limits to be violated. This paper describes one implementation of DoE that is currently being developed by CPAS, explains how DoE results can be interpreted, and presents the results of several preliminary studies. The potential uses of DoE to validate parachute simulation models and verify requirements are also explored.
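A systematic variation of simulation inputs like the one described above can start from a simple full-factorial matrix: every combination of the chosen factor levels becomes one simulation run. A minimal sketch; the factor names and levels are hypothetical stand-ins, not CPAS parameters:

```python
import itertools

def full_factorial(factors):
    """Return one dict per run, covering all combinations of factor levels."""
    names = list(factors)
    return [dict(zip(names, combo))
            for combo in itertools.product(*(factors[n] for n in names))]

# Hypothetical parachute-simulation inputs (names are illustrative only)
factors = {
    "drag_coefficient": [0.75, 0.85, 0.95],
    "suspended_mass_kg": [8000, 9500],
    "deploy_altitude_m": [1500, 2400],
}
runs = full_factorial(factors)
print(len(runs))  # 3 * 2 * 2 = 12 runs
```

In practice a DoE study would usually use a fractional or response-surface design rather than the full grid, precisely to find the parameter combinations that drive requirement violations with fewer runs.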
Counting states and the Hadron Resonance Gas: Does X(3872) count?
NASA Astrophysics Data System (ADS)
Ortega, Pablo G.; Entem, David R.; Fernández, Francisco; Ruiz Arriola, Enrique
2018-06-01
We analyze how the renowned X(3872), a weakly bound state right below the DDbar* threshold, should effectively be included in a hadronic representation of the QCD partition function. This can be decided by analyzing the DDbar* scattering phase-shifts in the JPC = 1++ channel and their contribution to the level density in the continuum, from which the abundance in a hot medium can be determined. We show that in a purely molecular picture the bound state contribution cancels the continuum, providing a vanishing occupation number density at finite temperature, and the X(3872) does not count below the Quark-Gluon Plasma crossover happening at T ∼ 150 MeV. In contrast, within a coupled-channels approach, for a non-vanishing ccbar content the cancellation does not occur due to the onset of the X(3940), which effectively counts as an elementary particle for temperatures above T ≳ 250 MeV. Thus, a direct inclusion of the X(3872) in the Hadron Resonance Gas is not justified. We also estimate the role of this cancellation in X(3872) production in heavy-ion collision experiments in terms of the corresponding pT distribution due to a finite energy resolution.
Marshall, Anikó; Santollo, Jessica; Corteville, Caroline; Lutz, Thomas A; Daniels, Derek
2014-07-15
Bariatric surgery is currently the most effective treatment for severe obesity, and Roux-en-Y gastric bypass (RYGB) is the most common approach in the United States and worldwide. Many studies have documented the changes in body weight, food intake, and glycemic control associated with the procedure. Although dehydration is commonly listed as a postoperative complication, little focus has been directed to testing the response to dipsogenic treatments after RYGB. Accordingly, we used a rat model of RYGB to test for procedure-induced changes in daily water intake and in the response to three dipsogenic treatments: central administration of ANG II, peripheral injection of hypertonic saline, and overnight water deprivation. We did not find any systematic differences in daily water intake of sham-operated and RYGB rats, nor did we find any differences in the response to the dipsogenic treatments. The results of these experiments suggest that RYGB does not impair thirst responses and does not enhance any satiating effect of water intake. Furthermore, these data support the current view that feedback from the stomach is unnecessary for the termination of drinking behavior and are consistent with a role of orosensory or postgastric feedback.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wardaya, P. D., E-mail: pongga.wardaya@utp.edu.my; Noh, K. A. B. M., E-mail: pongga.wardaya@utp.edu.my; Yusoff, W. I. B. W., E-mail: pongga.wardaya@utp.edu.my
This paper discusses a new approach for investigating the seismic wave velocity of rock, specifically carbonates, as affected by their pore structures. While the conventional routine of seismic velocity measurement depends heavily on extensive laboratory experiments, the proposed approach utilizes the digital rock physics view, which relies on numerical experiments. Thus, instead of using a core sample, we use the thin section image of carbonate rock to measure the effective seismic wave velocity when travelling through it. In the numerical experiment, thin section images act as the medium on which wave propagation will be simulated. For the modeling, an advanced technique based on an artificial neural network was employed for building the velocity and density profile, replacing the image's RGB pixel values with the seismic velocity and density of each rock constituent. Then, an ultrasonic wave was simulated to propagate in the thin section image by using the finite difference time domain method, based on the assumption of an acoustic-isotropic medium. Effective velocities were drawn from the recorded signal and compared to the velocity modeling from the Wyllie time average model and the Kuster-Toksoz rock physics model. To perform the modeling, image analysis routines were undertaken to quantify the pore aspect ratio that is assumed to represent the rock's pore structure. In addition, the porosity and mineral fractions required for velocity modeling were also quantified by using an integrated neural network and image analysis technique. It was found that Kuster-Toksoz gives a closer prediction to the measured velocity than the Wyllie time average model. We also conclude that the Wyllie time average model, which does not incorporate the pore structure parameter, deviates significantly for samples having more than 40% porosity.
Utilizing this approach, we found good agreement between the numerical experiment and the theoretically derived rock physics model for estimating the effective seismic wave velocity of rock.
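The Wyllie time-average model mentioned above treats total transit time as the porosity-weighted sum of fluid and matrix transit times, with no pore-structure term, which is why it breaks down at high porosity. A minimal sketch; the fluid and matrix velocities below are typical textbook values for water and calcite, not measurements from this paper:

```python
def wyllie_velocity(phi, v_fluid, v_matrix):
    """Wyllie time average: 1/V = phi/V_fluid + (1 - phi)/V_matrix.
    Note there is no pore-structure (aspect-ratio) parameter."""
    return 1.0 / (phi / v_fluid + (1.0 - phi) / v_matrix)

# Water-saturated calcite at 20% porosity, illustrative values (m/s)
v = wyllie_velocity(0.20, 1500.0, 6640.0)
print(round(v))
```

Models such as Kuster-Toksoz add pore aspect ratio as an input, which is the structural information the comparison in this study shows to matter.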
Lauría, Antonio
2016-02-15
In order to plan a trip, tourists with disabilities need to gather and analyse a broad range of information concerning the features of the places and services with which they are going to interact. For these people, guidebooks may represent an important source of information for gaining prior knowledge about the various critical situations they may experience as tourists. Generally, disabled people find tourist information on dedicated communication tools; guidebooks for the disabled often provide information for wheelchair users only. The aim of the research project was to develop a mainstream guidebook with supplementary tourist information both for people with impaired vision and for people with reduced mobility. The communication project behind "The Florence Experience" guidebook is inspired by both the Universal Design approach and the Performance Design approach. This article describes a case study and provides suggestions for planning in similar situations. It is also part of a broader research project relating to communication about the accessibility of urban spaces. The main outcome of the research project is a multimedia and multisensory bilingual guidebook (in Italian and English) that provides information in four separate coordinated forms: a paper-based guidebook, web pages, MP3 audio files, and portable tactile maps. Creating a guidebook for all is a tough challenge that requires a highly articulated vision and the cooperation of different fields of knowledge and skills. Despite the limits described in the paper, "The Florence Experience" guidebook is, in our opinion, a considerable step forward with respect to the majority of available guidebooks, both because it is a unique information tool for disabled and non-disabled people and because, unlike the majority of guidebooks for disabled people, it does not consider only the needs of wheelchair users.
Peeters, Elisabeth; De Beer, Thomas; Vervaet, Chris; Remon, Jean-Paul
2015-04-01
Tableting is a complex process due to the large number of process parameters that can be varied. Knowledge and understanding of the influence of these parameters on the final product quality is of great importance for the industry, allowing economic efficiency and parametric release. The aim of this study was to investigate the influence of paddle speeds and fill depth at different tableting speeds on the weight and weight variability of tablets. Two excipients possessing different flow behavior, microcrystalline cellulose (MCC) and dibasic calcium phosphate dihydrate (DCP), were selected as model powders. Tablets were manufactured on a high-speed rotary tablet press using design of experiments (DoE). During each experiment, the volume of powder in the forced feeder was also measured. Analysis of the DoE revealed that paddle speeds are of minor importance for tablet weight but significantly affect the volume of powder inside the feeder in the case of powders with excellent flowability (DCP). The opposite effect of paddle speed was observed for fairly flowing powders (MCC). Tableting speed played a role in weight and weight variability, whereas changing fill depth exclusively influenced tablet weight. The DoE approach made it possible to predict the optimum combination of process parameters leading to minimum tablet weight variability. Monte Carlo simulations allowed assessing the probability of exceeding the acceptable response limits if factor settings were varied around their optimum. This multi-dimensional combination and interaction of input variables leading to response criteria with acceptable probability reflected the design space.
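As a rough illustration of the DoE-plus-Monte-Carlo workflow this abstract describes, the sketch below computes main effects from a two-level factorial design and then estimates the probability of exceeding a response limit when factor settings are jittered around a nominal optimum. The factors, response function, and all coefficients are invented for illustration; they are not the study's fitted values.

```python
import itertools
import random

random.seed(0)

# Hypothetical response surface: tablet weight variability (%RSD) as a
# function of two coded factors (-1/+1), paddle speed and fill depth.
# Coefficients are illustrative, not the paper's fitted values.
def weight_rsd(paddle, fill):
    return 1.2 - 0.05 * paddle + 0.30 * fill + 0.10 * paddle * fill

# Full two-level factorial in the coded factors.
design = list(itertools.product([-1, 1], repeat=2))
responses = [weight_rsd(p, f) for p, f in design]

# Main effect = mean response at the +1 level minus mean at the -1 level.
def main_effect(idx):
    hi = [r for run, r in zip(design, responses) if run[idx] == 1]
    lo = [r for run, r in zip(design, responses) if run[idx] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effect_paddle = main_effect(0)
effect_fill = main_effect(1)

# Monte Carlo around a nominal optimum (paddle = +1, fill = -1): jitter the
# factor settings and estimate the probability of exceeding an RSD limit.
limit = 1.1
n = 10_000
exceed = sum(
    weight_rsd(1 + random.gauss(0, 0.1), -1 + random.gauss(0, 0.1)) > limit
    for _ in range(n)
)
prob_exceed = exceed / n
```

Factor combinations whose exceedance probability stays acceptably low under such jitter make up the design space in the abstract's sense.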
Multiple-Zone Diffractive Optic Element for Laser Ranging Applications
NASA Technical Reports Server (NTRS)
Ramos-Izquierdo, Luis A.
2011-01-01
A diffractive optic element (DOE) can be used as a beam splitter to generate multiple laser beams from a single input laser beam. This technology has recently been used in LRO's Lunar Orbiter Laser Altimeter (LOLA) instrument to generate five laser beams that measure the lunar topography from a 50-km nominal mapping orbit (see figure). An extension of this approach is to use a multiple-zone DOE to allow a laser altimeter instrument to operate over a wider range of distances. In particular, a multiple-zone DOE could be used for applications that require both mapping and landing on a planetary body. In this case, the laser altimeter operating range would need to extend from several hundred kilometers down to a few meters. The innovator was recently involved in an investigation of how to modify the LOLA instrument for the OSIRIS asteroid mapping and sample return mission. One approach is to replace the DOE in the LOLA laser beam expander assembly with a multiple-zone DOE that would allow for the simultaneous illumination of the asteroid with mapping and landing laser beams. The proposed OSIRIS multiple-zone DOE would generate the same LOLA five-beam output pattern for high-altitude topographic mapping, but would simultaneously generate a wide-divergence-angle beam using a small portion of the total laser energy for the approach and landing portion of the mission. Only a few percent of the total laser energy is required for approach and landing operations, as the return signal increases as the inverse square of the ranging height. A wide divergence beam could be implemented by making the center of the DOE a diffractive or refractive negative lens. The beam energy and beam divergence characteristics of a multiple-zone DOE could be easily tailored to meet the requirements of other missions that require laser ranging data.
Current single-zone DOE lithographic manufacturing techniques could also be used to fabricate a multiple-zone DOE by masking the different DOE zones during the manufacturing process, and the same space-compatible DOE substrates (fused silica, sapphire) that are used on standard DOEs could be used for multiple-zone DOEs. DOEs are an elegant and cost-effective optical design option for space-based laser altimeters that require multiple output laser beams. The use of multiple-zone DOEs would allow for the design and optimization of a laser altimeter instrument required to operate over a large range of target distances, such as those designed to both map and land on a planetary body. In addition to space-based laser altimeters, this technology could find applications in military or commercial unmanned aerial vehicles (UAVs) that fly at an altitude of several kilometers and need to land. It is also conceivable that variations of this approach could be used in land-based applications such as collision avoidance and robotic control of cars, trains, and ships.
Flom, Ross; Gartman, Peggy
2016-03-01
Several studies have examined dogs' (Canis lupus familiaris) comprehension and use of human communicative cues. Relatively few studies have, however, examined the effects of human affective behavior (i.e., facial and vocal expressions) on dogs' exploratory and point-following behavior. In two experiments, we examined dogs' frequency of following an adult's pointing gesture in locating a hidden reward or treat when it occurred silently, or when it was paired with a positive or negative facial and vocal affective expression. Like prior studies, the current results demonstrate that dogs reliably follow human pointing cues. Unlike prior studies, the current results also demonstrate that the addition of a positive affective facial and vocal expression, when paired with a pointing gesture, did not reliably increase dogs' frequency of locating a hidden piece of food compared to pointing alone. In addition, within the negative facial and vocal affect conditions of Experiments 1 and 2, dogs were delayed in their exploration of, or approach toward, a baited or sham-baited bowl. However, in Experiment 2, dogs continued to follow an adult's pointing gesture, even when paired with a negative expression, as long as the attention-directing gesture referenced a baited bowl. Together these results suggest that the addition of affective information does not significantly increase or decrease dogs' point-following behavior. Rather, these results demonstrate that the presence or absence of affective expressions influences dogs' exploratory behavior, and the presence or absence of reward affects whether they will follow an unfamiliar adult's attention-directing gesture.
NASA Astrophysics Data System (ADS)
Jones, Peter N.
The majority of studies concerning consciousness have examined and modeled the concept of consciousness in terms of particular lines of inquiry, a process that has circumscribed the general applicability of any results from such approaches. The purpose of this dissertation was to study consciousness from a concept-based, cross-cultural approach and to attempt to unify the concept across the cultures examined. The 4 cultures are the academic disciplines of philosophy, physics, psychology, and anthropology. Consciousness was examined in terms of how the concept is framed and where the major limitations in each line of inquiry occur. The rationale for examining consciousness as a concept across 4 cultures was to determine whether there was any common component in each line's framing that could be used to unify the concept. The study found that experience itself was the primary unifying factor in each field's framing and that experience was treated as a nonreducible property within each line of inquiry. By taking experience itself (but not subjective experience) as a fundamental property, each culture's concept of consciousness becomes tractable. As such, this dissertation argues that experience should be taken as a fundamental property of the concept. The significance of this analysis is that by taking experience as a fundamental property, it becomes possible to unify the concept across the 4 cultures. This unification is presented as a unity thesis, which is a theory arguing for unification of the concept based on the fundamental property of experience. Following this theoretical examination, this paper discusses several key implications of the unity thesis, including implications for the current status of altered states of consciousness and for the so-called hard and easy problems associated with the concept (at least within Occidental ontology).
It is argued that the so-called hard problem does not exist when experience is taken as a fundamental property of ontological reality, and that altered states of consciousness are in fact better understood as access states of consciousness based on the unity thesis. The dissertation concludes with suggestions for further lines of research.
How does delivery method influence factors that contribute to women's childbirth experiences?
Carquillat, Pierre; Boulvain, Michel; Guittier, Marie-Julia
2016-12-01
Whether delivery method influences factors contributing to women's childbirth experience remains debated. We compared subjective childbirth experience according to different delivery methods. This study used a cross-sectional design. The setting comprised two university hospitals: one in Geneva, Switzerland and one in Clermont-Ferrand, France. A total of 291 primiparous women were recruited from July 2014 to January 2015 during their stay in the maternity wards. The mean age of the participants was 30.8 (SD=4.7) years, and most were Swiss or European (86%). The 'Questionnaire for Assessing Childbirth Experience' was sent between four and six weeks after delivery. Clinimetric and psychometric approaches were used to assess childbirth experience according to delivery method. The mean scores of the four questionnaire dimensions varied significantly by delivery method. 'First moments with the newborn' was more negatively experienced by women in the caesarean section group compared to those who delivered vaginally (p<0.001). Similar results regarding the dimension of 'emotional status' were also observed, as women who delivered by caesarean section felt more worried, less secure, and less confident (p=0.001). 'Relationship with staff' differed significantly between groups (p=0.047), with more negative results in the unexpected medical intervention groups (i.e. emergency caesarean section and instrumental vaginal delivery). Women's 'feelings at one month post partum' in the emergency caesarean section group were less satisfactory than in the other groups. Delivery method and other obstetric variables explained only a low proportion of the total variance in the global scores (adjusted R²=0.18), which emphasizes the importance of subjective factors in women's childbirth experience. A comparison of the best expected positive responses to each item (clinimetric approach) showed useful results for clinicians.
This research indicated that delivery method influenced key factors (psychometric approach) of the childbirth experience. Delivery method should not be considered alone, and health professionals should focus on what is important for women to foster a more positive experience. In addition, women who have had an emergency caesarean section require special attention during the post partum period. Copyright © 2016 Elsevier Ltd. All rights reserved.
Aspects of ESA s public outreach programme
NASA Astrophysics Data System (ADS)
Maree, H.
The Science Programme Communication Service is currently implementing a new policy to increase overall public interest in the ESA Science Programme by adopting new ways of promoting its activities, according to the simple principle that "different target audiences have different needs". It is clear that the general public (i.e. "the man in the street" / "the average taxpayer") rarely has the knowledge and the background to understand what exactly a space mission is, what it does and why it does it ("Mission-oriented approach"). Experience has shown that a space mission becomes "popular" amongst this target audience when the relevant communication is done by passing generic/basic/simple messages ("Thematic-oriented approach"). The careful selection of suitable communication materials, together with efficient distribution and promotion networks, are also key parameters for the success of the latter approach. One should also note that the overall objective of this new policy is to raise people's interest in space in general. By presenting the information under the ESA brand, the public will increasingly associate this brand, and Europe, with space exploration. Within the next twelve months, four scientific missions will be launched. Interestingly, three of them (SMART-1, ROSETTA and MARS EXPRESS) offer a unique opportunity to implement the new communication policy under the single thematic: Europe is exploring the Solar System. Nevertheless, the study of the various mission profiles and their potential communication impact led us to choose to reach out to the general public primarily via the sub-thematic: Europe goes to Mars.
Design an optimum safety policy for personnel safety management - A system dynamic approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balaji, P.
2014-10-06
Personnel safety management (PSM) ensures that employees' work conditions are healthy and safe through various proactive and reactive approaches. Nowadays it is a complex undertaking because of the increasingly dynamic nature of organisations, which results in more accidents. An important part of accident prevention is to understand the existing system properly and to devise safety strategies for that system. System dynamics modelling appears to be an appropriate methodology for exploring and making strategy for PSM. Many system dynamics models of industrial systems have been built entirely for specific host firms. This thesis illustrates an alternative approach. A generic system dynamics model of personnel safety management was developed and tested in a host firm. The model underwent various structural, behavioural and policy tests. The utility and effectiveness of the model were further explored by modelling a safety scenario. In order to create an effective safety policy under resource constraints, DOE (design of experiments) was used. The DOE employed classic designs, namely fractional factorials and central composite designs, to fit a second-order regression equation that served as an objective function. That function was optimized under a budget constraint, and the optimum values were used for the safety policy that showed the greatest improvement in overall PSM. The outcome of this research indicates that the personnel safety management model has the capability to act as an instruction tool to improve understanding of safety management and also as an aid to policy making.
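The optimization step described above, a second-order regression surface maximized under a budget constraint, can be sketched as follows. The response surface, its coefficients, the two spending variables, and the budget are all invented for this illustration; the thesis's actual model is not reproduced here.

```python
# Illustrative second-order (quadratic) response surface for a safety
# performance index as a function of spending on two interventions,
# training (t) and equipment (e). All coefficients and the budget are
# invented for this sketch, not taken from the thesis.
def safety_index(t, e):
    return 2.0 + 0.8 * t + 0.6 * e - 0.10 * t * t - 0.05 * e * e + 0.02 * t * e

budget = 8.0  # total spend allowed: t + e <= budget

# The unconstrained stationary point of this surface spends ~11.6 units,
# so the budget constraint binds and the optimum lies on t + e = budget.
# A coarse grid search along that boundary finds the best allocation.
best = None
steps = 81
for i in range(steps):
    t = budget * i / (steps - 1)
    e = budget - t
    y = safety_index(t, e)
    if best is None or y > best[0]:
        best = (y, t, e)

best_y, best_t, best_e = best
```

In practice the regression coefficients would be estimated from the DOE runs first; the grid search stands in for a proper constrained optimizer.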
34 CFR 647.22 - How does the Secretary evaluate prior experience?
Code of Federal Regulations, 2010 CFR
2010-07-01
... 34 Education 3 2010-07-01 2010-07-01 false How does the Secretary evaluate prior experience? 647.22 Section 647.22 Education Regulations of the Offices of the Department of Education (Continued) OFFICE OF POSTSECONDARY EDUCATION, DEPARTMENT OF EDUCATION RONALD E. MCNAIR POSTBACCALAUREATE ACHIEVEMENT...
Practical aspects of running DOE for improving growth media for in vitro plants
USDA-ARS?s Scientific Manuscript database
Experiments using DOE software to improve plant tissue culture growth medium are complicated and require complex setups. Once the experimental design is set and the treatment points calculated, media sheets and mixing charts must be developed. Since these experiments require three passages on the sa...
De Beer, T R M; Wiggenhorn, M; Hawe, A; Kasper, J C; Almeida, A; Quinten, T; Friess, W; Winter, G; Vervaet, C; Remon, J P
2011-02-15
The aim of the present study was to examine the possibilities/advantages of using recently introduced in-line spectroscopic process analyzers (Raman, NIR and plasma emission spectroscopy), within well-designed experiments, for the optimization of a pharmaceutical formulation and its freeze-drying process. The formulation under investigation was a mannitol (crystalline bulking agent)-sucrose (lyo- and cryoprotector) excipient system. The effects of two formulation variables (mannitol/sucrose ratio and amount of NaCl) and three process variables (freezing rate, annealing temperature and secondary drying temperature) upon several critical process and product responses (onset and duration of ice crystallization, onset and duration of mannitol crystallization, duration of primary drying, residual moisture content and amount of mannitol hemi-hydrate in end product) were examined using a design of experiments (DOE) methodology. A 2-level fractional factorial design (2^(5-1) = 16 experiments + 3 center points = 19 experiments) was employed. All experiments were monitored in-line using Raman, NIR and plasma emission spectroscopy, which supply continuous process and product information during freeze-drying. Off-line X-ray powder diffraction analysis and Karl Fischer titration were performed to determine the morphology and residual moisture content of the end product, respectively. In the first instance, the results showed that, besides the previously described findings in De Beer et al., Anal. Chem. 81 (2009) 7639-7649, Raman and NIR spectroscopy are able to monitor the product behavior throughout the complete annealing step during freeze-drying. The DOE approach made it possible to predict the optimum combination of process and formulation parameters leading to the desired responses.
Applying a mannitol/sucrose ratio of 4, adding no NaCl, processing the formulation without an annealing step, and using a freezing rate of 0.9°C/min and a secondary drying temperature of 40°C resulted in efficient freeze-drying, supplying end products with a residual moisture content below 2% and a mannitol hemi-hydrate content below 20%. Finally, Monte Carlo simulations made it possible to determine how far the factor settings could vary around their optimum while still fulfilling the response criteria, giving an idea of the probability of exceeding the acceptable response limits. This multi-dimensional combination and interaction of input variables (factor ranges) leading to acceptable response criteria with an acceptable probability reflects the process design space. Copyright © 2010 Elsevier B.V. All rights reserved.
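The 2^(5-1) design mentioned above (16 factorial runs plus 3 center points, 19 experiments in total) can be generated as a sketch like this. The generator E = ABCD (defining relation I = ABCDE) is a standard resolution-V choice and is an assumption here, since the paper's actual generator is not stated in the abstract.

```python
import itertools

# Sketch of a 2^(5-1) fractional factorial: 16 factorial runs plus
# 3 center points = 19 experiments. Factor names follow the abstract;
# the generator E = ABCD is an assumption of this sketch.
factors = ["mannitol/sucrose ratio", "NaCl amount", "freezing rate",
           "annealing temperature", "secondary drying temperature"]

runs = []
for a, b, c, d in itertools.product([-1, 1], repeat=4):
    e = a * b * c * d  # fifth factor from the generator E = ABCD
    runs.append((a, b, c, d, e))

center_points = [(0, 0, 0, 0, 0)] * 3
design = runs + center_points  # 19 experiments in total
```

With a resolution-V half fraction, main effects are aliased only with four-factor interactions, which is why 16 runs suffice for five factors.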
NASA Astrophysics Data System (ADS)
Arafa, Mona G.; Ayoub, Bassam M.
2017-01-01
Niosomes entrapping pregabalin (PG) were prepared using Span 60 and cholesterol in different molar ratios by the hydration method; the remaining PG in the hydrating solution was separated from the vesicles by freeze centrifugation. Optimization of this nano-based carrier of PG was achieved. A Quality by Design strategy was successfully employed to obtain PG-loaded niosomes with the desired properties. The optimal particle size, drug release and entrapment efficiency were attained with the Minitab® program using design of experiments (DOE), which predicted the best parameters by investigating the combined effect of different factors simultaneously. A Pareto chart was used in the screening step to exclude the insignificant variables, while response surface methodology (RSM) was used in the optimization step to study the significant factors. The best formula was selected to prepare topical hydrogels loaded with niosomal PG using HPMC and Carbopol 934. It was verified, by means of mechanical and rheological tests, that addition of the vesicles to the gel matrix significantly affected the gel network. In vitro release and ex vivo permeation experiments were carried out. Delivery of PG molecules followed Higuchi kinetics with non-Fickian diffusion. The present work will be of interest to the pharmaceutical industry as a controlled transdermal alternative to the conventional oral route.
NASA Astrophysics Data System (ADS)
Marletto, C.; Vedral, V.
2017-12-01
All existing quantum-gravity proposals are extremely hard to test in practice. Quantum effects in the gravitational field are exceptionally small, unlike those in the electromagnetic field. The fundamental reason is that the gravitational coupling constant is about 43 orders of magnitude smaller than the fine structure constant, which governs light-matter interactions. For example, detecting gravitons—the hypothetical quanta of the gravitational field predicted by certain quantum-gravity proposals—is deemed to be practically impossible. Here we adopt a radically different, quantum-information-theoretic approach to testing quantum gravity. We propose witnessing quantumlike features in the gravitational field, by probing it with two masses each in a superposition of two locations. First, we prove that any system (e.g., a field) mediating entanglement between two quantum systems must be quantum. This argument is general and does not rely on any specific dynamics. Then, we propose an experiment to detect the entanglement generated between two masses via gravitational interaction. By our argument, the degree of entanglement between the masses is a witness of the field quantization. This experiment does not require any quantum control over gravity. It is also closer to realization than detecting gravitons or detecting quantum gravitational vacuum fluctuations.
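The 43-orders-of-magnitude comparison above can be checked with a back-of-envelope calculation: compare a dimensionless gravitational coupling, G·m²/(ħc), with the fine structure constant. Using the electron mass as the reference mass is an assumption of this sketch; a different mass choice shifts the exponent.

```python
import math

# Back-of-envelope check of the "43 orders of magnitude" claim: compare a
# dimensionless gravitational coupling for two electrons, G*m_e^2/(hbar*c),
# with the fine structure constant alpha.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
m_e = 9.109e-31    # electron mass, kg
hbar = 1.055e-34   # reduced Planck constant, J s
c = 2.998e8        # speed of light, m/s
alpha = 7.297e-3   # fine structure constant

alpha_g = G * m_e**2 / (hbar * c)     # ~1.8e-45
orders = math.log10(alpha / alpha_g)  # roughly 43 orders of magnitude
```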
Kim, Nam Ah; An, In Bok; Lee, Sang Yeol; Park, Eun-Seok; Jeong, Seong Hoon
2012-09-01
In this study, the structural stability of hen egg white lysozyme in solution at various pH levels and in different types of buffers, including acetate, phosphate, histidine, and Tris, was investigated by means of differential scanning calorimetry (DSC). Reasonable pH values were selected from the buffer ranges and were analyzed statistically through design of experiments (DoE). Four factors were used to characterize the thermograms: calorimetric enthalpy (ΔH), temperature at maximum heat flux (Tm), van't Hoff enthalpy (ΔHv), and apparent activation energy of the protein solution (Eapp). It was possible to calculate Eapp through mathematical elaboration of the Lumry-Eyring model by changing the scan rate. The transition temperature of the protein solution, Tm, increased when the scan rate was faster. When comparing the Tm, ΔHv, ΔH, and Eapp of lysozyme in various pH ranges and buffers with different priorities, lysozyme in acetate buffer at pH 4.767 (scenario 9) to pH 4.969 (scenario 11) exhibited the highest thermodynamic stability. Through this experiment, we found a significant difference in the thermal stability of lysozyme in various pH ranges and buffers, and also demonstrated a new approach to investigating the physical stability of proteins by DoE.
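One common way to extract an apparent activation energy from scan-rate-dependent DSC peaks is a Kissinger-style plot of ln(v/Tm²) against 1/Tm, whose slope is -Eapp/R; whether this is the exact elaboration used in the paper is an assumption of this sketch, and the numbers below are synthetic.

```python
import math

R = 8.314           # gas constant, J/(mol K)
E_app_true = 400e3  # illustrative apparent activation energy, J/mol
A = 3e59            # illustrative pre-exponential factor, 1/min

# Generate synthetic DSC peak temperatures Tm for several scan rates v by
# solving ln(v/Tm^2) = ln(A*R/E) - E/(R*Tm) with a fixed-point iteration.
def tm_for_scan_rate(v):
    tm = 350.0
    for _ in range(100):
        tm = E_app_true / (R * (math.log(A * R / E_app_true) - math.log(v / tm**2)))
    return tm

scan_rates = [0.5, 1.0, 2.0, 4.0]  # K/min
tms = [tm_for_scan_rate(v) for v in scan_rates]

# Kissinger plot: ln(v/Tm^2) against 1/Tm is linear with slope -E_app/R.
xs = [1.0 / t for t in tms]
ys = [math.log(v / t**2) for v, t in zip(scan_rates, tms)]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
E_app_est = -slope * R  # recovers the activation energy
```

Note that the synthetic Tm values increase with scan rate, matching the trend reported in the abstract.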
Observer POD for radiographic testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kanzler, Daniel (daniel.kanzler@bam.de); Ewert, Uwe (uwe.ewert@bam.de); Müller, Christina (christina.mueller@bam.de)
2015-03-31
Radiographic testing (RT) is a non-destructive testing (NDT) method capable of finding volumetric and open planar defects, depending on their orientation. The radiographic contrast is higher for a larger penetrated length of the defect in a component. However, the detectability of defects depends not only on the contrast, but also on the noise, the defect area and the geometry of the defect. The currently applied Probability of Detection (POD) approach uses a detection threshold that is based only on a constant noise level or on a constant contrast threshold. This does not accurately reflect the results of evaluations by human observers. A new approach is introduced, using the widely applied POD evaluation together with a detection threshold that depends on the lateral area and shape of the indication. This work shows the process of calculating POD curves with data simulated by the modeling software aRTist and with artificial reference data of different defect types, such as ASTM E 476 EPS plates, flat bottom holes and notches. Additional experiments with different operators confirm that the depth of a defect and the lateral area and shape of its indication contribute with different weights to the detectability of the defect when evaluated by human operators on monitors.
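A minimal POD sketch, far simpler than the observer-based approach above: model the probability of detection as a logistic function of defect depth and locate the a90 point (the depth detected with 90% probability). The coefficients are illustrative, and the paper's area- and shape-dependent threshold is not reproduced here.

```python
import math

# Hit/miss POD model: detection probability as a logistic curve in
# defect depth. Coefficients b0, b1 are illustrative placeholders.
def pod(depth_mm, b0=-4.0, b1=8.0):
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * depth_mm)))

# POD is monotone in depth, so a90 can be found by bisection.
lo, hi = 0.0, 2.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if pod(mid) < 0.90:
        lo = mid
    else:
        hi = mid
a90 = 0.5 * (lo + hi)  # depth with 90% probability of detection
```

In practice b0 and b1 would be fitted to hit/miss inspection data by maximum likelihood, and a90/95 (the 95% confidence bound on a90) is usually reported as well.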
Optimization of turning process through the analytic flank wear modelling
NASA Astrophysics Data System (ADS)
Del Prete, A.; Franchi, R.; De Lorenzis, D.
2018-05-01
In the present work, the approach used for the optimization of the process capabilities for Oil&Gas components machining will be described. These components are machined by turning of stainless steel castings. For this purpose, a proper Design of Experiments (DOE) plan has been designed and executed: as output of the experimentation, data about tool wear have been collected. The DOE has been designed starting from the cutting speed and feed values recommended by the tool manufacturer; the depth of cut has been kept constant. Wear data have been obtained by observation of the tool flank wear under an optical microscope, with data acquisition carried out at regular intervals of working time. Through statistical data and regression analysis, analytical models of the flank wear and the tool life have been obtained. The optimization approach used is a multi-objective optimization, which minimizes the production time and the number of cutting tools used, under a constraint on a defined flank wear level. The technique used to solve the optimization problem is Multi-Objective Particle Swarm Optimization (MOPS). The optimization results, validated by the execution of a further experimental campaign, highlighted the reliability of the work and confirmed the usability of the optimized process parameters and the potential benefit for the company.
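As a hedged illustration of the analytic tool-life modelling step, the sketch below fits a Taylor tool-life model V·Tⁿ = C by log-linear regression. Whether the authors used the Taylor form is an assumption, and the (speed, life) pairs are synthetic placeholders, not data from their experiments.

```python
import math

# Fit a Taylor tool-life model V * T^n = C from (cutting speed V, life T)
# pairs. The data points below are synthetic placeholders.
data = [(150.0, 42.0), (200.0, 16.0), (250.0, 7.5), (300.0, 4.0)]  # (m/min, min)

# Taking logs: ln V = ln C - n * ln T, a linear fit of ln V on ln T.
xs = [math.log(t) for _, t in data]
ys = [math.log(v) for v, _ in data]
m = len(xs)
mx, my = sum(xs) / m, sum(ys) / m
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
n_exp = -slope                  # Taylor exponent n
C = math.exp(my + n_exp * mx)   # Taylor constant C

def tool_life(v):
    """Predicted tool life (min) at cutting speed v (m/min)."""
    return (C / v) ** (1.0 / n_exp)
```

A fitted life model like this gives the analytic objective and wear constraint that a multi-objective optimizer (here, the MOPS step) can then search over.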
A systematic review and thematic analysis of cinema in medical education.
Darbyshire, Daniel; Baker, Paul
2012-06-01
The use of cinema in medical education has the potential to teach students about a variety of subjects, for instance by illustrating a lecture on communication skills with a clip of Sir Lancelot Spratt (Doctor In The House, 1954) demonstrating a paternalistic, doctor-centred approach to medicine, or by nurturing an ethical discussion around palliative care and dying using the cinematic adaptation of American playwright Margaret Edson's Wit (2001). Much has been written about this teaching method across several medical academic disciplines. It is the aim of this review to assimilate the various experiences in order to gain an insight into current expertise. The results are presented under the following headings, against which the articles were examined: the source journal, year of publication, article type, theme, content, target, authors, whether a clip or the entire film was used, and whether any feedback was documented. This is followed by a chronological account of the development of the literature. Such an approach will allow the reader to gather specific information and contextualise it. This review does not critically appraise the quality of the evidence, nor does it determine its validity; rather, it is hoped that, having read the review, educators will know where to locate previous accounts of work that will help them develop more engaging pedagogy.
Major transitions in information technology.
Valverde, Sergi
2016-08-19
When looking at the history of technology, we can see that not all inventions are of equal importance. Only a few technologies have the potential to start a new branching series (specifically, by increasing diversity), have a lasting impact on human life and ultimately become turning points. Technological transitions correspond to times and places in the past when a large number of novel artefact forms or behaviours appeared together or in rapid succession. Why does that happen? Is technological change continuous and gradual, or does it occur in sudden leaps and bounds? The evolution of information technology (IT) allows for a quantitative and theoretical approach to technological transitions. The value of information systems experiences sudden changes (i) when we learn how to use this technology, (ii) when we accumulate a large amount of information, and (iii) when communities of practice create and exchange free information. The coexistence of gradual improvements and discontinuous technological change is a consequence of the asymmetric relationship between complexity and hardware and software. Using a cultural evolution approach, we suggest that sudden changes in the organization of ITs depend on the high costs of maintaining and transmitting reliable information. This article is part of the themed issue 'The major synthetic evolutionary transitions'. © 2016 The Author(s).
Probing intracellular motor protein activity using an inducible cargo trafficking assay.
Kapitein, Lukas C; Schlager, Max A; van der Zwan, Wouter A; Wulf, Phebe S; Keijzer, Nanda; Hoogenraad, Casper C
2010-10-06
Although purified cytoskeletal motor proteins have been studied extensively with the use of in vitro approaches, a generic approach to selectively probe actin and microtubule-based motor protein activity inside living cells is lacking. To examine specific motor activity inside living cells, we utilized the FKBP-rapalog-FRB heterodimerization system to develop an in vivo peroxisomal trafficking assay that allows inducible recruitment of exogenous and endogenous kinesin, dynein, and myosin motors to drive specific cargo transport. We demonstrate that cargo rapidly redistributes with distinct dynamics for each respective motor, and that combined (antagonistic) actions of more complex motor combinations can also be probed. Of importance, robust cargo redistribution is readily achieved by one type of motor protein and does not require the presence of opposite-polarity motors. Simultaneous live-cell imaging of microtubules and kinesin or dynein-propelled peroxisomes, combined with high-resolution particle tracking, revealed that peroxisomes frequently pause at microtubule intersections. Titration and washout experiments furthermore revealed that motor recruitment by rapalog-induced heterodimerization is dose-dependent but irreversible. Our assay directly demonstrates that robust cargo motility does not require the presence of opposite-polarity motors, and can therefore be used to characterize the motile properties of specific types of motor proteins. Copyright © 2010 Biophysical Society. Published by Elsevier Inc. All rights reserved.
Systematic Approach to Better Understanding Integration Costs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stark, Gregory B.
2015-09-01
This research presents a systematic approach to evaluating the costs of integrating new generation and operational procedures into an existing power system, and the methodology is independent of the type of change or nature of the generation. The work was commissioned by the U.S. Department of Energy and performed by the National Renewable Energy Laboratory to investigate three integration cost-related questions: (1) How does the addition of new generation affect a system's operational costs, (2) How do generation mix and operating parameters and procedures affect costs, and (3) How does the amount of variable generation (non-dispatchable wind and solar) impact the accuracy of natural gas orders? A detailed operational analysis was performed for seven sets of experiments: variable generation, large conventional generation, generation mix, gas prices, fast-start generation, self-scheduling, and gas supply constraints. For each experiment, four components of integration costs were examined: cycling costs, non-cycling VO&M costs, fuel costs, and reserves provisioning costs. The investigation was conducted with PLEXOS production cost modeling software utilizing an updated version of the Institute of Electrical and Electronics Engineers 118-bus test system overlaid with projected operating loads from the Western Electricity Coordinating Council for the Sacramento Municipal Utility District, Puget Sound Energy, and Public Service Colorado in the year 2020. The test system was selected in consultation with an industry-based technical review committee to be a reasonable approximation of an interconnection yet small enough to allow the research team to investigate a large number of scenarios and sensitivity combinations. The research should prove useful to market designers, regulators, utilities, and others who want to better understand how system changes can affect production costs.
Contoured-gap coaxial guns for imploding plasma liner experiments
NASA Astrophysics Data System (ADS)
Witherspoon, F. D.; Case, A.; Brockington, S.; Cassibry, J. T.; Hsu, S. C.
2014-10-01
Arrays of supersonic, high momentum flux plasma jets can be used as standoff compression drivers for generating spherically imploding plasma liners for driving magneto-inertial fusion, hence the name plasma-jet-driven MIF (PJMIF). HyperV developed linear plasma jets for the Plasma Liner Experiment (PLX) at LANL where two guns were successfully tested. Further development at HyperV resulted in achieving the PLX goal of 8000 μg at 50 km/s. Prior work on contoured-gap coaxial guns demonstrated an approach to control the blowby instability and achieved substantial performance improvements. For future plasma liner experiments we propose to use contoured-gap coaxial guns with small Minirailgun injectors. We will describe such a gun for a 60-gun plasma liner experiment. Discussion topics will include impurity control, plasma jet symmetry and topology (esp. related to uniformity and compactness), velocity capability, and techniques planned for achieving gun efficiency of >50% using tailored impedance matched pulse forming networks. Mach2 and UAH SPH code simulations will be included. Work supported by US DOE DE-FG02-05ER54810.
Approach to design space from retrospective quality data.
Puñal Peces, Daniel; García-Montoya, Encarna; Manich, Albert; Suñé-Negre, Josep Maria; Pérez-Lozano, Pilar; Miñarro, Montse; Ticó, Josep Ramon
2016-01-01
Nowadays, the entire manufacturing process is based on the current GMPs, which emphasize the reproducibility of the process, and companies hold large amounts of recorded data about their processes. This work establishes the design space (DS) from retrospective data for a wet compression process. A design of experiments (DoE) with historical data from 4 years of industrial production was carried out, using as experimental factors the results of a previous risk analysis and eight key parameters (quality specifications) that encompassed process and quality control data. Statgraphics 5.0 software was applied, and the data were processed to obtain eight DS as well as their safe and working ranges. Experience shows that it is possible to determine the DS retrospectively, the greatest difficulty being the handling and processing of large amounts of data; the approach is nevertheless of considerable practical interest, as it yields the DS with minimal investment in experiments, since actual production batch data are processed statistically.
Lopes; Oden
1999-06-01
In recent years, descriptive models of risky choice have incorporated features that reflect the importance of particular outcome values in choice. Cumulative prospect theory (CPT) does this by inserting a reference point in the utility function. SP/A (security-potential/aspiration) theory uses aspiration level as a second criterion in the choice process. Experiment 1 compares the ability of the CPT and SP/A models to account for the same within-subjects data set and finds in favor of SP/A. Experiment 2 replicates the main finding of Experiment 1 in a between-subjects design. The final discussion brackets the SP/A result by showing the impact on fit of both decreasing and increasing the number of free parameters. We also suggest how the SP/A approach might be useful in modeling investment decision making in a descriptively more valid way and conclude with comments on the relation between descriptive and normative theories of risky choice. Copyright 1999 Academic Press.
Caxaj, C Susana; Gill, Navjot K
2017-07-01
Belonging is linked to a variety of positive health outcomes. Yet this relationship is not well understood, particularly among rural immigrant diasporas. In this article, we explore the experiences of community belonging and wellbeing among a rural Indian-Canadian diaspora in the Interior of British Columbia, Canada, our central research questions being, "What are the experiences of belonging in this community? How does a sense of belonging (or lack of) shape mental health and wellbeing among local residents?" Using a situational analysis research approach, our findings indicate that local residents must navigate several tensions within an overarching reality of finding a space of our own. Such tensions reveal contradictory experiences of tight-knittedness, context-informed notions of cultural continuity, access/acceptability barriers, particularly in relation to rural agricultural living, and competing expectations of "small town" life. Such tensions can begin to be addressed through creative service provision, collaborative decision making, and diversity-informed program planning.
Preparing safety data packages for experimenters using the Get Away Special (GAS) carrier system
NASA Technical Reports Server (NTRS)
Kosko, Jerome
1992-01-01
The implementation of NSTS 1700.7B and more forceful scrutiny of data packages by the Johnson Space Center (JSC) led to the development of a classification policy for GAS/CAP payloads. The purpose of this policy is to classify experiments using the carrier system so that they receive an appropriate level of JSC review (i.e., one or multiphase reviews). This policy is based on energy containment to show inherent payload safety. It impacts the approach to performing hazard analyses and the nature of the data package. This paper endeavors to explain the impact of this policy as well as the impact of recent JSC and Kennedy Space Center (KSC) 'interpretations' of existing requirements. The GAS canister does adequately contain most experiments when flown in the sealed configuration (however, this must be shown, not merely stated). This paper also includes data package preparation guidelines for those experiments that require an opening door, which often presents unique safety issues.
Exploring the process of writing about and sharing traumatic birth experiences online.
Blainey, Sarah H; Slade, Pauline
2015-05-01
This study aimed to explore the experience of writing about a traumatic birth experience and sharing it online. Twelve women who had submitted their stories about traumatic birth experiences to the Birth Trauma Association for online publication were interviewed about their experiences. Women were interviewed shortly after writing but before posting, and again 1 month after the story was posted online. All participants completed both interviews. These were transcribed and analysed using template analysis. Women described varied reasons for writing and sharing their stories, including wanting to help themselves and others. The process of writing was described as emotional but was generally seen as positive. Aspects of writing that were identified as helpful included organizing their experiences into a narrative and distancing themselves from the experience. Writing and posting online about a traumatic birth is experienced positively by women. It may be a useful self-help intervention and is worthy of systematic evaluation. The mechanisms through which writing is reported to have an impact, as described in the interviews, link to the mechanisms of change in cognitive-behavioural approaches to post-traumatic symptoms. Statement of contribution What is already known on this subject? Some women develop post-traumatic stress disorder-like symptoms following birth. These can impact on both themselves and their family, yet these women may not seek professional help. Writing about a traumatic event may be a useful approach for reducing post-traumatic stress symptoms, but the impact of online sharing is unknown. What does this study add? This study demonstrates that women report benefits from writing about their birth experiences. Writing enabled organizing the experience into a narrative and distancing from the trauma, which was helpful. Sharing the story online was an emotional experience for participants but was generally seen positively.
© 2014 The British Psychological Society.
An adaptive toolbox approach to the route to expertise in sport.
de Oliveira, Rita F; Lobinger, Babett H; Raab, Markus
2014-01-01
Expertise is characterized by fast decision-making which is highly adaptive to new situations. Here we propose that athletes use a toolbox of heuristics which they develop on their route to expertise. The development of heuristics occurs within the context of the athletes' natural abilities, past experiences, developed skills, and situational context, but does not pertain to any of these factors separately. The novelty of this approach lies in its integration of the separate factors determining expertise into a comprehensive heuristic description. It is our contention that talent identification methods and talent development models should therefore be geared toward the assessment and development of specific heuristics. Specifically, in addition to identifying and developing separate natural abilities and skills as per usual, heuristics should be identified and developed. The application of heuristics to talent and expertise models can bring the field one step away from dichotomized models of nature and nurture toward a comprehensive approach to the route to expertise.
Assessment of the feasibility of exon 45–55 multiexon skipping for duchenne muscular dystrophy
van Vliet, Laura; de Winter, Christa L; van Deutekom, Judith CT; van Ommen, Gert-Jan B; Aartsma-Rus, Annemieke
2008-01-01
Background The specific skipping of an exon, induced by antisense oligonucleotides (AON) during splicing, has been shown to be a promising therapeutic approach for Duchenne muscular dystrophy (DMD) patients. As different mutations require skipping of different exons, this approach is mutation dependent. The skipping of an entire stretch of exons (e.g. exons 45 to 55) has recently been suggested as an approach applicable to larger groups of patients. However, this multiexon skipping approach is technically challenging. The levels of intended multiexon skips are typically low and highly variable, and may be dependent on the order of intron removal. We hypothesized that the splicing order might favor the induction of multiexon 45–55 skipping. Methods We here tested the feasibility of inducing multiexon 45–55 skipping in control and patient muscle cell cultures using various AON cocktails. Results In all experiments, the exon 45–55 skip frequencies were minimal and comparable to those observed in untreated cells. Conclusion We conclude that the current state of the art does not sufficiently support clinical development of multiexon skipping for DMD. PMID:19046429
Using Peptide-Level Proteomics Data for Detecting Differentially Expressed Proteins.
Suomi, Tomi; Corthals, Garry L; Nevalainen, Olli S; Elo, Laura L
2015-11-06
The expression of proteins can be quantified in a high-throughput manner using different types of mass spectrometers, and in recent years label-free methods for determining protein abundance have emerged. Although expression is initially measured at the peptide level, a common approach is to combine the peptide-level measurements into protein-level values before differential expression analysis. However, this simple combination is prone to inconsistencies between peptides and may lose valuable information. To this end, we introduce here a method for detecting differentially expressed proteins by combining peptide-level expression-change statistics. Using controlled spike-in experiments, we show that the approach of averaging peptide-level expression changes yields more accurate lists of differentially expressed proteins than does the conventional protein-level approach. This is particularly true when there are only few replicate samples or the differences between the sample groups are small. The proposed technique is implemented in the Bioconductor package PECA, and it can be downloaded from http://www.bioconductor.org.
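The peptide-level idea described above can be sketched in a few lines: compute an expression-change statistic for each peptide, then combine those statistics per protein, rather than first rolling peptide intensities up into protein values. This is only a minimal illustration of the averaging approach, assuming log2-scale data; the column names and toy values are hypothetical, and the actual PECA package uses more elaborate statistics and significance estimation.

```python
import pandas as pd

# Toy log2 peptide intensities; columns a1/a2 and b1/b2 are two sample
# groups (hypothetical names), with a many-to-one peptide-to-protein map.
peptides = pd.DataFrame({
    "protein": ["P1", "P1", "P2"],
    "a1": [1.0, 2.0, 0.0], "a2": [1.0, 2.0, 0.0],
    "b1": [3.0, 2.0, 5.0], "b2": [3.0, 2.0, 5.0],
})

def protein_changes(df, group_a, group_b):
    # Peptide-level expression change: difference of group means (log2 scale).
    change = df[group_b].mean(axis=1) - df[group_a].mean(axis=1)
    # Combine the peptide-level statistics per protein by averaging them,
    # instead of first summing intensities into protein-level values.
    return change.groupby(df["protein"]).mean()

fc = protein_changes(peptides, ["a1", "a2"], ["b1", "b2"])
# fc["P1"] is the mean of the two P1 peptide changes (2.0 and 0.0), i.e. 1.0
```

The point of averaging at this stage is that a single discordant peptide shifts the protein-level statistic less than it would shift a summed intensity.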
Littel, Marianne; van Schie, Kevin; van den Hout, Marcel A.
2017-01-01
ABSTRACT Background: Eye movement desensitization and reprocessing (EMDR) is an effective psychological treatment for posttraumatic stress disorder. Recalling a memory while simultaneously making eye movements (EM) decreases a memory’s vividness and/or emotionality. It has been argued that non-specific factors, such as treatment expectancy and experimental demand, may contribute to the EMDR’s effectiveness. Objective: The present study was designed to test whether expectations about the working mechanism of EMDR would alter the memory attenuating effects of EM. Two experiments were conducted. In Experiment 1, we examined the effects of pre-existing (non-manipulated) knowledge of EMDR in participants with and without prior knowledge. In Experiment 2, we experimentally manipulated prior knowledge by providing participants without prior knowledge with correct or incorrect information about EMDR’s working mechanism. Method: Participants in both experiments recalled two aversive, autobiographical memories during brief sets of EM (Recall+EM) or keeping eyes stationary (Recall Only). Before and after the intervention, participants scored their memories on vividness and emotionality. A Bayesian approach was used to compare two competing hypotheses on the effects of (existing/given) prior knowledge: (1) Prior (correct) knowledge increases the effects of Recall+EM vs. Recall Only, vs. (2) prior knowledge does not affect the effects of Recall+EM. Results: Recall+EM caused greater reductions in memory vividness and emotionality than Recall Only in all groups, including the incorrect information group. In Experiment 1, both hypotheses were supported by the data: prior knowledge boosted the effects of EM, but only modestly. In Experiment 2, the second hypothesis was clearly supported over the first: providing knowledge of the underlying mechanism of EMDR did not alter the effects of EM. Conclusions: Recall+EM appears to be quite robust against the effects of prior expectations. 
As Recall+EM is the core component of EMDR, expectancy effects probably contribute little to the effectiveness of EMDR treatment. PMID:29038685
300 Area dangerous waste tank management system: Compliance plan approach. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-03-01
In its Dec. 5, 1989 letter to DOE-Richland (DOE-RL) Operations, the Washington State Dept. of Ecology requested that DOE-RL prepare "a plan evaluating alternatives for storage and/or treatment of hazardous waste in the 300 Area...". This document, prepared in response to that letter, presents the proposed approach to compliance of the 300 Area with the federal Resource Conservation and Recovery Act and Washington State's Chapter 173-303 WAC, Dangerous Waste Regulations. It also contains 10 appendices which were developed as bases for preparing the compliance plan approach. It refers to the Radioactive Liquid Waste System facilities and to the radioactive mixed waste.
Does Economics Education Make Bad Citizens? The Effect of Economics Education in Japan
ERIC Educational Resources Information Center
Iida, Yoshio; Oda, Sobei H.
2011-01-01
Does studying economics discourage students' cooperative mind? Several surveys conducted in the United States have concluded that the answer is yes. The authors conducted a series of economic experiments and questionnaires to consider the question in Japan. The results of the prisoner's dilemma experiment and public goods questionnaires showed no…
Evidence for Two Attentional Components in Visual Working Memory
ERIC Educational Resources Information Center
Allen, Richard J.; Baddeley, Alan D.; Hitch, Graham J.
2014-01-01
How does executive attentional control contribute to memory for sequences of visual objects, and what does this reveal about storage and processing in working memory? Three experiments examined the impact of a concurrent executive load (backward counting) on memory for sequences of individually presented visual objects. Experiments 1 and 2 found…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farbin, Amir
2015-07-15
This is the final report for the DOE Early Career Research Program grant titled "Model-Independent Dark-Matter Searches at the ATLAS Experiment and Applications of Many-core Computing to High Energy Physics".
NASA Astrophysics Data System (ADS)
Khavekar, Rajendra; Vasudevan, Hari, Dr.; Modi, Bhavik
2017-08-01
Two well-known Design of Experiments (DoE) methodologies, Taguchi Methods (TM) and Shainin Systems (SS), are compared and analyzed in this study through their implementation in a plastic injection molding unit. Experiments were performed at a company manufacturing perfume bottle caps (made of acrylic) using TM and SS to find the root cause of defects and to optimize the process parameters for minimum rejection. The experiments reduced the rejection rate from approximately 40% during trial runs to 8.57%, which is quite low, representing a successful implementation of these DoE methods. The comparison showed that both methodologies identified the same set of variables as critical for defect reduction, but with a different order of significance. Taguchi Methods require more experiments and consume more time than the Shainin System. The Shainin System is less complicated and easy to implement, whereas Taguchi Methods are statistically more reliable for optimization of process parameters. Finally, the experiments indicated that DoE methods are robust and reliable in implementation as organizations attempt to improve quality through optimization.
China’s Comprehensive Approach: Refining the U.S. Targeting Process to Inform U.S. Strategy
2018-04-20
control demonstrated by China, the subject matter expertise required to generate a comprehensive approach like China's does exist. However, due to a vast... (National Defense University, Joint Forces Staff College, Joint Advanced Warfighting School)
When Does Air Resistance Become Significant in Projectile Motion?
NASA Astrophysics Data System (ADS)
Mohazzabi, Pirooz
2018-03-01
In an article in this journal, it was shown that air resistance could never be a significant source of error in typical free-fall experiments in introductory physics laboratories. Since projectile motion is the two-dimensional version of the free-fall experiment and usually follows the former experiment in such laboratories, it seemed natural to extend the same analysis to this type of motion. We shall find that again air resistance does not play a significant role in the parameters of interest in a traditional projectile motion experiment.
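A rough numerical version of this comparison can be set up by integrating the equations of motion with and without quadratic drag. The drag parameter k = ρC_dA/(2m) and the launch values below are illustrative assumptions, not figures from the article:

```python
import math

def projectile_range(v0, angle_deg, k=0.0, dt=1e-4, g=9.81):
    """Horizontal range from forward-Euler integration of 2D motion.

    k is a quadratic drag parameter rho*Cd*A/(2*m) in 1/m (an assumed
    illustrative value); k = 0 recovers the vacuum trajectory.
    """
    theta = math.radians(angle_deg)
    x, y = 0.0, 0.0
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    while y >= 0.0 or vy > 0.0:       # stop once past apex and below launch height
        v = math.hypot(vx, vy)
        vx, vy, x, y = (vx - k * v * vx * dt,
                        vy - (g + k * v * vy) * dt,
                        x + vx * dt,
                        y + vy * dt)
    return x

# Vacuum range v0**2*sin(2*theta)/g versus the same launch with modest drag.
no_drag = projectile_range(10.0, 45.0)            # close to 10.19 m analytically
with_drag = projectile_range(10.0, 45.0, k=0.01)  # somewhat shorter
```

Varying k over plausible values for a dense ball quickly shows when the drag correction becomes comparable to typical measurement uncertainty.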
Kanojia, Gaurav; Willems, Geert-Jan; Frijlink, Henderik W; Kersten, Gideon F A; Soema, Peter C; Amorij, Jean-Pierre
2016-09-25
Spray dried vaccine formulations might be an alternative to traditional lyophilized vaccines. Compared to lyophilization, spray drying is a fast and cheap process extensively used for drying biologicals. The current study provides an approach that utilizes Design of Experiments (DoE) for the spray drying process to stabilize whole inactivated influenza virus (WIV) vaccine. The approach included systematically screening and optimizing the spray drying process variables, determining the desired process parameters and predicting product quality parameters. The process parameters inlet air temperature, nozzle gas flow rate and feed flow rate, and their effects on WIV vaccine powder characteristics such as particle size, residual moisture content (RMC) and powder yield, were investigated. Vaccine powders with a broad range of physical characteristics (RMC 1.2-4.9%, particle size 2.4-8.5 μm and powder yield 42-82%) were obtained. WIV showed no significant loss in antigenicity, as revealed by a hemagglutination test. Furthermore, descriptive models generated by the DoE software could be used to determine and select spray drying process parameters; these were used to generate a dried WIV powder with predefined (predicted) characteristics. Moreover, the spray dried vaccine powders retained their antigenic stability even after storage for 3 months at 60°C. The approach used here enabled the generation of a thermostable, antigenic WIV vaccine powder with desired physical characteristics that could potentially be used for pulmonary administration. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
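The kind of descriptive model produced by DoE software can be sketched as an ordinary least-squares main-effects fit over a coded factorial design in the three process parameters named above (inlet temperature, nozzle gas flow rate, feed flow rate). The design matrix and yield values below are invented for illustration, not data from the study:

```python
import numpy as np

# Hypothetical 2^3 full-factorial screening design in coded units (-1, +1):
# columns are inlet temperature, nozzle gas flow rate, feed flow rate.
X = np.array([[i, n, f] for i in (-1, 1) for n in (-1, 1) for f in (-1, 1)],
             dtype=float)
# Hypothetical measured powder yields (%) for the eight runs.
y = np.array([48, 62, 55, 70, 44, 58, 52, 66], dtype=float)

# Main-effects model: yield ~ b0 + b1*inlet + b2*nozzle + b3*feed.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

def predict(inlet, nozzle, feed):
    # Predicted yield at any coded factor setting.
    return coef @ [1.0, inlet, nozzle, feed]
```

With such a fitted model, one can invert the prediction to pick factor settings that hit a target yield or moisture content, which is how "predefined (predicted) characteristics" can be dialed in before a confirmation run.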
Coeckelbergh, Mark
2012-03-01
The standard response to engineering disasters like the Deepwater Horizon case is to ascribe full moral responsibility to individuals and to collectives treated as individuals. However, this approach is inappropriate since concrete action and experience in engineering contexts seldom meets the criteria of our traditional moral theories. Technological action is often distributed rather than individual or collective, we lack full control of the technology and its consequences, and we lack knowledge and are uncertain about these consequences. In this paper, I analyse these problems by employing Kierkegaardian notions of tragedy and moral responsibility in order to account for experiences of the tragic in technological action. I argue that ascription of responsibility in engineering contexts should be sensitive to personal experiences of lack of control, uncertainty, role conflicts, social dependence, and tragic choice. I conclude that this does not justify evading individual and corporate responsibility, but inspires practices of responsibility ascription that are less 'harsh' on those directly involved in technological action, that listen to their personal experiences, and that encourage them to gain more knowledge about what they are doing. © The Author(s) 2010. This article is published with open access at Springerlink.com
A global parallel model based design of experiments method to minimize model output uncertainty.
Bazil, Jason N; Buzzard, Gregory T; Rundell, Ann E
2012-03-01
Model-based experiment design specifies the data to be collected that will most effectively characterize the biological system under study. Existing model-based design of experiment algorithms have primarily relied on Fisher Information Matrix-based methods to choose the best experiment in a sequential manner. However, these are largely local methods that require an initial estimate of the parameter values, which are often highly uncertain, particularly when data is limited. In this paper, we provide an approach to specify an informative sequence of multiple design points (parallel design) that will constrain the dynamical uncertainty of the biological system responses to within experimentally detectable limits as specified by the estimated experimental noise. The method is based upon computationally efficient sparse grids and requires only a bounded uncertain parameter space; it does not rely upon initial parameter estimates. The design sequence emerges through the use of scenario trees with experimental design points chosen to minimize the uncertainty in the predicted dynamics of the measurable responses of the system. The algorithm was illustrated herein using a T cell activation model for three problems that ranged in dimension from 2D to 19D. The results demonstrate that it is possible to extract useful information from a mathematical model where traditional model-based design of experiments approaches most certainly fail. The experiments designed via this method fully constrain the model output dynamics to within experimentally resolvable limits. The method is effective for highly uncertain biological systems characterized by deterministic mathematical models with limited data sets. Also, it is highly modular and can be modified to include a variety of methodologies such as input design and model discrimination.
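A toy analogue of the design criterion (not the paper's sparse-grid and scenario-tree algorithm) can be sketched with a Monte Carlo ensemble: sample the bounded parameter space without any initial point estimate, propagate the samples through the model, and choose the design points where the predicted responses disagree most. The decay model and all numbers below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: exponential decay y(t) = exp(-k*t) with a bounded uncertain
# rate k; only the bounds are assumed, no initial parameter estimate.
k_samples = rng.uniform(0.1, 1.0, size=200)   # bounded parameter space
times = np.linspace(0.0, 10.0, 50)            # candidate design points
Y = np.exp(-np.outer(k_samples, times))       # ensemble of model responses

# Predictive uncertainty at each candidate time, and a 3-point parallel
# design targeting the most uncertain (most informative) measurements.
spread = Y.std(axis=0)
design = times[np.argsort(spread)[::-1][:3]]
```

Measuring where the ensemble disagrees most is what shrinks the dynamical uncertainty fastest; note that t = 0 is never selected, since every parameter sample predicts the same value there.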
To Have Control Over or to Be Free From Others? The Desire for Power Reflects a Need for Autonomy.
Lammers, Joris; Stoker, Janka I; Rink, Floor; Galinsky, Adam D
2016-04-01
The current research explores why people desire power and how that desire can be satisfied. We propose that a position of power can be subjectively experienced as conferring influence over others or as offering autonomy from the influence of others. Conversely, a low-power position can be experienced as lacking influence or lacking autonomy. Nine studies show that subjectively experiencing one's power as autonomy predicts the desire for power, whereas the experience of influence over others does not. Furthermore, gaining autonomy quenches the desire for power, but gaining influence does not. The studies demonstrated the primacy of autonomy across both experimental and correlational designs, across measured mediation and manipulated mediator approaches, and across three different continents (Europe, United States, India). Together, these studies offer evidence that people desire power not to be a master over others, but to be master of their own domain, to control their own fate. © 2016 by the Society for Personality and Social Psychology, Inc.
Data Crosscutting Requirements Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleese van Dam, Kerstin; Shoshani, Arie; Plata, Charity
2013-04-01
In April 2013, a diverse group of researchers from the U.S. Department of Energy (DOE) scientific community assembled to assess data requirements associated with DOE-sponsored scientific facilities and large-scale experiments. Participants in the review included facilities staff, program managers, and scientific experts from the offices of Basic Energy Sciences, Biological and Environmental Research, High Energy Physics, and Advanced Scientific Computing Research. As part of the meeting, review participants discussed key issues associated with three distinct aspects of the data challenge: 1) processing, 2) management, and 3) analysis. These discussions identified commonalities and differences among the needs of varied scientific communities. They also helped to articulate gaps between current approaches and future needs, as well as the research advances that will be required to close these gaps. Moreover, the review provided a rare opportunity for experts from across the Office of Science to learn about their collective expertise, challenges, and opportunities. The "Data Crosscutting Requirements Review" generated specific findings and recommendations for addressing large-scale data crosscutting requirements.
Reliability Analysis of Retaining Walls Subjected to Blast Loading by Finite Element Approach
NASA Astrophysics Data System (ADS)
GuhaRay, Anasua; Mondal, Stuti; Mohiuddin, Hisham Hasan
2018-02-01
Conventional design methods adopt factors of safety as per practice and experience, which are deterministic in nature. The limit state method, though not completely deterministic, does not take into account the effects of design parameters that are inherently variable, such as the cohesion and angle of internal friction of soil. Reliability analysis provides a means to bring these variations into the analysis and hence results in a more realistic design. Several studies have been carried out on the reliability of reinforced concrete walls and masonry walls under explosions, and reliability analyses of retaining structures against various kinds of failure have been performed. However, very few research works are available on the reliability analysis of retaining walls subjected to blast loading. Thus, the present paper considers the effect of variation of geotechnical parameters when a retaining wall is subjected to blast loading. It is found, however, that the variation of the geotechnical random variables does not have a significant effect on the stability of retaining walls subjected to blast loading.
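The basic reliability calculation over geotechnical random variables can be illustrated with a Monte Carlo sketch of a static sliding check, used here as a simple stand-in for the paper's finite element blast analysis. All distributions, loads, and geometry below are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random geotechnical parameters (assumed illustrative distributions):
N = 100_000
c = rng.normal(20.0, 3.0, N)                  # cohesion, kPa
phi = np.radians(rng.normal(30.0, 2.0, N))    # friction angle, degrees -> rad

# Deterministic resultant loads on a gravity retaining wall (assumed values):
W, H = 500.0, 120.0    # weight resultant and horizontal thrust, kN
B = 3.0                # base width, m

# Factor of safety against sliding for each random sample.
fs = (c * B + W * np.tan(phi)) / H

p_failure = np.mean(fs < 1.0)                 # Monte Carlo failure probability
beta = (fs.mean() - 1.0) / fs.std()           # simple reliability index
```

A large beta (equivalently, a negligible p_failure) despite the scatter in c and phi mirrors the paper's observation that variation in the geotechnical random variables has little effect on stability under the dominant loading.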
Back to the Future: Leadership, Tradition, and Authority in a Post-Critical Age
ERIC Educational Resources Information Center
Hawkins, Andrew
2010-01-01
Both modern and postmodern approaches to knowledge view tradition and authority with suspicion, even contempt, though each approach does so in different ways. Our profession vacillates between those epistemological orientations, struggling to find direction and meaning. Leadership, in particular, is in a quandary; what does leadership look like in…
Women's experience of being well during peri-menopause: a phenomenological study.
Mackey, Sandra
2007-01-01
A research study was conducted to investigate women's experience of being well during the peri-menopause because much of the research investigating the experience of menopause has concentrated on its problematic and pathological aspects. For the majority of Western women the reproductive transition of menopause is not problematic; however, the nature of the unproblematic or healthy menopause has not been investigated. The aim in conducting this research was to enhance understanding of the experience of being healthy or well during menopause. In so doing, recognition of the diversity of menopausal experiences may be strengthened. The research was approached from the disciplinary perspective of nursing, and was grounded in the methodology of Heideggerian interpretive phenomenology. Data were collected via unstructured, in-depth interviews, and analysis was conducted utilising the repetitive and circular process developed by van Manen. The phenomenon of being healthy or well during menopause was expressed in the form of three major themes. These were the continuity of menstrual experience, the embodiment of menopausal symptoms, and the containment of menopause and menopausal symptoms. The experience of health and well-being during menopause can accommodate the experience of symptoms when the experience of symptoms does not disrupt embodied existence and the continuity of menstrual patterns. Menopause is widely studied, yet only partly understood. While much is now known about the nature and influence of ovarian hormones, the physiology of menopausal changes, and the treatment of menopausal symptoms, little is known and understood about the experience of menopause. Research that has investigated the experience of menopause has largely focused on the problematic experiences. It is now known that the majority of women, regardless of cultural background, do not experience menopause in a problematic way (Utian 1977; Porter et al. 1996).
However, the nature of such experience has not been revealed and it is not known whether this experience of a non-problematic menopause constitutes wellness at menopause. The research reported here aimed to achieve greater understanding of the nature of this experience of menopause, through an investigation of women's everyday experience of wellness and wellbeing during menopause. Wellness, by its very nature, is an elusive state. It is elusive because it is a non-problematic state, thus difficult to mark out by measurement, events or experiences. In wellness, nothing 'stands out' to notice, observe or disrupt as it does in illness (van Manen 1990). Nevertheless, the term wellness describes a particular and recognisable state of being which, in this study, is revealed through interpretative analysis of post-menopausal women's descriptions of their experiences.
A Safety Case Approach for Deep Geologic Disposal of DOE HLW and DOE SNF in Bedded Salt - 13350
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sevougian, S. David; MacKinnon, Robert J.; Leigh, Christi D.
2013-07-01
The primary objective of this study is to investigate the feasibility and utility of developing a defensible safety case for disposal of United States Department of Energy (U.S. DOE) high-level waste (HLW) and DOE spent nuclear fuel (SNF) in a conceptual deep geologic repository that is assumed to be located in a bedded salt formation of the Delaware Basin [1]. A safety case is a formal compilation of evidence, analyses, and arguments that substantiate and demonstrate the safety of a proposed or conceptual repository. We conclude that a strong initial safety case for potential licensing can be readily compiled by capitalizing on the extensive technical basis that exists from prior work on the Waste Isolation Pilot Plant (WIPP), other U.S. repository development programs, and the work published through international efforts in salt repository programs such as in Germany. The potential benefits of developing a safety case include leveraging previous investments in WIPP to reduce future new repository costs, enhancing the ability to effectively plan for a repository and its licensing, and possibly expediting a schedule for a repository. A safety case will provide the necessary structure for organizing and synthesizing existing salt repository science and identifying any issues and gaps pertaining to safe disposal of DOE HLW and DOE SNF in bedded salt. The safety case synthesis will help DOE to plan its future R and D activities for investigating salt disposal using a risk-informed approach that prioritizes test activities that include laboratory, field, and underground investigations. It should be emphasized that the DOE has not made any decisions regarding the disposition of DOE HLW and DOE SNF. Furthermore, the safety case discussed herein is not intended to either site a repository in the Delaware Basin or preclude siting in other media at other locations.
Rather, this study simply presents an approach for accelerated development of a safety case for a potential DOE HLW and DOE SNF repository using the currently available technical basis for bedded salt. This approach includes a summary of the regulatory environment relevant to disposal of DOE HLW and DOE SNF in a deep geologic repository, the key elements of a safety case, the evolution of the safety case through the successive phases of repository development and licensing, and the existing technical basis that could be used to substantiate the safety of a geologic repository if it were to be sited in the Delaware Basin. We also discuss the potential role of an underground research laboratory (URL). (authors)
Finite-size Scaling of the Density of States in Photonic Band Gap Crystals
NASA Astrophysics Data System (ADS)
Hasan, Shakeeb Bin; Mosk, Allard P.; Vos, Willem L.; Lagendijk, Ad
2018-06-01
The famous vanishing of the density of states (DOS) in a band gap, be it photonic or electronic, pertains to the infinite-crystal limit. In contrast, all experiments and device applications refer to finite crystals, which raises the question: Upon increasing the linear size L of a crystal, how fast does the DOS approach the infinite-crystal limit? We present a theory for finite crystals that includes Bloch-mode broadening due to the presence of crystal boundaries. Our results demonstrate that the DOS for frequencies inside a band gap has a 1/L scale dependence for crystals in one, two and three dimensions.
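The 1/L dependence claimed for the in-gap DOS can be checked numerically on synthetic data by fitting a power law on log-log axes; the data below are assumed for the sketch, not taken from the paper.

```python
import numpy as np

# Synthetic in-gap DOS values decaying as c/L (assumed 1/L finite-size tail)
L = np.array([4.0, 8.0, 16.0, 32.0, 64.0])   # linear crystal size
dos = 0.7 / L

# Estimate the scaling exponent from a log-log fit: log dos = log c + p log L
p, logc = np.polyfit(np.log(L), np.log(dos), 1)
print(f"fitted exponent p = {p:.3f} (expected -1 for 1/L scaling)")
```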
Fleming, Elizabeth; Gillibrand, Warren
2009-04-01
South Asian people are often perceived as a homogenous group whose culture is prescriptive and constraining. A metasynthesis of how culture influences diabetes self-management in the context of a South Asian population was undertaken. Theory explication was used to deconstruct and reconceptualize the findings of the studies. Eleven publications reported themes of health beliefs, individuality, context, and shared experiences. The results indicate that culture does not influence diabetes self-management in a rigid and prescriptive way; instead, individuals negotiate and interpret culture in a shifting and diverse context. An individualized approach to delivering culturally appropriate nursing care should be taken.
Bell, Christopher; Puttick, Simon; Rose, Stephen; Smith, Jye; Thomas, Paul; Dowson, Nicholas
2017-06-21
Imaging more than one biological process with PET could be of great utility, but despite previously proposed approaches to dual-tracer imaging, it is seldom performed. The alternative of performing multiple scans is often infeasible in clinical practice or even in research studies. Dual-tracer PET scanning allows multiple PET radiotracers to be imaged within the same imaging session. In this paper we describe our approach to utilise the basis pursuit method to aid in the design of dual-tracer PET imaging experiments, and later in separation of the signals. The advantage of this approach is that it does not require a compartment model architecture to be specified, or even that both signals are distinguishable in all cases. This means the method for separating dual-tracer signals can be used for many feasible and useful combinations of biology or radiotracer, once an appropriate scanning protocol has been decided upon. Following a demonstration of separating the signals from two consecutively injected radionuclides in a controlled experiment, phantom and list-mode mouse experiments demonstrated the ability to test the feasibility of dual-tracer imaging protocols for multiple injection delays. Increases in the variances predicted for the kinetic macro-parameters V_D and K_I in brain and tumoral tissue were obtained when separating the synthetically combined data. These experiments confirmed previous work using other approaches showing that injection delays of 10-20 min kept increases in variance minimal for the test tracers used. On this basis, an actual dual-tracer experiment using a 20 min delay was performed with these radiotracers, with the kinetic parameters (V_D and K_I) extracted for each tracer in agreement with the literature. This study supports previous work showing that dual-tracer PET imaging can be accomplished provided certain constraints are adhered to.
The utilisation of basis pursuit techniques, which removes the need to specify a model architecture, allows the feasibility of a range of imaging protocols to be investigated via simulation in a straightforward manner for a wide range of possible scenarios. The hope is that the ease of utilising this approach during feasibility studies and in practice removes any perceived technical barrier to performing dual-tracer imaging.
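As a toy illustration of the signal-separation step, basis pursuit (minimise the l1 norm of the coefficients subject to fitting the combined signal) can be posed as a linear program. The dictionary of mono-exponential washout atoms, the injection times, and the time constants below are assumptions made for this sketch, not the authors' kinetic model.

```python
import numpy as np
from scipy.optimize import linprog

t = np.linspace(0.0, 60.0, 61)               # minutes

def atom(t0, tau):
    """Mono-exponential washout starting at injection time t0 (toy kinetics)."""
    return np.where(t >= t0, np.exp(-(t - t0) / tau), 0.0)

# Dictionary: two candidate time constants per tracer, injections at t=0 and t=20 min
A = np.column_stack([atom(0, 5), atom(0, 20), atom(20, 5), atom(20, 20)])
x_true = np.array([1.0, 0.5, 0.8, 0.3])
y = A @ x_true                                # combined dual-tracer signal

# Basis pursuit: min ||x||_1 s.t. Ax = y, as an LP with x = u - v and u, v >= 0
n = A.shape[1]
res = linprog(c=np.ones(2 * n),
              A_eq=np.hstack([A, -A]), b_eq=y,
              bounds=[(0, None)] * (2 * n), method="highs")
x_hat = res.x[:n] - res.x[n:]

tracer1 = A[:, :2] @ x_hat[:2]                # separated tracer-1 contribution
tracer2 = A[:, 2:] @ x_hat[2:]                # separated tracer-2 contribution
```

Because each atom belongs to a known tracer, summing the fitted contributions per tracer separates the combined curve without committing to a compartment-model architecture.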
What Difference Does a More In-Depth Programme Make to Learning?
ERIC Educational Resources Information Center
Reiss, Athene
2015-01-01
It is virtually axiomatic that a more extended learning experience will have more impact than a one-off experience. But how much difference does it make and is the extended time commitment justified by the results? The Berkshire, Buckinghamshire and Oxfordshire Wildlife Trust (BBOWT) conducted some research to explore this question with regard to…
ERIC Educational Resources Information Center
González-Víllora, Sixto; Serra-Olivares, Jaime; González-Martí, Irene; Hernández-Martínez, Andrea
2012-01-01
People construct knowledge through a set of highly diverse experiences. Despite being personal, this knowledge is strongly influenced by the specific context where it occurs. Such experience-based knowledge is referred to as "implicit theories" because it does not fit in with a systematic and theoretical knowledge context like that of…
Fast globally optimal segmentation of cells in fluorescence microscopy images.
Bergeest, Jan-Philip; Rohr, Karl
2011-01-01
Accurate and efficient segmentation of cells in fluorescence microscopy images is of central importance for the quantification of protein expression in high-throughput screening applications. We propose a new approach for segmenting cell nuclei which is based on active contours and convex energy functionals. Compared to previous work, our approach determines the global solution. Thus, the approach does not suffer from local minima and the segmentation result does not depend on the initialization. We also suggest a numeric approach for efficiently computing the solution. The performance of our approach has been evaluated using fluorescence microscopy images of different cell types. We have also performed a quantitative comparison with previous segmentation approaches.
Groundwater remediation solutions at Hanford
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilmore, T.J.; Truex, M.J.; Williams, M.D.
2007-07-01
In 2006, Congress provided funding to the U.S. Department of Energy (DOE) to study new technologies that could be used to treat contamination from the Hanford Site that might impact the Columbia River. DOE identified three high priority sites that had groundwater contamination migrating towards the Columbia River for remediation. The contaminants included strontium-90, uranium and chromium. A natural systems approach was taken that uses a mass balance concept to frame the problem and determine the most appropriate remedial approach. This approach provides for a scientifically based remedial decision. The technologies selected to address these contaminants included an apatite adsorption barrier coupled with phyto-remediation to address the strontium-90 contamination, injection of polyphosphate into the subsurface to sequester uranium, and a bioremediation approach to reduce chromium contamination in the groundwater. The ability to provide scientifically based approaches to these sites was in large part due to work the Pacific Northwest National Laboratory developed under previous DOE Office of Science and Office of Environmental Management projects. (authors)
Anatomical image-guided fluorescence molecular tomography reconstruction using kernel method
NASA Astrophysics Data System (ADS)
Baikejiang, Reheman; Zhao, Yue; Fite, Brett Z.; Ferrara, Katherine W.; Li, Changqing
2017-05-01
Fluorescence molecular tomography (FMT) is an important in vivo imaging modality to visualize physiological and pathological processes in small animals. However, FMT reconstruction is ill-posed and ill-conditioned due to strong optical scattering in deep tissues, which results in poor spatial resolution. It is well known that FMT image quality can be improved substantially by applying the structural guidance in the FMT reconstruction. An approach to introducing anatomical information into the FMT reconstruction is presented using the kernel method. In contrast to conventional methods that incorporate anatomical information with a Laplacian-type regularization matrix, the proposed method introduces the anatomical guidance into the projection model of FMT. The primary advantage of the proposed method is that it does not require segmentation of targets in the anatomical images. Numerical simulations and phantom experiments have been performed to demonstrate the proposed approach's feasibility. Numerical simulation results indicate that the proposed kernel method can separate two FMT targets with an edge-to-edge distance of 1 mm and is robust to false-positive guidance and inhomogeneity in the anatomical image. For the phantom experiments with two FMT targets, the kernel method has reconstructed both targets successfully, which further validates the proposed kernel method.
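A minimal 1D sketch of the kernel idea, on assumed toy data: build a kernel matrix K from anatomical-intensity similarity, write the unknown fluorescence as x = K·alpha, and fold K into the forward model so the anatomical guidance enters the projection model rather than a Laplacian-type regularization matrix. The sensitivity matrix, kernel width, and diagonal loading below are illustrative choices, not the authors' exact formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
n_vox, n_meas = 50, 80

# Toy 1D "anatomical image": two homogeneous regions (assumed guidance)
anat = np.r_[np.zeros(25), np.ones(25)] + 0.01 * rng.standard_normal(n_vox)

# Kernel from anatomical similarity, K_ij = exp(-(a_i - a_j)^2 / (2 s^2));
# small diagonal loading keeps K well conditioned (an implementation choice)
s = 0.2
K = np.exp(-(anat[:, None] - anat[None, :]) ** 2 / (2 * s**2))
K += 0.05 * np.eye(n_vox)

G = rng.standard_normal((n_meas, n_vox))   # toy sensitivity (forward) matrix
alpha_true = np.zeros(n_vox)
alpha_true[30] = 0.04                      # one kernel coefficient "lights up"
x_true = K @ alpha_true                    # fluorescence follows the anatomy via K
y = G @ x_true                             # noiseless synthetic measurements

# Reconstruct kernel coefficients from y = (G K) alpha, then map back to voxels
GK = G @ K
lam = 1e-8                                 # tiny Tikhonov term for stability
alpha_hat = np.linalg.solve(GK.T @ GK + lam * np.eye(n_vox), GK.T @ y)
x_hat = K @ alpha_hat
```

Because the anatomy enters only through the similarity kernel, no segmentation of targets in the anatomical image is needed, which is the property the abstract highlights.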
Wolfe, Uta; Moran, Amy
2017-01-01
As neuroscience knowledge grows in its scope of societal applications, so does the need to educate a wider audience on how to critically evaluate its research findings. Efforts at finding teaching approaches that are interdisciplinary, accessible and highly applicable to student experience are thus ongoing. The article describes an interdisciplinary undergraduate health course that combines the academic study of contemplative neuroscience with contemplative practice, specifically yoga. The class aims to reach a diverse mix of students by teaching applicable, health-relevant neuroscience material while directly connecting it to first-hand experience. Outcomes indicate success on these goals: The course attracted a wide range of students, including nearly 50% non-science majors. On a pre/post test, students showed large increases in their knowledge of neuroscience. Students’ ratings of the course overall, of increases in positive feelings about its field, and of their progress on specific course objectives were highly positive. Finally, students in their written work applied neuroscience course content to their personal and professional lives. Such results indicate that this approach could serve as a model for the interdisciplinary, accessible and applied integration of relevant neuroscience material into the undergraduate health curriculum. PMID:29371845
Direct magnetic field estimation based on echo planar raw data.
Testud, Frederik; Splitthoff, Daniel Nicolas; Speck, Oliver; Hennig, Jürgen; Zaitsev, Maxim
2010-07-01
Gradient recalled echo echo planar imaging is widely used in functional magnetic resonance imaging. The fast data acquisition is, however, very sensitive to field inhomogeneities which manifest themselves as artifacts in the images. Typically used correction methods have the common deficit that the data for the correction are acquired only once at the beginning of the experiment, assuming the field inhomogeneity distribution B(0) does not change over the course of the experiment. In this paper, methods to extract the magnetic field distribution from the acquired k-space data or from the reconstructed phase image of a gradient echo planar sequence are compared and extended. A common derivation for the presented approaches provides a solid theoretical basis, enables a fair comparison and demonstrates the equivalence of the k-space and the image phase based approaches. The image phase analysis is extended here to calculate the local gradient in the readout direction and improvements are introduced to the echo shift analysis, referred to here as "k-space filtering analysis." The described methods are compared to experimentally acquired B(0) maps in phantoms and in vivo. The k-space filtering analysis presented in this work demonstrated to be the most sensitive method to detect field inhomogeneities.
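For reference, the conventional baseline this paper improves upon, a field map acquired once from two gradient echoes, reduces to a phase-difference formula: the off-resonance frequency is angle(s2·conj(s1)) / (2π·ΔTE). A sketch on simulated data follows; the echo times and the field shape are assumptions, and this is the standard dual-echo mapping, not the paper's k-space filtering analysis.

```python
import numpy as np

shape = (32, 32)
rows, cols = np.indices(shape)

# Assumed smooth off-resonance field (Hz) and uniform magnitude image
f_true = 20.0 * np.exp(-((rows - 16) ** 2 + (cols - 16) ** 2) / 60.0)
mag = np.ones(shape)

te1, te2 = 0.005, 0.0075                      # echo times (s), dTE = 2.5 ms
s1 = mag * np.exp(2j * np.pi * f_true * te1)  # complex gradient-echo images
s2 = mag * np.exp(2j * np.pi * f_true * te2)

# Field map from the phase difference: f = angle(s2 * conj(s1)) / (2*pi*dTE)
f_map = np.angle(s2 * np.conj(s1)) / (2 * np.pi * (te2 - te1))
```

The recovered map is exact here because the simulated field stays within the aliasing limit 1/(2·ΔTE) = 200 Hz; the static acquisition is precisely the limitation the paper addresses by extracting the field from the EPI data themselves.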
Rollins, Brandi Y; Loken, Eric; Savage, Jennifer S; Birch, Leann L
2014-02-01
Parents’ use of restrictive feeding practices is counterproductive, increasing children’s intake of restricted foods and risk for excessive weight gain. The aims of this research were to replicate Fisher and Birch’s (1999b) original findings that short-term restriction increases preschool children’s (3–5 y) selection, intake, and behavioral response to restricted foods, and to identify characteristics of children who were more susceptible to the negative effects of restriction. The experiment used a within-subjects design; 37 children completed the food reinforcement task and heights/weights were measured. Parents reported on their use of restrictive feeding practices and their child’s inhibitory control and approach. Overall, the findings replicated those of Fisher and Birch (1999b) and revealed that the effects of restriction differed by children’s regulatory and appetitive tendencies. Greater increases in intake in response to restriction were observed among children lower in inhibitory control, higher in approach, who found the restricted food highly reinforcing, and who had previous experience with parental use of restriction. Results confirm that the use of restriction does not reduce children’s consumption of these foods, particularly among children with lower regulatory or higher appetitive tendencies.
A Comparison of Methods for a Priori Bias Correction in Soil Moisture Data Assimilation
NASA Technical Reports Server (NTRS)
Kumar, Sujay V.; Reichle, Rolf H.; Harrison, Kenneth W.; Peters-Lidard, Christa D.; Yatheendradas, Soni; Santanello, Joseph A.
2011-01-01
Data assimilation is being increasingly used to merge remotely sensed land surface variables such as soil moisture, snow and skin temperature with estimates from land models. Its success, however, depends on unbiased model predictions and unbiased observations. Here, a suite of continental-scale, synthetic soil moisture assimilation experiments is used to compare two approaches that address typical biases in soil moisture prior to data assimilation: (i) parameter estimation to calibrate the land model to the climatology of the soil moisture observations, and (ii) scaling of the observations to the model's soil moisture climatology. To enable this research, an optimization infrastructure was added to the NASA Land Information System (LIS) that includes gradient-based optimization methods and global, heuristic search algorithms. The land model calibration eliminates the bias but does not necessarily result in more realistic model parameters. Nevertheless, the experiments confirm that model calibration yields assimilation estimates of surface and root zone soil moisture that are as skillful as those obtained through scaling of the observations to the model's climatology. Analysis of innovation diagnostics underlines the importance of addressing bias in soil moisture assimilation and confirms that both approaches adequately address the issue.
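Approach (ii), scaling the observations to the model's soil moisture climatology, is commonly implemented as empirical CDF matching: each observation is mapped to the model value at the same cumulative probability. A sketch with synthetic climatologies (the distributions below are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)

model_clim = rng.normal(0.25, 0.05, 5000)    # model soil moisture climatology
obs_clim = rng.normal(0.35, 0.08, 5000)      # biased observation climatology

def cdf_match(obs, obs_clim, model_clim):
    """Map observations to the model climatology by matching empirical quantiles."""
    p = np.searchsorted(np.sort(obs_clim), obs) / len(obs_clim)  # obs percentile
    return np.quantile(model_clim, np.clip(p, 0.0, 1.0))         # model value at same percentile

obs = rng.normal(0.35, 0.08, 1000)           # new observations to rescale
scaled = cdf_match(obs, obs_clim, model_clim)
```

After matching, the scaled observations share the model's mean and variance, so the assimilation update corrects random errors rather than systematic bias.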
Joucla, Sébastien; Franconville, Romain; Pippow, Andreas; Kloppenburg, Peter; Pouzat, Christophe
2013-08-01
Calcium imaging has become a routine technique in neuroscience for subcellular to network level investigations. The fast progress in the development of new indicators and imaging techniques calls for dedicated, reliable analysis methods. In particular, efficient and quantitative background fluorescence subtraction routines would be beneficial to most of the calcium imaging research field. A method for estimating background-subtracted fluorescence transients that does not require any independent background measurement is therefore developed. This method is based on a fluorescence model fitted to single-trial data using a classical nonlinear regression approach. The model includes an appropriate probabilistic description of the acquisition system's noise leading to accurate confidence intervals on all quantities of interest (background fluorescence, normalized background-subtracted fluorescence time course) when background fluorescence is homogeneous. An automatic procedure detecting background inhomogeneities inside the region of interest is also developed and is shown to be efficient on simulated data. The implementation and performance of the proposed method on experimental recordings from the mouse hypothalamus are presented in detail. This method, which applies to both single-cell and bulk-stained tissue recordings, should help improve the statistical comparison of fluorescence calcium signals between experiments and studies. Copyright © 2013 Elsevier Ltd. All rights reserved.
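The fitting step described here can be sketched as nonlinear regression of a constant background plus a mono-exponential transient using scipy's curve_fit; the authors' fluorescence model and their probabilistic treatment of acquisition noise are richer than this Gaussian least-squares toy.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(5)
t = np.linspace(0.0, 2.0, 200)               # seconds

def model(t, b, a, tau):
    """Constant background b plus a decaying calcium transient (toy model)."""
    return b + a * np.exp(-t / tau)

true = (100.0, 40.0, 0.3)                    # assumed background, amplitude, decay
f = model(t, *true) + rng.normal(0.0, 1.0, t.size)  # noisy fluorescence trace

popt, pcov = curve_fit(model, t, f, p0=(80.0, 20.0, 0.5))
b_hat, a_hat, tau_hat = popt
dff = (f - b_hat) / b_hat                    # background-subtracted, normalized trace
perr = np.sqrt(np.diag(pcov))                # approximate 1-sigma parameter errors
```

The diagonal of pcov provides approximate uncertainty on the background estimate, which is the role the paper's probabilistic noise model fills more rigorously.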
ERIC Educational Resources Information Center
Kultgen, Merrl Kent
2010-01-01
This single-case study addresses the lack of qualitative research describing the instructional role of the superintendent. Guiding this study are two research questions, "How does the goal implementation process as an element of the superintendent's organizational approach impact student success?" and "How does defined autonomy as…
Why carers use adult day respite: a mixed method case study
2014-01-01
Background We need to improve our understanding of the complex interactions between family carers’ emotional relationships with care-recipients and carers’ use of support services. This study assessed carers’ expectations and perceptions of adult day respite services and their commitment to using services. Methods A mixed-method case study approach was used with psychological contract providing a conceptual framework. Data collection was situated within an organisational case study, and the total population of carers from the organisation’s day respite service were approached. Fifty respondents provided quantitative and qualitative data through an interview survey. The conceptual framework was expanded to include Maslow’s hierarchy of needs during analysis. Results Carers prioritised benefits for and experiences of care-recipients when making day respite decisions. Respondents had high levels of trust in the service and perceived that the major benefits for care-recipients were around social interaction and meaningful activity with resultant improved well-being. Carers wanted day respite experiences to include all levels of Maslow’s hierarchy of needs from the provision of physiological care and safety through to the higher levels of belongingness, love and esteem. Conclusion The study suggests carers need to trust that care-recipients will have quality experiences at day respite. This study was intended as a preliminary stage for further research and while not generalizable it does highlight key considerations in carers’ use of day respite services. PMID:24906239
Why carers use adult day respite: a mixed method case study.
Stirling, Christine M; Dwan, Corinna A; McKenzie, Angela R
2014-06-06
We need to improve our understanding of the complex interactions between family carers' emotional relationships with care-recipients and carers' use of support services. This study assessed carers' expectations and perceptions of adult day respite services and their commitment to using services. A mixed-method case study approach was used with psychological contract providing a conceptual framework. Data collection was situated within an organisational case study, and the total population of carers from the organisation's day respite service were approached. Fifty respondents provided quantitative and qualitative data through an interview survey. The conceptual framework was expanded to include Maslow's hierarchy of needs during analysis. Carers prioritised benefits for and experiences of care-recipients when making day respite decisions. Respondents had high levels of trust in the service and perceived that the major benefits for care-recipients were around social interaction and meaningful activity with resultant improved well-being. Carers wanted day respite experiences to include all levels of Maslow's hierarchy of needs from the provision of physiological care and safety through to the higher levels of belongingness, love and esteem. The study suggests carers need to trust that care-recipients will have quality experiences at day respite. This study was intended as a preliminary stage for further research and while not generalizable it does highlight key considerations in carers' use of day respite services.
NASA Astrophysics Data System (ADS)
Glinsky, M.; Hutter, A.; Drozhko, E. G.
2001-12-01
In the early 1990s, international organizations showed great interest in the contamination problems at the PA "Mayak" territory, where liquid radioactive wastes have been stored on the surface, including Lake Karachay, reservoir "Staroye Boloto" and the Techa River cascade reservoirs. As a result of this interest, international contracts funded by DOE (USA), NRRA, EC and DGXL were instituted to study the experience of radioactive waste management accumulated at the PA "Mayak" territory, including proposed rehabilitation of the contaminated territories. However, at the initial stage of international research, the works were not coordinated and often duplicated each other, which was taken by the public and mass media as a serious divergence of opinion among the scientists on the risk assessment for the population. Many years of research resulted in the elaboration of a common scientific approach to the solution of the problems of water resources contamination at the PA "Mayak" territory. A successful experience of coordinating the international projects to study radionuclide migration with surface and ground waters at the PA "Mayak" territory is demonstrated, as well as the risk assessment for the population. Substantiation for rehabilitation measures can be based on long-term predictions and modeling research that are continuing under these international projects.
Tai, Mitchell; Ly, Amanda; Leung, Inne; Nayar, Gautam
2015-01-01
The burgeoning pipeline for new biologic drugs has increased the need for high-throughput process characterization to efficiently use process development resources. Breakthroughs in highly automated and parallelized upstream process development have led to technologies such as the 250-mL automated mini bioreactor (ambr250™) system. Furthermore, developments in modern design of experiments (DoE) have promoted the use of definitive screening design (DSD) as an efficient method to combine factor screening and characterization. Here we utilize the 24-bioreactor ambr250™ system with 10-factor DSD to demonstrate a systematic experimental workflow to efficiently characterize an Escherichia coli (E. coli) fermentation process for recombinant protein production. The generated process model is further validated by laboratory-scale experiments and shows how the strategy is useful for quality by design (QbD) approaches to control strategies for late-stage characterization. © 2015 American Institute of Chemical Engineers.
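The kind of model a definitive screening design supports, i.e. main effects, two-factor interactions, and pure quadratics, can be fit by ordinary least squares once the runs are complete. A generic 3-factor sketch on a synthetic design follows; it illustrates the model form only, not the 10-factor ambr250™ study or DSD run construction.

```python
import numpy as np

rng = np.random.default_rng(11)
n_runs, k = 30, 3
X = rng.uniform(-1.0, 1.0, (n_runs, k))      # coded factor settings in [-1, 1]

def model_matrix(X):
    """Columns: intercept, main effects, two-factor interactions, pure quadratics."""
    cols = [np.ones(len(X))]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    cols += [X[:, i] ** 2 for i in range(k)]
    return np.column_stack(cols)

# Assumed "true" process coefficients for the synthetic response
beta_true = np.array([5.0, 1.2, -0.8, 0.5, 0.3, 0.0, -0.4, 0.9, 0.0, -0.6])
y = model_matrix(X) @ beta_true              # noiseless response at each run

beta_hat, *_ = np.linalg.lstsq(model_matrix(X), y, rcond=None)
```

With noisy data the same fit yields the response-surface model used for characterization; near-zero estimated coefficients identify factors that can be dropped, the screening half of the DSD's dual role.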
Optical cuff for optogenetic control of the peripheral nervous system.
Michoud, Frédéric; Sottas, Loïc; Browne, Liam E; Asboth, Léonie; Latremoliere, Alban; Sakuma, Miyuki; Courtine, Grégoire; Woolf, Clifford J; Lacour, Stéphanie P
2018-02-01
Nerves in the peripheral nervous system (PNS) contain axons with specific motor, somatosensory and autonomic functions. Optogenetics offers an efficient approach to selectively activate axons within the nerve. However, the heterogeneous nature of nerves and their tortuous route through the body create a challenging environment in which to reliably implant a light delivery interface. Here, we propose an optical peripheral nerve interface, an optocuff, so that optogenetic modulation of peripheral nerves becomes possible in freely behaving mice. Using this optocuff, we demonstrate orderly recruitment of motor units with epineural optical stimulation of genetically targeted sciatic nerve axons, both in anaesthetized and in awake, freely behaving animals. Behavioural experiments and histology show that the optocuff does not damage the nerve and is thus suitable for long-term experiments. These results suggest that the soft optocuff might be a straightforward and efficient tool to support more extensive study of the PNS using optogenetics.
Coaxial Helicity Injection experiments in NSTX*
NASA Astrophysics Data System (ADS)
Raman, R.; Jarboe, T. R.; Gates, D.; Mueller, D.; Schaffer, M. J.; Maqueda, R.; Nelson, B. A.; Menard, J.; Soukhanovskii, V.; Paul, S.; Jardin, S.; Skinner, C. H.; Sabbagh, S.; Paoletti, F.; Stutman, D.; Lao, L.; Nagata, M.
2001-10-01
Coaxial helicity injection (CHI) can potentially eliminate inductive startup and thus the induction solenoid in spherical tori (ST), thereby greatly improving the ST fusion concept. CHI experiments on NSTX have produced 360 kA of toroidal current using about 25 kA of injector current. These have been produced in the preferred 'narrow flux footprint' condition in pulses that were sustained for 300 ms. A rotating n=1 mode, previously observed in optimized CHI-driven discharges on smaller STs and deemed necessary for transporting edge-driven current to the interior of the discharge, has been observed for the first time in NSTX CHI discharges. The flux utilization efficiency continues to be high, approaching 100%. The EFIT and TSC codes are being used to assess flux closure. This work is supported by US DOE contract numbers DE-AC02-76CH03073 and DE-AC05-00R22725.
LVAD patients' and surrogates' perspectives on SPIRIT-HF: An advance care planning discussion.
Metzger, Maureen; Song, Mi-Kyung; Devane-Johnson, Stephanie
2016-01-01
To describe LVAD patients' and surrogates' experiences with, and perspectives on, SPIRIT-HF, an advance care planning (ACP) intervention. ACP is important for patients with LVADs, yet little is known about their experiences or those of their surrogates who have participated in ACP discussions. We used qualitative content analysis techniques to conduct a secondary analysis of 28 interviews with patients with LVADs (n = 14) and their surrogates (n = 14) who had participated in an RCT pilot study of SPIRIT-HF. Main themes from the data include: 1) sharing their HF stories was very beneficial; 2) participating in SPIRIT-HF led to greater peace of mind for patients and surrogates; 3) "one size does not fit all" when it comes to the timing of ACP discussions. An understanding of patient and surrogate perspectives may inform clinicians' approach to ACP discussions. Copyright © 2016 Elsevier Inc. All rights reserved.
Building a Science Communication Culture: One Agency's Approach
NASA Astrophysics Data System (ADS)
DeWitt, S.; Tenenbaum, L. F.; Betz, L.
2014-12-01
Science communication does not have to be a solitary practice. And yet, many scientists go about it alone and with little support from their peers and organizations. To strengthen community and build support for science communicators, NASA designed a training course aimed at two goals: 1) to develop individual scientists' communication skills, and 2) to begin to build a science communication culture at the agency. NASA offered a pilot version of this training course in 2014: the agency's first multidisciplinary face-to-face learning experience for science communicators. Twenty-six Earth, space and life scientists from ten field centers came together for three days of learning. They took part in fundamental skill-building exercises, individual development planning, and high-impact team projects. This presentation will describe the course design and learning objectives, the experience of the participants, and the evaluation results that will inform future offerings of communication training for NASA scientists and others.
NASA Astrophysics Data System (ADS)
Roth, Wolff-Michael
2015-06-01
For many students, the experience with science tends to be alienating and uprooting. In this study, I take up Simone Weil's concepts of enracinement (rooting) and déracinement (uprooting) to theorize the root of this alienation, the confrontation between children's familiarity with the world and unfamiliar/strange scientific conceptions. I build on the works of the phenomenological philosopher Edmund Husserl and the German physics educator Martin Wagenschein (who directly refers to Weil's concepts) to make a case for the rooting function of original/originary experiences and the genetic method to science teaching. The genetic approach allows students to retain their foundational familiarity with the world and their descriptions thereof all the while evolving other (more scientific) ways of explaining natural phenomena.
The Marble Experiment: Overview and Simulations
NASA Astrophysics Data System (ADS)
Douglas, M. R.; Murphy, T. J.; Cobble, J. A.; Fincke, J. R.; Haines, B. M.; Hamilton, C. E.; Lee, M. N.; Oertel, J. A.; Olson, R. E.; Randolph, R. B.; Schmidt, D. W.; Shah, R. C.; Smidt, J. M.; Tregillis, I. L.
2015-11-01
The Marble ICF platform has recently been launched on both OMEGA and NIF with the goal of investigating the influence of heterogeneous mix on fusion burn. The unique separated-reactant capsule design consists of an "engineered" CH capsule filled with deuterated plastic foam containing pores, or voids, that are filled with tritium gas. Initially the deuterium and tritium are separated, but as the implosion proceeds, the D and T mix, producing a DT signature. The results of these experiments will be used to inform a probability density function (PDF) burn modelling approach for unresolved cell morphology. Initial targets for platform development have consisted of either fine-pore foams or gas mixtures, with the goal of fielding the engineered foams in 2016. An overview of the Marble experimental campaign will be presented and simulations will be discussed. This work is supported by US DOE/NNSA, performed at LANL, operated by LANS LLC under contract DE-AC52-06NA25396.
Spectrally resolved visualization of fluorescent dyes permeating into skin
NASA Astrophysics Data System (ADS)
Maeder, Ulf; Bergmann, Thorsten; Beer, Sebastian; Burg, Jan Michael; Schmidts, Thomas; Runkel, Frank; Fiebich, Martin
2012-03-01
We present a spectrally resolved confocal imaging approach to qualitatively assess the overall uptake and the penetration depth of fluorescent dyes into biological tissue. We use a confocal microscope with a spectral resolution of 5 nm to measure porcine skin tissue after performing a Franz diffusion experiment with a submicron emulsion enriched with the fluorescent dye Nile Red. The evaluation uses linear unmixing of the dye and tissue autofluorescence spectra. The results are combined with a manual segmentation of the skin's epidermis and dermis layers to assess the penetration behavior in addition to the overall uptake. The diffusion experiments, performed for 3 h and 24 h, show a 3-fold increased dye uptake in the epidermis and dermis for the 24 h samples. Because the method is based on spectral information, it does not face the problem of superimposed dye and tissue spectra and is therefore more precise than intensity-based evaluation methods.
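The linear unmixing step described above can be sketched as an ordinary least-squares fit of each measured spectrum onto known endmember spectra. The Gaussian endmembers below are hypothetical stand-ins for the Nile Red emission and the tissue autofluorescence, not measured data:

```python
import numpy as np

# Hypothetical endmember spectra on a 5 nm grid (assumed shapes, not
# measured Nile Red / autofluorescence data).
wl = np.arange(500, 701, 5, dtype=float)              # wavelengths, nm
dye = np.exp(-((wl - 635.0) / 25.0) ** 2)             # "dye" emission
auto = np.exp(-((wl - 550.0) / 60.0) ** 2)            # "autofluorescence"
A = np.column_stack([dye, auto])                      # endmember matrix

# Synthetic pixel spectrum: 30% dye, 70% autofluorescence, plus noise.
rng = np.random.default_rng(1)
pixel = 0.3 * dye + 0.7 * auto + rng.normal(0.0, 0.01, wl.size)

# Unmix: solve A @ c ~= pixel in the least-squares sense.
coeffs, *_ = np.linalg.lstsq(A, pixel, rcond=None)
print(coeffs)  # close to [0.3, 0.7]
```

Because the fit uses the full spectral shape of each component, overlapping dye and tissue emission at any single wavelength does not confound the abundance estimate.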
NASA Astrophysics Data System (ADS)
Hawkins, Cameron; Tschuaner, Oliver; Fussell, Zachary; Smith, Jesse
2017-06-01
A novel approach that spatially identifies inhomogeneities from the microscale (defects, conformational disorder) to the mesoscale (voids, inclusions) is developed using synchrotron x-ray methods: tomography, Lang topography, and micro-diffraction mapping. These techniques provide a non-destructive method for characterization of mm-sized samples prior to shock experiments. These characterization maps can be used to correlate continuum-level measurements in shock compression experiments to the mesoscale and microscale structure. Specifically examined is a sample of C4. We show extensive conformational disorder in gamma-RDX, which is the main component. Further, we observe that the minor HMX component in C4 contains at least two different phases: alpha- and beta-HMX. This work was supported by National Security Technologies, LLC, under Contract No. DE-AC52-06NA25946 with the U.S. Department of Energy and by the Site-Directed Research and Development Program. DOE/NV/25946-3071.
Dynamic Conductivity and Partial Ionization in Warm, Dense Hydrogen
NASA Astrophysics Data System (ADS)
Zaghoo, M.; Silvera, I. F.
2017-10-01
A theoretical description for optical conduction experiments in dense fluid hydrogen is presented. Different quantum statistical approaches are used to describe the mechanism of electron transport in hydrogen's high-temperature dense phase. We show that at the onset of the metallic transition, optical conduction could be described by a strong rise in the atomic polarizability, resulting from increased ionization; whereas in the highly degenerate limit, the Ziman weak-scattering model better describes the observed saturation of reflectance. In the highly degenerate region, the inclusion of partial ionization effects provides excellent agreement with experimental results. Hydrogen's fluid metallic state is revealed to be a partially ionized free-electron plasma. These results provide a crucial benchmark for ab initio calculations as well as an important guide for future experiments. Research supported by DOE Stockpile Stewardship Academic Alliance Program, Grant DE-FG52-10NA29656, and NASA Earth and Space Science Fellowship Program, Award NNX14AP17H.
Botha-Ravyse, Chrisna; Blignaut, Seugnet
2017-01-01
Early adoption of technology is a struggle well known to early adopters, and now to me. As the demand to use and implement technology in health professions' education has increased, I have adopted various technologies, often with many headaches along the way. This paper addresses my experiences in developing and implementing technology in health science classrooms in a setting not adequately equipped to do so. After reflecting on my experiences, I conclude that it is crucial that systems help innovators and early adopters as they work to develop and implement teaching and learning technology. Technical decisions should address the needs of the higher education educator. In addition, once an institution chooses a specific technological approach, such as using e-guides, there should be resources in place to support the forerunners of these initiatives.
Single-molecule detection of dihydroazulene photo-thermal reaction using break junction technique
NASA Astrophysics Data System (ADS)
Huang, Cancan; Jevric, Martyn; Borges, Anders; Olsen, Stine T.; Hamill, Joseph M.; Zheng, Jue-Ting; Yang, Yang; Rudnev, Alexander; Baghernejad, Masoud; Broekmann, Peter; Petersen, Anne Ugleholdt; Wandlowski, Thomas; Mikkelsen, Kurt V.; Solomon, Gemma C.; Brøndsted Nielsen, Mogens; Hong, Wenjing
2017-05-01
Charge transport by tunnelling is one of the most ubiquitous elementary processes in nature. Small structural changes in a molecular junction can lead to significant differences in the single-molecule electronic properties, offering a tremendous opportunity to examine a reaction on the single-molecule scale by monitoring the conductance changes. Here, we explore the potential of the single-molecule break junction technique in the detection of photo-thermal reaction processes of a photochromic dihydroazulene/vinylheptafulvene system. Statistical analysis of the break junction experiments provides a quantitative approach for probing the reaction kinetics and reversibility, including the occurrence of isomerization during the reaction. The product ratios observed when switching the system in the junction do not follow those observed in solution studies (both experiment and theory), suggesting that the junction environment was perturbing the process significantly. This study opens the possibility of using nano-structured environments like molecular junctions to tailor product ratios in chemical reactions.
Introduction to the special issue: parsimony and redundancy in models of language.
Wiechmann, Daniel; Kerz, Elma; Snider, Neal; Jaeger, T Florian
2013-09-01
One of the most fundamental goals in linguistic theory is to understand the nature of linguistic knowledge, that is, the representations and mechanisms that figure in a cognitively plausible model of human language processing. The past 50 years have witnessed the development and refinement of various theories about what kind of 'stuff' human knowledge of language consists of, and technological advances now permit the development of increasingly sophisticated computational models implementing key assumptions of different theories from both rationalist and empiricist perspectives. The present special issue does not aim to present or discuss the arguments for and against the two epistemological stances or discuss evidence that supports either of them (cf. Bod, Hay, & Jannedy, 2003; Christiansen & Chater, 2008; Hauser, Chomsky, & Fitch, 2002; Oaksford & Chater, 2007; O'Donnell, Hauser, & Fitch, 2005). Rather, the research presented in this issue, which we label usage-based here, conceives of linguistic knowledge as being induced from experience. According to the strongest of such accounts, the acquisition and processing of language can be explained with reference to general cognitive mechanisms alone (rather than with reference to innate language-specific mechanisms). Defined in these terms, usage-based approaches encompass approaches referred to as experience-based, performance-based and/or emergentist approaches (Arnon & Snider, 2010; Bannard, Lieven, & Tomasello, 2009; Bannard & Matthews, 2008; Chater & Manning, 2006; Clark & Lappin, 2010; Gerken, Wilson, & Lewis, 2005; Gomez, 2002;
Garber, Susan L
Every day, in clinics and hospitals around the world, occupational therapists care for patients with serious problems requiring viable solutions. Each patient is unique, and his or her problem does not necessarily correspond to existing practice models. Practitioners must adapt standard approaches to provide effective outcomes, yet problems exist for which few or no beneficial approaches have been identified. Such clinical issues require solutions to be generated de novo from the practitioner's body of knowledge and past experience. Yet, no single new intervention can be used without prior validation of its efficacy. Only a therapist with a prepared mind can accept such challenges, recognize what is known and not yet known, design studies to acquire that needed knowledge, and translate it into successful clinical treatment strategies. The occupational therapist with a prepared mind is one willing to seize unexpected opportunities and construct new paradigms of practice. Innovation through scientific inquiry requires a prepared mind. Copyright © 2016 by the American Occupational Therapy Association, Inc.
Friction Laws Derived From the Acoustic Emissions of a Laboratory Fault by Machine Learning
NASA Astrophysics Data System (ADS)
Rouet-Leduc, B.; Hulbert, C.; Ren, C. X.; Bolton, D. C.; Marone, C.; Johnson, P. A.
2017-12-01
Fault friction controls nearly all aspects of fault rupture, yet it can only be measured in the laboratory. Here we describe laboratory experiments in which acoustic emissions are recorded from the fault. We find that by applying a machine learning approach known as "extreme gradient boosting trees" to the continuous acoustical signal, the fault friction can be directly inferred, showing that instantaneous characteristics of the acoustic signal are a fingerprint of the frictional state. This machine-learning-based inference leads to a simple law that links the acoustic signal to the friction state, and holds for every stress cycle the laboratory fault goes through. The approach uses no measured parameter other than instantaneous statistics of the acoustic signal. This finding may have importance for inferring frictional characteristics from seismic waves in the Earth, where fault friction cannot be measured.
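A minimal sketch of this statistics-to-friction regression, substituting scikit-learn's GradientBoostingRegressor for the XGBoost implementation named in the abstract, on a synthetic signal whose amplitude encodes a hidden "friction" state (all names and numbers here are illustrative assumptions):

```python
import numpy as np
from scipy.stats import kurtosis, skew
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic "acoustic" signal: white noise whose amplitude is slowly
# modulated by a hidden "friction" state we try to recover.
n_windows, win = 400, 256
friction = 0.5 + 0.4 * np.sin(np.linspace(0, 6 * np.pi, n_windows))
signal = rng.normal(size=(n_windows, win)) * friction[:, None]

# Instantaneous statistics of each window serve as the only features.
X = np.column_stack([
    signal.mean(axis=1),
    signal.std(axis=1),
    skew(signal, axis=1),
    kurtosis(signal, axis=1),
    np.percentile(signal, 90, axis=1),
])

model = GradientBoostingRegressor(random_state=0).fit(X, friction)
print(model.score(X, friction))  # training R^2, close to 1 here
```

The point mirrors the abstract's claim: windowed statistics of the continuous signal alone, with no other measured parameter, suffice to regress the frictional state.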
Blank, Hartmut
2005-02-01
Traditionally, the causes of interference phenomena were sought in "real" or "hard" memory processes such as unlearning, response competition, or inhibition, which serve to reduce the accessibility of target items. I propose an alternative approach which does not deny the influence of such processes but highlights a second, equally important, source of interference-the conversion (Tulving, 1983) of accessible memory information into memory performance. Conversion is conceived as a problem-solving-like activity in which the rememberer tries to find solutions to a memory task. Conversion-based interference effects are traced to different conversion processes in the experimental and control conditions of interference designs. I present a simple theoretical model that quantitatively predicts the resulting amount of interference. In two paired-associate learning experiments using two different types of memory tests, these predictions were corroborated. Relations of the present approach to traditional accounts of interference phenomena and implications for eyewitness testimony are discussed.
Sustainable reduction of bioreactor contamination in an industrial fermentation pilot plant.
Junker, Beth; Lester, Michael; Leporati, James; Schmitt, John; Kovatch, Michael; Borysewicz, Stan; Maciejak, Waldemar; Seeley, Anna; Hesse, Michelle; Connors, Neal; Brix, Thomas; Creveling, Eric; Salmon, Peter
2006-10-01
Facility experience, primarily with drug-oriented fermentation equipment (producing small molecules such as secondary metabolites, bioconversions, and enzymes) and, to a lesser extent, with biologics-oriented fermentation equipment (producing large molecules such as recombinant proteins and microbial vaccines), in an industrial fermentation pilot plant over the past 15 years is described. Potential approaches for equipment design and maintenance, operational procedures, validation/verification testing, medium selection, culture purity/sterility analysis, and contamination investigation are presented, and those approaches that were implemented are identified. Failure data collected over nearly 15 years of pilot plant operation are presented, and best practices for documentation and tracking are outlined. This analysis does not exhaustively discuss available design, operational and procedural options; rather, it selectively presents what has been determined to be beneficial in an industrial pilot plant setting. Literature references have been incorporated to provide background and context where appropriate.
Nonlinear mechanical resonators for ultra-sensitive mass detection
NASA Astrophysics Data System (ADS)
Datskos, P. G.; Lavrik, N. V.
2014-10-01
The fundamental sensitivity limit of an appropriately scaled-down mechanical resonator can approach one atomic mass unit when only thermal noise is present in the system. However, operation of such nanoscale mechanical resonators is very challenging due to the minuteness of their oscillation amplitudes and the presence of multiple noise sources in real experimental environments. In order to surmount these challenges, we use microscale cantilever resonators driven to large amplitudes, far beyond their nonlinear instability onset. Our experiments show that such a nonlinear cantilever resonator, described analytically as a Duffing oscillator, has mass sensing performance comparable to that of much smaller resonators operating in a linear regime. We demonstrate femtogram-level mass sensing that relies on bifurcation point tracking and does not require any complex readout means. Our approach enables straightforward detection of mass changes that are near the fundamental limit imposed by thermo-mechanical fluctuations.
ID201202961, DOE S-124,539, Information Security Analysis Using Game Theory and Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abercrombie, Robert K; Schlicher, Bob G
Information security analysis can be performed using game theory implemented in dynamic simulations of Agent Based Models (ABMs). Such simulations can be verified with the results from game theory analysis and further used to explore larger scale, real world scenarios involving multiple attackers, defenders, and information assets. Our approach addresses imperfect information and scalability, which allows us to overcome limitations of current stochastic game models. Such models consider only perfect information, assuming that the defender is always able to detect attacks; that the state transition probabilities are fixed before the game; and that the players' actions are always synchronous. Moreover, most such models are not scalable with the size and complexity of the systems under consideration. Our use of ABMs yields results of selected experiments that demonstrate our proposed approach and provides a quantitative measure for realistic information systems and their related security scenarios.
Depth profile measurement with lenslet images of the plenoptic camera
NASA Astrophysics Data System (ADS)
Yang, Peng; Wang, Zhaomin; Zhang, Wei; Zhao, Hongying; Qu, Weijuan; Zhao, Haimeng; Asundi, Anand; Yan, Lei
2018-03-01
An approach for carrying out depth profile measurement of an object with the plenoptic camera is proposed. A single plenoptic image consists of multiple lenslet images. First, these images are processed directly with a refocusing technique to obtain the depth map, without the need to align and decode the plenoptic image. Then, a linear depth calibration is applied, based on the optical structure of the plenoptic camera, for depth profile reconstruction. One significant improvement of the proposed method concerns the resolution of the depth map. Unlike the traditional method, our resolution is not limited by the number of microlenses inside the camera, and the depth map can be globally optimized. We validated the method with experiments on depth map reconstruction, depth calibration, and depth profile measurement, with the results indicating that the proposed approach is both efficient and accurate.
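The linear depth calibration step can be sketched as a least-squares fit between raw depth-map values and known target distances. The calibration pairs below are made-up numbers for illustration; a real calibration would use measured targets at known distances:

```python
import numpy as np

# Hypothetical calibration data: raw depth-map values from refocusing
# vs. ground-truth distances of calibration targets (mm).
raw = np.array([1.2, 2.1, 3.0, 3.8, 5.1, 6.0])
true_mm = np.array([100.0, 150.0, 200.0, 250.0, 300.0, 350.0])

# Fit true = a * raw + b (the linear calibration model).
a, b = np.polyfit(raw, true_mm, deg=1)

def calibrate(depth_map):
    """Convert a raw depth map to metric depth via the linear model."""
    return a * np.asarray(depth_map) + b

print(calibrate(4.5))  # interpolated metric depth, mm
```

Once `a` and `b` are known, the same two numbers convert an entire raw depth map to metric units in one vectorized operation.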
NASA Astrophysics Data System (ADS)
Hogri, Roni; Bamford, Simeon A.; Taub, Aryeh H.; Magal, Ari; Giudice, Paolo Del; Mintz, Matti
2015-02-01
Neuroprostheses could potentially recover functions lost due to neural damage. Typical neuroprostheses connect an intact brain with the external environment, thus replacing damaged sensory or motor pathways. Recently, closed-loop neuroprostheses, bidirectionally interfaced with the brain, have begun to emerge, offering an opportunity to substitute malfunctioning brain structures. In this proof-of-concept study, we demonstrate a neuro-inspired model-based approach to neuroprostheses. A VLSI chip was designed to implement essential cerebellar synaptic plasticity rules, and was interfaced with cerebellar input and output nuclei in real time, thus reproducing cerebellum-dependent learning in anesthetized rats. Such a model-based approach does not require prior system identification, allowing for de novo experience-based learning in the brain-chip hybrid, with potential clinical advantages and limitations when compared to existing parametric "black box" models.
Structure, form, and meaning in the mental lexicon: evidence from Arabic
Boudelaa, Sami; Marslen-Wilson, William D.
2015-01-01
Does the organization of the mental lexicon reflect the combination of abstract underlying morphemic units or the concatenation of word-level phonological units? We address these fundamental issues in Arabic, a Semitic language where every surface form is potentially analyzable into abstract morphemic units – the word pattern and the root – and where this view contrasts with stem-based approaches, chiefly driven by linguistic considerations, in which neither roots nor word patterns play independent roles in word formation and lexical representation. Five cross-modal priming experiments examine the processing of morphologically complex forms in the three major subdivisions of the Arabic lexicon – deverbal nouns, verbs, and primitive nouns. The results demonstrate that root and word pattern morphemes function as abstract cognitive entities, operating independently of semantic factors and dissociable from possible phonological confounds, while stem-based approaches consistently fail to accommodate the basic psycholinguistic properties of the Arabic mental lexicon. PMID:26682237
Multi-subject Manifold Alignment of Functional Network Structures via Joint Diagonalization.
Nenning, Karl-Heinz; Kollndorfer, Kathrin; Schöpf, Veronika; Prayer, Daniela; Langs, Georg
2015-01-01
Functional magnetic resonance imaging group studies rely on the ability to establish correspondence across individuals. This enables location-specific comparison of functional brain characteristics. Registration is often based on morphology and does not take variability of functional localization into account. This can lead to a loss of specificity, or confounds when studying diseases. In this paper we propose multi-subject functional registration by manifold alignment via coupled joint diagonalization. The functional network structure of each subject is encoded in a diffusion map, where functional relationships are decoupled from spatial position. Two-step manifold alignment estimates initial correspondences between functionally equivalent regions. Then, coupled joint diagonalization establishes common eigenbases across all individuals, and refines the functional correspondences. We evaluate our approach on fMRI data acquired during a language paradigm. Experiments demonstrate the benefits in matching accuracy achieved by coupled joint diagonalization compared to previously proposed functional alignment approaches, or alignment based on structural correspondences.
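The diffusion-map encoding described (functional relationships decoupled from spatial position) follows a standard construction: a Gaussian affinity between node time courses, row-normalization into a Markov matrix, and an eigendecomposition whose leading non-trivial eigenvectors give the embedding. A minimal sketch on synthetic data, with all sizes and parameters chosen for illustration:

```python
import numpy as np

def diffusion_map(X, eps=1.0, n_components=2):
    """Classic diffusion-map embedding of the rows of X.

    X: (n_nodes, n_timepoints) array, e.g. regional fMRI time courses.
    Returns an (n_nodes, n_components) embedding built from the leading
    non-trivial eigenvectors of the row-stochastic Markov matrix.
    """
    # Pairwise squared distances between node time courses.
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / eps)                  # Gaussian affinity
    P = K / K.sum(axis=1, keepdims=True)   # row-stochastic Markov matrix
    vals, vecs = np.linalg.eig(P)
    order = np.argsort(-vals.real)         # eigenvalues, descending
    # Skip the trivial constant eigenvector (eigenvalue 1).
    idx = order[1:1 + n_components]
    return vecs.real[:, idx] * vals.real[idx]

rng = np.random.default_rng(0)
emb = diffusion_map(rng.normal(size=(20, 50)))
print(emb.shape)  # (20, 2)
```

In this coordinate system, two nodes are close when their dynamics are similar, regardless of where they sit anatomically, which is what makes cross-subject alignment of the embeddings meaningful.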
Using Wannier functions to improve solid band gap predictions in density functional theory
Ma, Jie; Wang, Lin-Wang
2016-04-26
Enforcing a straight-line condition of the total energy upon removal/addition of fractional electrons on eigenstates has been successfully applied to atoms and molecules for calculating ionization potentials and electron affinities, but fails for solids due to the extended nature of the eigenorbitals. Here we have extended the straight-line condition to the removal/addition of fractional electrons on Wannier functions constructed within the occupied/unoccupied subspaces. This removes the self-interaction energies of those Wannier functions, and yields accurate band gaps for solids compared to experiments. It does not have any adjustable parameters, and the computational cost is at the DFT level. This method also works for molecules, providing eigenenergies in good agreement with experimental ionization potentials and electron affinities. Our approach can be viewed as an alternative to the standard LDA+U procedure.
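The straight-line condition invoked here can be written, for removing a fraction $f$ of an electron from an occupied orbital (standard notation sketched from the general straight-line/flat-plane literature, not the paper's own equations):

```latex
E(N - f) \;=\; (1 - f)\,E(N) \;+\; f\,E(N - 1), \qquad 0 \le f \le 1,
```

so that $\partial E/\partial f$ is constant and any spurious curvature (the self-interaction of the orbital whose occupation is varied) vanishes; the analogous condition with $E(N+1)$ applies to adding fractional charge to an unoccupied Wannier function.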
González, Javier; Angulo, J; Ciancio, G
2011-04-01
Renal cell cancer presents with tumor thrombus in 4-15% of cases. The prognostic significance of this entity has been the object of intense debate. Nowadays it is considered that the presence of the thrombus itself does not have a negative prognostic impact on survival rates if the thrombus can be excised satisfactorily. Complete removal of renal malignant tissue is the only curative strategy for the treatment of this kind of tumor. During the last three decades, there have been steady improvements in surgical technique and preoperative care that have favorably modified surgeons' ability to safely excise these tumors. In this sense, the experience provided by multiorgan, kidney-pancreas and liver procurement and transplantation techniques has led urologists to reexamine their approaches to the inferior vena cava and retroperitoneum, and these techniques can prove useful in the always challenging resection of these complex tumors with neoplastic extension into the vena cava.
Mazzitelli, S; Tosi, A; Balestra, C; Nastruzzi, C; Luca, G; Mancuso, F; Calafiore, R; Calvitti, M
2008-09-01
The optimization, through a Design of Experiments (DoE) approach, of a microencapsulation procedure for isolated neonatal porcine islets (NPI) is described. The applied method is based on the generation of monodisperse droplets by a vibrational nozzle. An alginate/polyornithine encapsulation procedure, developed and validated in our laboratory for almost a decade, was used to encapsulate the pancreatic islets. We analyzed different experimental parameters, including frequency of vibration, amplitude of vibration, polymer pumping rate, and distance between the nozzle and the gelling bath. We produced calcium-alginate gel microbeads with excellent morphological characteristics as well as a very narrow size distribution. The automatically produced microcapsules did not alter the morphology, viability or functional properties of the enveloped NPI. The optimization of this automatic procedure may provide a novel approach to obtaining a large number of batches, possibly suitable for large-scale production of immunoisolated NPI for in vivo cell transplantation procedures in humans.
The sustainability solutions agenda.
Sarewitz, Daniel; Clapp, Richard; Crumbley, Cathy; Kriebel, David; Tickner, Joel
2012-01-01
Progress toward a more sustainable society is usually described in a "knowledge-first" framework, where science characterizes a problem in terms of its causes and mechanisms as a basis for subsequent action. Here we present a different approach, a Sustainability Solutions Agenda (SSA), which seeks from the outset to identify the possible pathways to solutions. SSA focuses on uncovering paths to sustainability by improving current technological practice, and applying existing knowledge to identify and evaluate technological alternatives. SSA allows people and organizations to transition toward greater sustainability without sacrificing essential technological functions, and therefore does not threaten the interests that depend on those functions. Whereas knowledge-first approaches view scientific information as sufficient to convince people to take the right actions, even if those actions are perceived as against their immediate interests, SSA allows values to evolve toward greater attention to sustainability as a result of the positive experience of solving a problem.
Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics
Dowding, Irene; Haufe, Stefan
2018-01-01
Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This “naive” approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment. PMID:29615885
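The sufficient-summary-statistic idea the authors highlight can be sketched as a precision-weighted group-level test: each subject contributes a mean and a standard error, and subjects are combined by inverse-variance weighting rather than entering a naive t-test on the means alone. The simulation below is synthetic, and the exact weighting schemes in the paper differ in detail:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic nested data: 15 subjects share a common effect but have
# very different trial counts, hence very different precisions.
n_sub, true_effect = 15, 0.3
means, sems = [], []
for _ in range(n_sub):
    n_trials = int(rng.integers(10, 400))
    x = rng.normal(true_effect, 1.0, size=n_trials)
    means.append(x.mean())
    sems.append(x.std(ddof=1) / np.sqrt(n_trials))
means, sems = np.array(means), np.array(sems)

# Naive approach: t-test on subject means, ignoring precision.
t_naive, p_naive = stats.ttest_1samp(means, 0.0)

# Sufficient-summary-statistic sketch: inverse-variance weighted mean
# tested with a z statistic (fixed-effects combination).
w = 1.0 / sems**2
z = (w @ means) / np.sqrt(w.sum())
p_weighted = 2 * stats.norm.sf(abs(z))

print(p_naive, p_weighted)
```

The weighted statistic exploits the within-subject variances that the naive t-test on subject means throws away, which is the source of the power gain the abstract describes.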
Pals, Regitze A S; Olesen, Kasper; Willaing, Ingrid
2016-06-01
To explore the effects of the Next Education (NEED) patient education approach in diabetes education. We tested the use of the NEED approach at eight intervention sites (n=193). Six additional sites served as controls (n=58). Data were collected through questionnaires, interviews and observations. We analysed data using descriptive statistics, logistic regression and systematic text condensation. Results from logistic regression demonstrated better overall assessment of education program experiences and enhanced self-reported improvements in maintaining medications correctly among patients from intervention sites, as compared to control sites. Interviews and observations suggested that improvements in health behavior could be explained by mechanisms related to the education setting, including using person-centeredness and dialogue. However, similar mechanisms were observed at control sites. Observations suggested that the quality of group dynamics, patients' motivation and educators' ability to facilitate participation in education, supported by the NEED approach, contributed to better results at intervention sites. The use of participatory approaches and, in particular, the NEED patient education approach in group-based diabetes education improved self-management skills and health behavior outcomes among individuals with diabetes. The use of dialogue tools in diabetes education is advised for educators. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Dorati, Rossella; DeTrizio, Antonella; Genta, Ida; Grisoli, Pietro; Merelli, Alessia; Tomasi, Corrado; Conti, Bice
2016-01-01
The present paper takes into account the DOE application to the preparation process of biodegradable microspheres for osteomyelitis local therapy. With this goal, gentamicin-loaded polylactide-co-glycolide-co-polyethyleneglycol (PLGA-PEG) microspheres were prepared and investigated. Two preparation protocols (o/w and w/o/w) with different process conditions, and three PLGA-PEG block copolymers with different compositions of lactic and glycolic acids and PEG, were tested. A Design of Experiment (DOE) screening design was applied as an approach to scaling up the manufacturing step. The results of the DOE screening design confirmed that the w/o/w technique, the presence of salt and the 15% w/v polymer concentration positively affected the EE% (72.1-97.5%) and the span values of the particle size distribution (1.03-1.23), while salt addition alone negatively affected the process yield. Process scale-up resulted in a decrease of gentamicin EE%, which can be attributed to the high volume of water used to remove PVA and NaCl residues. The results of the in vitro gentamicin release study show prolonged gentamicin release, up to three months, from the microspheres prepared with salt addition in the dispersing phase; this behavior is consistent with their highly compact structure, highlighted by scanning electron microscopy analysis. The prolonged release of gentamicin is maintained even after embedding the biodegradable microspheres into a thermosetting composite gel made of chitosan and acellular bovine bone matrix (Orthoss® granules), and the microbiologic evaluation demonstrated the efficacy of the gentamicin-loaded microspheres against Escherichia coli. The collected results confirm the feasibility of scaling up the microsphere manufacturing process and the high potential of the microparticulate drug delivery system for local antibiotic delivery to bone.
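A two-level screening design of the kind applied above can be enumerated in a few lines of Python. The factor names and levels below are illustrative placeholders loosely based on the factors mentioned in the abstract; the study's actual screening design may have used different factors, levels, or a fractional layout.

```python
from itertools import product

# Hypothetical two-level coding of three screening factors (illustrative
# only): emulsion technique, salt addition, polymer concentration (%w/v).
FACTORS = {
    "technique": ["o/w", "w/o/w"],
    "salt": ["no", "yes"],
    "polymer_pct": [10, 15],
}

def full_factorial(factors):
    """Enumerate every run of a two-level full factorial screening design."""
    names = list(factors)
    return [dict(zip(names, combo)) for combo in product(*factors.values())]
```

With three two-level factors this yields 2^3 = 8 runs, each a dict mapping factor name to level.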
Mendes, Luis Filipe; Tam, Wai Long; Chai, Yoke Chin; Geris, Liesbet; Luyten, Frank P; Roberts, Scott J
2016-05-01
Successful application of cell-based strategies in cartilage and bone tissue engineering has been hampered by the lack of robust protocols to efficiently differentiate mesenchymal stem cells into the chondrogenic lineage. The development of chemically defined culture media supplemented with growth factors (GFs) has been proposed as a way to overcome this limitation. In this work, we applied a fractional design of experiment (DoE) strategy to screen the effect of multiple GFs (BMP2, BMP6, GDF5, TGF-β1, and FGF2) on chondrogenic differentiation of human periosteum-derived mesenchymal stem cells (hPDCs) in vitro. In a micromass culture (μMass) system, BMP2 had a positive effect on glycosaminoglycan deposition at day 7 (p < 0.001), which in combination with BMP6 synergistically enhanced cartilage-like tissue formation that displayed in vitro mineralization capacity at day 14 (p < 0.001). Gene expression of μMasses cultured for 7 days with a medium formulation supplemented with 100 ng/mL of BMP2 and BMP6 and a low concentration of GDF5, TGF-β1, and FGF2 showed increased expression of Sox9 (1.7-fold) and the matrix molecules aggrecan (7-fold increase) and COL2A1 (40-fold increase) compared to nonstimulated control μMasses. The DoE analysis indicated that in GF combinations, BMP2 was the strongest effector for chondrogenic differentiation of hPDCs. When transplanted ectopically in nude mice, the in vitro-differentiated μMasses showed maintenance of the cartilaginous phenotype after 4 weeks in vivo. This study indicates the power of using the DoE approach for the creation of new medium formulations for skeletal tissue engineering approaches.
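The per-factor conclusions a fractional DoE screen reports (e.g., BMP2 being the strongest effector) are typically based on main effects computed from the coded design matrix: the mean response at a factor's high level minus the mean at its low level. A generic stdlib sketch of that calculation, not code from the study:

```python
def main_effect(design, responses, factor):
    """Main effect of one coded (+1/-1) factor in a two-level design:
    mean response at the high level minus mean response at the low level."""
    hi = [y for row, y in zip(design, responses) if row[factor] == 1]
    lo = [y for row, y in zip(design, responses) if row[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)
```

For a 2x2 design with responses driven entirely by the first factor, the first factor's effect equals the response difference and the second factor's effect is zero.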
Management and assimilation of diverse, distributed watershed datasets
NASA Astrophysics Data System (ADS)
Varadharajan, C.; Faybishenko, B.; Versteeg, R.; Agarwal, D.; Hubbard, S. S.; Hendrix, V.
2016-12-01
The U.S. Department of Energy's (DOE) Watershed Function Scientific Focus Area (SFA) seeks to determine how perturbations to mountainous watersheds (e.g., floods, drought, early snowmelt) impact the downstream delivery of water, nutrients, carbon, and metals over seasonal to decadal timescales. We are building a software platform that enables integration of diverse and disparate field, laboratory, and simulation datasets, including hydrological, geological, meteorological, geophysical, geochemical, ecological and genomic datasets, across a range of spatial and temporal scales within the Rifle floodplain and the East River watershed, Colorado. We are using agile data management and assimilation approaches to enable web-based integration of these heterogeneous, multi-scale datasets. Sensor-based observations of water level, vadose zone and groundwater temperature, water quality and meteorology, as well as biogeochemical analyses of soil and groundwater samples, have been curated and archived in federated databases. Quality Assurance and Quality Control (QA/QC) are performed on priority datasets needed for ongoing scientific analyses and for hydrological and geochemical modeling. Automated QA/QC methods are used to identify and flag issues in the datasets. Data integration is achieved via a brokering service that dynamically integrates data from distributed databases via web services, based on user queries. The integrated results are presented to users in a portal that enables intuitive search, interactive visualization and download of integrated datasets. The concepts, approaches and codes being used are shared across the data science components of several large DOE-funded projects, such as the Watershed Function SFA, Next Generation Ecosystem Experiment (NGEE) Tropics, Ameriflux/FLUXNET, and Advanced Simulation Capability for Environmental Management (ASCEM), and together contribute towards DOE's cyberinfrastructure for data management and model-data integration.
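An automated QA/QC check of the kind mentioned above can be as simple as a range filter over timestamped sensor records. The function below is a hypothetical toy stand-in; field names, record layout, and thresholds are not from the SFA platform.

```python
def flag_out_of_range(records, low, high):
    """Flag sensor readings that are missing or outside a plausible
    physical range. records: iterable of (timestamp, value) pairs.
    Returns (timestamp, value, flag) tuples for the offending records."""
    flagged = []
    for ts, value in records:
        # check for missing values first so the range comparison is safe
        if value is None or not (low <= value <= high):
            flagged.append((ts, value, "RANGE_FAIL"))
    return flagged
```

In practice such flags would be written back to the database alongside the raw values rather than returned, so that downstream analyses can filter on them.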
Székely, Gy; Henriques, B; Gil, M; Ramos, A; Alvarez, C
2012-11-01
The present study reports on a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method development strategy supported by design of experiments (DoE) for the trace analysis of 4-dimethylaminopyridine (DMAP). The conventional approaches to LC-MS/MS method development usually proceed via trial and error, intentionally varying the experimental factors one at a time; this is time consuming, and interactions between experimental factors are not considered. The LC factors chosen for the DoE study include flow (F), gradient (G) and injection volume (V(inj)), while cone voltage (E(con)) and collision energy (E(col)) were chosen as MS parameters. All five factors were studied simultaneously. The method was optimized with respect to four responses: separation of peaks (Sep), peak area (A(peak)), length of the analysis (T) and the signal-to-noise ratio (S/N). A quadratic model, namely central composite face (CCF) featuring 29 runs, was used instead of a less powerful linear model, since the increase in the number of injections was insignificant. In order to determine the robustness of the method, a new set of DoE experiments was carried out: robustness around the optimal conditions was evaluated by applying a fractional factorial design of resolution III with 11 runs, wherein additional factors, such as column temperature and quadrupole resolution, were considered. The method utilizes a Phenomenex Gemini NX C-18 HPLC analytical column with electrospray ionization and a triple quadrupole mass detector in multiple reaction monitoring (MRM) mode, resulting in short analyses with a 10 min runtime. Drawbacks of derivatization, namely incomplete reaction and time-consuming sample preparation, have been avoided, and the change from SIM to MRM mode resulted in increased sensitivity and a lower LOQ.
The DoE method development strategy led to a method allowing the trace analysis of DMAP at 0.5 ng/ml absolute concentration, which corresponds to a 0.1 ppm limit of quantification in 5 mg/ml mometasone furoate glucocorticoid. The obtained method was validated in a linear range of 0.1-10 ppm and presented an RSD of 0.02% for system precision. Regarding DMAP recovery in mometasone furoate, spiked samples produced recoveries between 83% and 113% in the range of 0.1-2 ppm. Copyright © 2012 Elsevier B.V. All rights reserved.
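For reference, a central composite face-centered (CCF) design in coded units combines 2^k factorial corners, 2k face-centered axial points, and center points. The sketch below builds the full-factorial variant; the 29-run design in the study presumably used a fractional factorial core and/or replicated center points, so this illustrates the design type, not the exact matrix used.

```python
from itertools import product

def ccf_design(k):
    """Central composite face-centered design in coded units (-1, 0, +1):
    2**k factorial corners, 2*k face-centered axial points (one coordinate
    at +/-1, the rest at 0), and a single center point."""
    corners = [list(p) for p in product([-1, 1], repeat=k)]
    axial = []
    for i in range(k):
        for level in (-1, 1):
            point = [0] * k
            point[i] = level
            axial.append(point)
    center = [[0] * k]
    return corners + axial + center
```

Because the axial points sit on the faces of the cube rather than outside it, a CCF design never requires factor settings beyond the +/-1 levels already screened.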
Waste Estimates for a Future Recycling Plant in the US Based Upon AREVA Operating Experience - 13206
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foare, Genevieve; Meze, Florian; Bader, Sven
2013-07-01
Estimates of process and secondary wastes produced by a recycling plant built in the U.S., which is composed of a used nuclear fuel (UNF) reprocessing facility and a mixed oxide (MOX) fuel fabrication facility, are performed as part of a U.S. Department of Energy (DOE) sponsored study [1]. In this study, a set of common inputs, assumptions, and constraints were identified to allow for comparison of these wastes between different industrial teams. AREVA produced a model of a reprocessing facility, an associated fuel fabrication facility, and waste treatment facilities to develop the results for this study. These facilities were divided into a number of discrete functional areas for which inlet and outlet flow streams were clearly identified to allow for an accurate determination of the radionuclide balance throughout the facility and the waste streams. AREVA relied primarily on its decades of experience and feedback from its La Hague (reprocessing) and MELOX (MOX fuel fabrication) commercial operating facilities in France to support this assessment. However, to perform these estimates for a U.S. facility with different regulatory requirements and to take advantage of some technological advancements, such as in the potential treatment of off-gases, some deviations from this experience were necessary. A summary of AREVA's approach and results for the recycling of 800 metric tonnes of initial heavy metal (MTIHM) of LWR UNF per year into MOX fuel under the assumptions and constraints identified for this DOE study are presented. (authors)
Chattopadhyay, Sudip; Chaudhuri, Rajat K; Freed, Karl F
2011-04-28
The improved virtual orbital-complete active space configuration interaction (IVO-CASCI) method enables an economical and reasonably accurate treatment of static correlation in systems with significant multireference character, even when using a moderate basis set. This IVO-CASCI method supplants the computationally more demanding complete active space self-consistent field (CASSCF) method by producing comparable accuracy with diminished computational effort because the IVO-CASCI approach does not require additional iterations beyond an initial SCF calculation, nor does it encounter convergence difficulties or multiple solutions that may be found in CASSCF calculations. Our IVO-CASCI analytical gradient approach is applied to compute the equilibrium geometry for the ground and lowest excited state(s) of the theoretically very challenging 2,6-pyridyne, 1,2,3-tridehydrobenzene and 1,3,5-tridehydrobenzene anionic systems for which experiments are lacking, accurate quantum calculations are almost completely absent, and commonly used calculations based on single reference configurations fail to provide reasonable results. Hence, the computational complexity provides an excellent test for the efficacy of multireference methods. The present work clearly illustrates that the IVO-CASCI analytical gradient method provides a good description of the complicated electronic quasi-degeneracies during the geometry optimization process for the radicaloid anions. The IVO-CASCI treatment produces almost identical geometries as the CASSCF calculations (performed for this study) at a fraction of the computational labor. Adiabatic energy gaps to low lying excited states likewise emerge from the IVO-CASCI and CASSCF methods as very similar. We also provide harmonic vibrational frequencies to demonstrate the stability of the computed geometries.
Using cystoscopy to segment bladder tumors with a multivariate approach in different color spaces.
Freitas, Nuno R; Vieira, Pedro M; Lima, Estevao; Lima, Carlos S
2017-07-01
Nowadays the diagnosis of bladder lesions relies upon cystoscopy examination and depends on the interpreter's experience. State-of-the-art bladder tumor identification methods are based on 3D reconstruction, using CT images (virtual cystoscopy) or images in which the structures are enhanced with pigmentation, but none uses white-light cystoscopy images. An initial attempt to automatically identify tumoral tissue was already developed by the authors, and this paper develops that idea further. Traditional cystoscopy image processing has huge potential to improve early tumor detection and allow more effective treatment. This paper describes a multivariate approach to the segmentation of bladder cystoscopy images, which will be used to automatically detect tumors and support physician diagnosis. Each region can be assumed to follow a normal distribution with specific parameters, leading to the assumption that the distribution of intensities is a Gaussian Mixture Model (GMM). Regions of high-grade and low-grade tumors usually appear with higher intensity than normal regions. This paper proposes a Maximum a Posteriori (MAP) approach based on pixel intensities read simultaneously in different color channels from the RGB, HSV and CIELab color spaces. The Expectation-Maximization (EM) algorithm is used to estimate the best multivariate GMM parameters. Experimental results show that the proposed method segments bladder tumors into two classes more efficiently in RGB, even in cases where the tumor shape is not well defined. Results also show that eliminating the L component from the CIELab color space does not allow definition of the tumor shape.
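The EM-plus-MAP machinery can be illustrated in one dimension with two Gaussian components. This is a simplified stdlib sketch; the paper works multivariately across RGB/HSV/CIELab channels, so none of this is the authors' implementation.

```python
import math

def gauss_pdf(x, mu, var):
    """Density of a 1-D Gaussian at x."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def em_gmm_1d(data, mu, var, w, iters=50):
    """EM for a two-component 1-D Gaussian mixture. mu, var, w are
    length-2 lists of initial means, variances, and mixture weights;
    they are updated in place and returned."""
    for _ in range(iters):
        # E-step: posterior responsibility of each component for each point
        resp = []
        for x in data:
            p = [w[k] * gauss_pdf(x, mu[k], var[k]) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate parameters from the responsibilities
        for k in range(2):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
            w[k] = nk / len(data)
    return mu, var, w

def map_label(x, mu, var, w):
    """MAP class assignment (0 or 1) for one intensity value."""
    post = [w[k] * gauss_pdf(x, mu[k], var[k]) for k in range(2)]
    return 0 if post[0] >= post[1] else 1
```

With intensities clustered around two levels, EM recovers the component means and `map_label` then assigns each pixel intensity to the nearest (posterior-weighted) class.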
Is multiple-sequence alignment required for accurate inference of phylogeny?
Höhl, Michael; Ragan, Mark A
2007-04-01
The process of inferring phylogenetic trees from molecular sequences almost always starts with a multiple alignment of these sequences but can also be based on methods that do not involve multiple sequence alignment. Very little is known about the accuracy with which such alignment-free methods recover the correct phylogeny or about the potential for increasing their accuracy. We conducted a large-scale comparison of ten alignment-free methods, among them one new approach that does not calculate distances and a faster variant of our pattern-based approach; all distance-based alignment-free methods are freely available from http://www.bioinformatics.org.au (as Python package decaf+py). We show that most methods exhibit a higher overall reconstruction accuracy in the presence of high among-site rate variation. Under all conditions that we considered, variants of the pattern-based approach were significantly better than the other alignment-free methods. The new pattern-based variant achieved a speed-up of an order of magnitude in the distance calculation step, accompanied by a small loss of tree reconstruction accuracy. A method of Bayesian inference from k-mers did not improve on classical alignment-free (and distance-based) methods but may still offer other advantages due to its Bayesian nature. We found the optimal word length k of word-based methods to be stable across various data sets, and we provide parameter ranges for two different alphabets. The influence of these alphabets was analyzed to reveal a trade-off in reconstruction accuracy between long and short branches. We have mapped the phylogenetic accuracy for many alignment-free methods, among them several recently introduced ones, and increased our understanding of their behavior in response to biologically important parameters. In all experiments, the pattern-based approach emerged as superior, at the expense of higher resource consumption. 
Nonetheless, no alignment-free method that we examined recovers the correct phylogeny as accurately as does an approach based on maximum-likelihood distance estimates of multiply aligned sequences.
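As a concrete example of a word-based (k-mer) alignment-free comparison, a Jaccard distance over shared k-mers can be computed as follows. This is one simple scheme for illustration, not the pattern-based method evaluated in the study.

```python
def kmer_set(seq, k):
    """All length-k substrings (words) of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def jaccard_distance(a, b, k=4):
    """Alignment-free distance between two sequences: one minus the
    Jaccard similarity of their k-mer sets. 0 means identical k-mer
    content; 1 means no shared k-mers."""
    sa, sb = kmer_set(a, k), kmer_set(b, k)
    return 1.0 - len(sa & sb) / len(sa | sb)
```

Such distances can then feed a standard distance-based tree-building method (e.g., neighbor joining) without any multiple alignment step.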
New Developments in the Technology Readiness Assessment Process in US DOE-EM - 13247
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krahn, Steven; Sutter, Herbert; Johnson, Hoyt
2013-07-01
A Technology Readiness Assessment (TRA) is a systematic, metric-based process and accompanying report that evaluates the maturity of the technologies used in systems; it is designed to measure technology maturity using the Technology Readiness Level (TRL) scale pioneered by the National Aeronautics and Space Administration (NASA) in the 1980s. More recently, the DoD has adopted and provided systematic guidance for performing TRAs and determining TRLs. In 2007 the GAO recommended that the DOE adopt the NASA/DoD methodology for evaluating technology maturity. Earlier, in 2006-2007, DOE-EM had conducted pilot TRAs on a number of projects at Hanford and Savannah River. In March 2008, DOE-EM issued a process guide, which established TRAs as an integral part of DOE-EM's Project Management Critical Decision Process. Since the development of its detailed TRA guidance in 2008, DOE-EM has continued to accumulate experience in the conduct of TRAs and the process for evaluating technology maturity, and DOE has developed TRA guidance applicable department-wide. DOE-EM's experience with the TRA process, the evaluations that led to recently proposed revisions to the DOE-EM TRA/TMP Guide, and the content of the proposed changes that incorporate the resulting lessons learned and insights are described. (authors)
Arumugam, Abiramasundari; Joshi, Amita; Vasu, Kamala K
2017-11-01
The present work focused on the application of design of experiments (DoE) principles to the development and optimization of a stability-indicating method (SIM) for the drug imidapril hydrochloride and its degradation products (DPs). The resolution of peaks for the DPs and their drug in a SIM can be influenced by many factors. The factors studied here were pH, gradient time, organic modifier, flow rate, molar concentration of the buffer, and wavelength, with the aid of a Plackett-Burman design. Results from the Plackett-Burman study clearly showed the influence of two factors, pH and gradient time, on the analyzed responses, particularly the resolution of the closely eluting DPs (DP-5 and DP-6) and the retention time of the last peak. Optimization of the multiresponse processes was achieved through Derringer's desirability function with the assistance of a full factorial design. Separation was achieved using a C18 Phenomenex Luna column (250 × 4.6 mm id, 5 µm particle size) at a flow rate of 0.8 mL/min at 210 nm. The optimized mobile phase composition was ammonium acetate buffer (pH 5) in pump A and acetonitrile-methanol (in equal ratio) in pump B, with a run time of 40 min using a gradient method.
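A Plackett-Burman screening matrix of the kind used above is easy to construct: the classic 12-run design for up to 11 two-level factors (Plackett & Burman, 1946) comes from cyclic shifts of a single generator row plus a final row of low levels. The six factors screened in the study fit into such a matrix with spare columns; the sketch below builds the generic design, not the study's specific run order.

```python
# First row of the N=12 Plackett-Burman design (Plackett & Burman, 1946).
GEN = [1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1]

def plackett_burman_12():
    """Build the 12-run Plackett-Burman screening matrix (11 columns):
    11 cyclic right-shifts of the generator row, plus a row of all -1s.
    Columns are balanced (six +1s, six -1s) and mutually orthogonal."""
    rows = [GEN[-i:] + GEN[:-i] for i in range(11)]
    rows.append([-1] * 11)
    return rows
```

Orthogonality of the columns is what lets a 12-run screen estimate up to 11 main effects independently (assuming interactions are negligible).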
Effects of Background Music on Objective and Subjective Performance Measures in an Auditory BCI.
Zhou, Sijie; Allison, Brendan Z; Kübler, Andrea; Cichocki, Andrzej; Wang, Xingyu; Jin, Jing
2016-01-01
Several studies have explored brain-computer interface (BCI) systems based on auditory stimuli, which could help patients with visual impairments. Usability and user satisfaction are important considerations in any BCI. Although background music can influence emotion and performance in other task environments, and many users may wish to listen to music while using a BCI, auditory and other BCIs are typically studied without background music. Some work has explored the possibility of using polyphonic music in auditory BCI systems. However, this approach requires users with good musical skills, and has not been explored in online experiments. Our hypothesis was that an auditory BCI with background music would be preferred by subjects over a similar BCI without background music, with no difference in BCI performance. We introduce a simple paradigm (which does not require musical skill) using percussion instrument sound stimuli and background music, and evaluated it in both offline and online experiments. The results showed that subjects preferred the auditory BCI with background music. Different performance measures did not reveal any significant effect when comparing background music vs. no background music. Since the addition of background music does not impair BCI performance but is preferred by users, auditory (and perhaps other) BCIs should consider including it. Our study also indicates that auditory BCIs can be effective even if the auditory channel is simultaneously otherwise engaged.
Heart transplantation experiences: a phenomenological approach.
Sadala, Maria Lúcia Araújo; Stolf, Noedir Antônio Groppo
2008-04-01
The aim of this study was to understand the heart transplantation experience based on patients' descriptions. To patients with heart failure, heart transplantation represents a possibility to survive and improve their quality of life. Studies have shown that better quality of life is related to patients' increasing awareness of, and participation in, the work of the healthcare team in the post-transplantation period. Deficient relationships between patients and healthcare providers result in lower compliance with the postoperative regimen. A phenomenological approach was used to interview 26 patients who were heart transplant recipients. Patients were interviewed individually and asked this single question: What does the experience of being heart transplanted mean? Participants' descriptions were analysed using phenomenological reduction, analysis and interpretation. Three categories emerged from data analysis: (i) the time lived by the heart recipient; (ii) donors, family and caregivers and (iii) reflections on the experience lived. Living after heart transplant means living in a complex situation: recipients are confronted with lifelong immunosuppressive therapy associated with many side-effects. Some felt healthy whereas others reported persistence of complications as well as the onset of other pathologies. However, all participants celebrated an improvement in quality of life. Health caregivers and their social and family support had been essential for their struggle. Participants realised that life after heart transplantation was a continuing process demanding support and structured follow-up for the rest of their lives. The findings suggest that each individual has unique experiences of the heart transplantation process. To go on living, participants had to accept changes and adapt: to the new organ, to complications resulting from organ rejection, and to numerous medications and dietary restrictions.
Stimulating heart transplant patients' spontaneous expression of what they are experiencing, and granting them the status of main character in their own story, is important to their care.
Team approach to manage difficult-to-treat TB cases: Experiences in Europe and beyond.
D'Ambrosio, L; Bothamley, G; Caminero Luna, J A; Duarte, R; Guglielmetti, L; Muñoz Torrico, M; Payen, M C; Saavedra Herrera, N; Salazar Lezama, M A; Skrahina, A; Tadolini, M; Tiberi, S; Veziris, N; Migliori, G B
As recommended by the World Health Organization (WHO), optimal management of MDR-TB cases can be ensured by a multi-speciality consultation body known as the 'TB Consilium'. This body usually includes different medical specialities, competences and perspectives (e.g., clinical expertise for both adults and children; surgical, radiological and public health expertise; psychological background and nursing experience, among others), thus lowering the risk of mistakes or inappropriate patient management and improving clinical outcomes. At present, several high MDR-TB burden countries in the different WHO regions (and beyond) have introduced TB Consilium-like bodies at the national or subnational level to reach consensus on the best treatment approach for their patients affected by TB. In addition, in countries/settings where a formal system of consultation does not exist, specialized staff from MDR-TB reference centres or international organizations usually spend a considerable amount of their working time responding to phone or e-mail clinical queries on how to manage M/XDR-TB cases. The aim of this manuscript is to describe the different experiences with the TB Consilia both at the international level (European Respiratory Society - ERS/WHO TB Consilium) and in some of the countries where this experience operates successfully, in Europe and beyond. The Consilium experiences are described around the following topics: (1) history, aims and focus; (2) management and funding; (3) technical functioning and structure; (4) results achieved. In addition, a comparative analysis of the TB Consilia in the different countries has been performed. Copyright © 2017 Sociedade Portuguesa de Pneumologia. Published by Elsevier España, S.L.U. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manoli, Gabriele, E-mail: manoli@dmsa.unipd.it; Nicholas School of the Environment, Duke University, Durham, NC 27708; Rossi, Matteo
The modeling of unsaturated groundwater flow is affected by a high degree of uncertainty related to both measurement and model errors. Geophysical methods such as Electrical Resistivity Tomography (ERT) can provide useful indirect information on the hydrological processes occurring in the vadose zone. In this paper, we propose and test an iterated particle filter method to solve the coupled hydrogeophysical inverse problem. We focus on an infiltration test monitored by time-lapse ERT and modeled using Richards equation. The goal is to identify hydrological model parameters from ERT electrical potential measurements. Traditional uncoupled inversion relies on the solution of two sequential inverse problems, the first one applied to the ERT measurements, the second one to Richards equation. This approach does not ensure an accurate quantitative description of the physical state, typically violating mass balance. To avoid one of these two inversions and incorporate in the process more physical simulation constraints, we cast the problem within the framework of a SIR (Sequential Importance Resampling) data assimilation approach that uses a Richards equation solver to model the hydrological dynamics and a forward ERT simulator combined with Archie's law to serve as measurement model. ERT observations are then used to update the state of the system as well as to estimate the model parameters and their posterior distribution. The limitations of the traditional sequential Bayesian approach are investigated and an innovative iterative approach is proposed to estimate the model parameters with high accuracy. The numerical properties of the developed algorithm are verified on both homogeneous and heterogeneous synthetic test cases based on a real-world field experiment.
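The SIR cycle (predict, weight by the observation likelihood, resample) can be sketched on a toy 1-D problem. Here a random-walk state stands in for the Richards-equation solver and a direct noisy observation stands in for the ERT forward model, so everything below is illustrative rather than the paper's algorithm.

```python
import math
import random

random.seed(0)  # fixed seed so the illustration is reproducible

def sir_filter(observations, n_particles=500, obs_std=0.5, proc_std=0.1):
    """Minimal SIR (Sequential Importance Resampling) filter for a 1-D
    random-walk state observed with additive Gaussian noise. Returns the
    posterior mean after assimilating all observations."""
    # initial ensemble drawn from a vague uniform prior
    particles = [random.uniform(-10.0, 10.0) for _ in range(n_particles)]
    for y in observations:
        # predict: push each particle through the (toy) dynamics
        particles = [p + random.gauss(0.0, proc_std) for p in particles]
        # update: weight each particle by the Gaussian observation likelihood
        weights = [math.exp(-(y - p) ** 2 / (2 * obs_std ** 2)) for p in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # resample: draw a new equally-weighted ensemble
        particles = random.choices(particles, weights=weights, k=n_particles)
    return sum(particles) / len(particles)
```

In the coupled hydrogeophysical setting, the "predict" step would run the hydrological model and the likelihood would compare simulated against measured ERT potentials; the iterated variant proposed in the paper repeats this cycle to sharpen the parameter estimates.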
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hopper, Calvin Mitchell
In May 1973 the University of New Mexico conducted the first nationwide criticality safety training and education week-long short course for nuclear criticality safety engineers. Subsequent to that course, the Los Alamos Critical Experiments Facility (LACEF) developed very successful 'hands-on' subcritical and critical training programs for operators, supervisors, and engineering staff. Since the inception of the US Department of Energy (DOE) Nuclear Criticality Technology and Safety Project (NCT&SP) in 1983, the DOE has stimulated contractor facilities and laboratories to collaborate in furthering nuclear criticality as a discipline. That effort included the education and training of nuclear criticality safety engineers (NCSEs). In 1985 a textbook was written that established a path toward formalizing education and training for NCSEs. Though the NCT&SP went through a brief hiatus from 1990 to 1992, other DOE-supported programs were evolving to the benefit of NCSE training and education. In 1993 the DOE established a Nuclear Criticality Safety Program (NCSP) and undertook a comprehensive development effort to expand the extant LACEF 'hands-on' course specifically for the education and training of NCSEs. That successful education and training was interrupted in 2006 by the closing of the LACEF and the accompanying movement of materials and critical experiment machines to the Nevada Test Site. Prior to that closing, the Lawrence Livermore National Laboratory (LLNL) was commissioned by the US DOE NCSP to establish an independent hands-on NCSE subcritical education and training course. The course provided an interim transition to the establishment of a reinvigorated and expanded two-week NCSE education and training program in 2011.
The 2011 piloted two-week course was coordinated by the Oak Ridge National Laboratory (ORNL) and jointly conducted by the Los Alamos National Laboratory (LANL) (classroom education and facility training), the Sandia National Laboratories (SNL) (hands-on criticality experiments training), and the US DOE National Criticality Experiment Research Center (NCERC) (hands-on criticality experiments training), which is jointly supported by LLNL and LANL and located at the Nevada National Security Site (NNSS). This paper provides a description of the bases, content, and conduct of the piloted and future US DOE NCSP Criticality Safety Engineer Training and Education Project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolverton, Christopher; Ozolins, Vidvuds; Kung, Harold H.
The objective of the proposed program is to discover novel mixed hydrides for hydrogen storage which enable the DOE 2010 system-level goals. Our goal is to find a material that desorbs 8.5 wt.% H2 or more at temperatures below 85°C. The research program will combine first-principles calculations of reaction thermodynamics and kinetics with material and catalyst synthesis, testing, and characterization. We will combine materials from distinct categories (e.g., chemical and complex hydrides) to form novel multicomponent reactions. Systems to be studied include mixtures of complex hydrides and chemical hydrides [e.g., LiNH2 + NH3BH3] and nitrogen-hydrogen-based borohydrides [e.g., Al(BH4)3(NH3)3]. The 2010 and 2015 FreedomCAR/DOE targets for hydrogen storage systems are very challenging and cannot be met with existing materials. The vast majority of the work to date has delineated materials into various classes, e.g., complex and metal hydrides, chemical hydrides, and sorbents. However, very recent studies indicate that mixtures of storage materials, particularly mixtures between various classes, hold promise to achieve technological attributes that materials within an individual class cannot reach. Our project involves a systematic, rational approach to designing novel multicomponent mixtures of materials with fast hydrogenation/dehydrogenation kinetics and favorable thermodynamics, using a combination of state-of-the-art scientific computing and experimentation. We will use the accurate predictive power of first-principles modeling to understand the thermodynamic and microscopic kinetic processes involved in hydrogen release and uptake and to design new material/catalyst systems with improved properties. Detailed characterization and atomic-scale catalysis experiments will elucidate the effect of dopants and nanoscale catalysts in achieving fast kinetics and reversibility.
And, state-of-the-art storage experiments will give key storage attributes of the investigated reactions, validate computational predictions, and help guide and improve computational methods. In sum, our approach involves a powerful blend of: 1) H2 storage measurements and characterization, 2) state-of-the-art computational modeling, 3) detailed catalysis experiments, 4) an in-depth automotive perspective.
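The 8.5 wt.% target above is a gravimetric capacity: the mass of releasable hydrogen divided by the total mass of the hydrogenated material. A one-line helper makes the arithmetic explicit (generic bookkeeping, not tied to any specific hydride in the project):

```python
H_ATOMIC_MASS = 1.008  # g/mol

def h2_wt_percent(n_h_released, formula_mass):
    """Gravimetric hydrogen capacity in wt.%: mass of the hydrogen atoms
    released per formula unit, over the formula mass (g/mol) of the
    hydrogenated material."""
    return 100.0 * n_h_released * H_ATOMIC_MASS / formula_mass
```

For instance, a hypothetical material of formula mass 100.8 g/mol releasing 10 H atoms per formula unit would hold 10 wt.% hydrogen, comfortably above the stated 8.5 wt.% goal.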
The world, entanglement, and God: Quantum theory and the Christian doctrine of creation
NASA Astrophysics Data System (ADS)
Wegter-McNelly, Kirk Matthew
The adequacy of classical physics' mechanistic worldview is called into question by an "entanglement" interpretation of quantum nonlocal correlations, which suggests a relational holistic account of physical processes. Albert Einstein rejected the possibility of such behavior, but recent experiments confirm its existence in the world. The concept of entanglement provides an especially fruitful locus for appropriating quantum insights into theological reflection because it bridges two otherwise antithetical interpretations of the theory, the indeterministic "Copenhagen" version developed by Niels Bohr and the deterministic version later discovered by David Bohm. Entanglement also offers an opportunity to explore what Robert Russell has called the method of "mutual interaction," by which theology can play a legitimate heuristic role in scientific research programs even as it responds to scientific discoveries. The concept of entanglement offers rich possibilities for developing a theological program within which to situate an ecological, trinitarian understanding of creation. In particular, a theological appropriation of entanglement can strengthen an ecological approach such as that of Sallie McFague, who argues powerfully for the importance of naturalistic metaphors in crafting a cosmic vision of wholeness but whose use of "organic" metaphors does not entirely eliminate the specter of mechanism. Entanglement can also strengthen a trinitarian approach such as one finds in Wolfhart Pannenberg, whose relational understanding of creation remains mechanistic insofar as it depends primarily on classical rather than quantum field theory. According to the theological approach developed in this dissertation, a trinitarian relational God creates a universe that is entangled with itself and, as a result of the incarnation, also with God. 
Additionally, this theological perspective leads to the scientific prediction that no complete solution to the quantum measurement problem beyond "decoherence" will be forthcoming. Decoherence accounts for the emergence of real separation at the macroscopic level in a world that remains holistically interconnected at the quantum level, and it does so in a manner that is consonant with an ecological, trinitarian perspective. Three appendices provide: a derivation and discussion of John Bell's inequality, a summary of several key entanglement experiments, and a general time line of related scientific developments.
NASA Astrophysics Data System (ADS)
Flowers, Alice Blood
The lack of personal connection to the natural world among most American youth underscores the need to assess the effectiveness of conservation education programs. Place-based learning is important in helping youth understand how their personal and societal well-being are linked to and dependent upon their local habitats. Across Montana, 2,277 students in grades 3--10 participate with their teachers in an interactive year-long fishing education program called Hooked on Fishing (HOF). The purpose of my study was to assess the effectiveness of HOF, a place-based conservation education program established in 1996 and modeled after the national Hooked on Fishing, Not on Drugs program. Using a quasi-experimental nonequivalent group study design, students received a pre-survey at the beginning of the program, a post-survey after the program, and an extended post-survey 12 to 14 weeks later. Teachers voluntarily participated in an Internet survey during May 2006, and program instructors voluntarily participated in a structured open-ended telephone interview in June 2006. A key component of my study was the decision to conduct the evaluation using an approach that included stakeholders in the development of the instruments to measure student outcomes. This approach, called utilization-focused evaluation, was developed by Michael Q. Patton. Its motive is to promote the usability of the evaluation results, which are then more likely to be applied by program stakeholders not only to gauge program effectiveness but also to improve the program. The two research questions were: (1) does the frequency of outdoor experiences have significant effects on students' knowledge, skills, attitudes, and intended stewardship behaviors; and (2) does improved knowledge of local natural resources have significant effects on students' skills, attitudes, and intended stewardship behavior.
Nonparametric statistical analyses showed statistically significant results for most knowledge and skill outcomes, with a positive direction of change after 2--3 HOF outdoor experiences. Attitudinal and intended behavior outcomes did not show similar results. The Internet teacher survey and instructor interviews provided qualitative depth and insight into students' self-reported responses.
Bauer, Julian
2015-03-01
Is it Possible to Experiment with Thought? Ernst Mach's Notion of Thought Experiment and its Pedagogical Context around 1900. The article seeks to establish the crucial importance of the pedagogical dimension of Ernst Mach's ideas on experimenting with thought. The focus on contemporary pedagogics demonstrates, first, that Mach's didactic approach to physics is part of a much broader stream of pedagogical writings that transcends national and disciplinary borders and comprises a diversity of authors, e.g. Wilhelm Jerusalem, William James, and Alfred N. Whitehead; second, that the much-heralded controversy between Mach and the French philosopher of science Pierre Duhem about thought experiments does not revolve only around epistemological issues but rather stems from their antagonistic visions of teaching physics; and finally, third, that G. Stanley Hall's psychogenetic theory of pedagogics bears a strong resemblance to the evolutionary naturalism of Machian epistemology and helps explain key tenets of Mach's conception of the thought experiment. By establishing a broad convergence between the work of all these authors despite their different academic upbringing, background, and nationality, the article argues for a complex and historically fine-grained vision of the relations between the natural, social, and human sciences, going beyond dichotomies like 'Erklären' and 'Verstehen' or the 'Two Cultures'. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Gopinath, T.; Veglia, Gianluigi
2016-06-01
Conventional multidimensional magic angle spinning (MAS) solid-state NMR (ssNMR) experiments detect the signal arising from the decay of a single coherence transfer pathway (FID), resulting in one spectrum per acquisition time. Recently, we introduced two new strategies, namely DUMAS (DUal acquisition Magic Angle Spinning) and MEIOSIS (Multiple ExperIments via Orphan SpIn operatorS), that enable the simultaneous acquisition of multidimensional ssNMR experiments using multiple coherence transfer pathways. Here, we combined the main elements of DUMAS and MEIOSIS to harness both orphan spin operators and residual polarization and increase the number of simultaneous acquisitions. We show that it is possible to acquire up to eight two-dimensional experiments using four acquisition periods per scan. This new suite of pulse sequences, called MAeSTOSO for Multiple Acquisitions via Sequential Transfer of Orphan Spin pOlarization, relies on residual polarization of both 13C and 15N pathways and combines low- and high-sensitivity experiments into a single pulse sequence using one receiver and commercial ssNMR probes. The acquisition of multiple experiments does not affect the sensitivity of the main experiment; rather, it recovers coherences that would otherwise be discarded, resulting in a significant gain in experimental time. Both the merits and limitations of this approach are discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nimbalkar, Sachin U.; Brockway, Walter F.; Lung, Bruce
The primary objective of the Department of Energy’s (DOE) Energy Treasure Hunt In-Plant Training (INPLT) is to train Better Plants partner employees to lead and conduct future energy efficiency Treasure Hunts within their facilities without DOE assistance. By taking a “learning-by-doing” approach, this INPLT, like other DOE INPLT trainings, has the added benefit of uncovering real energy and cost-saving opportunities. This INPLT leverages DOE and Better Plants technical staff, resources, and tools and the EPA “Energy Treasure Hunt Guide: Simple Steps to Finding Energy Savings” process. While Treasure Hunts are a relatively well-known approach to identifying energy savings in manufacturing plants, DOE is adding several additional elements in its Treasure Hunt Exchanges. The first element is technical assistance and methodology. DOE provides high-quality technical resources, such as energy efficiency calculators, fact sheets, and source books, to facilitate the Treasure Hunt process and teaches three fundamentals: 1) how to profile equipment, 2) how to collect data, and 3) data and ROI calculation methodologies. Another element is the “train the trainer” approach, wherein the training facilitator trains at least one partner employee to facilitate future Treasure Hunts. Another element is that DOE provides energy diagnostic equipment and teaches the participants how to use it. Finally, DOE also offers partners the opportunity to exchange teams of employees, either within a partner’s enterprise or with other partners, to conduct the Treasure Hunt in each other’s facilities. This exchange of teams is important because each team can bring different insights and uncover energy-saving opportunities that would otherwise be missed. This paper discusses the DOE methodology and the early results and lessons learned from DOE’s Energy Treasure Hunt In-Plant Trainings at Better Plants partner facilities.
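The "data and ROI calculation" fundamental taught in these trainings can be illustrated with a simple-payback computation, a common first-pass metric for Treasure Hunt findings. The figures below are hypothetical examples, not results from the DOE program:

```python
def simple_payback(kw_saved, hours_per_year, rate_per_kwh, install_cost):
    """Simple payback period (years) for an efficiency measure:
    annual savings = kW saved x annual operating hours x electricity rate."""
    annual_savings = kw_saved * hours_per_year * rate_per_kwh
    return install_cost / annual_savings

# Hypothetical finding: a 15 kW reduction on equipment running 6000 h/yr
# at $0.10/kWh, with a $12,000 installed cost.
years = simple_payback(15, 6000, 0.10, 12000)  # $9,000/yr savings
```

Real Treasure Hunt calculators also account for demand charges, fuel mixes, and maintenance effects; this sketch shows only the core arithmetic.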
Development of Optimal Stressor Scenarios for New Operational Energy Systems
2017-12-01
Analyzing the previous model using a design of experiments (DOE) and regression analysis provides critical information about the associated operational...from experimentation. The resulting system requirements can be used to revisit the design requirements and develop a more robust system. This process...stressor scenarios for acceptance testing.
2017-09-14
averaging the gage measurements, many specimens were not meeting the ASTM D3039 standard tolerance limitations when compared to the designed 3 mm and 15 mm...MarkOne) 3D printer. A design of experiments (DOE) was performed to develop a mathematical model describing the functional relationship between the...
Does Ice Dissolve or Does Halite Melt? A Low-Temperature Liquidus Experiment for Petrology Classes.
ERIC Educational Resources Information Center
Brady, John B.
1992-01-01
Measurement of the compositions and temperatures of H2O-NaCl brines in equilibrium with ice can be used as an easy in-class experimental determination of a liquidus. This experiment emphasizes the symmetry of the behavior of brines with regard to the minerals ice and halite and helps to free students from the conceptual tethers of one-component…
Video-assisted thoracoscopic anatomic lung resections in Germany-a nationwide survey.
Reichert, Martin; Gohlke, Andrea Birgitta; Augustin, Florian; Öfner, Dietmar; Hecker, Andreas; Padberg, Winfried; Bodner, Johannes
2016-09-01
Video-assisted thoracoscopic surgery (VATS) is an accepted alternative to thoracotomy for anatomic lung resection (AR), and the literature suggests benefits over the conventional open approach. However, its routine clinical application is still low and varies between countries. A nationwide survey among thoracic surgical units in Germany evaluated departmental structure, volume of the VATS program, experience with VATS-AR (lobectomies and other-than-lobectomy anatomic resections), surgical technique, and learning curve data. The response rate among the 269 surgical units practicing thoracic surgery in Germany was 84.4 % (n = 227). One hundred twenty-two (53.7 %) units have experience with some type of VATS-AR. The majority of units started their VATS program only within the last 5 years, and 17.2 % (n = 21) of the units have performed more than 100 procedures to date. In 2013, 78.7 % of the units performed less than 25 % of their institutional AR via a VATS approach. Indications for VATS-AR were non-small cell lung cancer in 93.4 % (up to UICC stage IA, IB, IIA, IIB, and IIIA in 7 %, 22.8 %, 33.3 %, 17.5 %, and 7 %, respectively), benign diseases in 57.4 %, and pulmonary metastases in 50.8 %. 43.4 % of the departments had experience with extended VATS-AR, and 28.7 % performed VATS-AR after induction therapy. Every second thoracic surgical unit in Germany has experience with VATS-AR, though only about 20 % of them perform it routinely and also in extended procedures.
NASA Astrophysics Data System (ADS)
Alam, Muhammad
2014-03-01
The discovery of dye-sensitized and bulk heterojunction (BHJ) solar cells in the early 1990s introduced a new class of PV technology that relies on (i) distributed photogeneration of excitons, (ii) dissociation of excitons into free carriers at the heterojunction between two organic semiconductors (OSC), and (iii) collection of free carriers through electron and hole transport layers. The success of the approach is undisputed: the highest-efficiency OPV cells have all relied on variants of the BHJ approach. Yet three concerns related to the use of a pair of OSCs, namely low Voc, process sensitivity, and reliability, suggest that the technology may never achieve efficiency-variability-reliability metrics comparable to inorganic solar cells. This encourages a reconsideration of the prospects of single-semiconductor OPV (SS-OPV), a system presumably doomed by the exciton bottleneck. In this talk, we use an inverted SS-OPV to demonstrate how the historical SS-OPV experiments may have been misinterpreted. No one disputes the signature of excitons in polymers under narrowband excitation, but our experiments show that exciton dissociation need not be a bottleneck for OPV under broadband solar illumination. We demonstrate that an alternate collection-limited theory consistently interprets the classical and new experiments, and resolves puzzles such as efficiency loss with increasing light intensity and voltage-dependent reverse photocurrent. The theory and experiments suggest a new ``perovskite-like'' strategy for the efficiency-variability-reliability of organic solar cells. The work was supported by the Columbia DOE-EFRC (DE-SC0001085) and NSF-NCN (EEC-0228390).
Giustino, Gabriella
2009-10-01
In this paper the author discusses a specific type of dream encountered in her clinical experience, which in her view provides an opportunity to reconstruct the traumatic emotional events of the patient's past. In 1900, Freud described a category of dreams--which he called 'biographical dreams'--that reflect historical infantile experience without the typical defensive function. Many authors agree that some traumatic dreams perform a function of recovery and working through. Bion contributed to the amplification of dream theory by linking it to the theory of thought and by emphasizing the element of communication in dreams as well as their defensive aspect. The central hypothesis of this paper is that the predominant aspect of such dreams is the communication of an experience which the dreamer has in the dream but does not understand. It is often possible to reconstruct, and to help the patient to comprehend and make sense of, the emotional truth of the patient's internal world, which stems from past emotional experience with primary objects. The author includes some clinical examples and references to various psychoanalytic and neuroscientific conceptions of trauma and memory. She discusses a particular clinical approach to such dreams and how the analyst should listen to them.
Time does not cause forgetting in short-term serial recall.
Lewandowsky, Stephan; Duncan, Matthew; Brown, Gordon D A
2004-10-01
Time-based theories expect memory performance to decline as the delay between study and recall of an item increases. The assumption of time-based forgetting, central to many models of serial recall, underpins their key behaviors. Here we compare the predictions of time-based and event-based models by simulation and test them in two experiments using a novel manipulation of the delay between study and retrieval. Participants were trained, via corrective feedback, to recall at different speeds, thus varying total recall time from 6 to 10 sec. In the first experiment, participants used the keyboard to enter their responses but had to repeat a word (called the suppressor) aloud during recall to prevent rehearsal. In the second experiment, articulation was again required, but recall was verbal and was paced by the number of repetitions of the suppressor in between retrieval of items. In both experiments, serial position curves for all retrieval speeds overlapped, and output time had little or no effect. Comparative evaluation of a time-based and an event-based model confirmed that these results present a particular challenge to time-based approaches. We conclude that output interference, rather than output time, is critical in serial recall.
NASA Astrophysics Data System (ADS)
Martell, Sandra Toro
The Good Field Trip is a study that uses an ethnographic approach to answer the question of what learning looks like during a field trip to a museum. The study uses the Contextual Model of Learning (Falk & Dierking, 2000) to investigate elementary students' personal, physical, and sociocultural contexts of learning as well as how time affects students' thoughts and feelings about the experience. The author accompanied a group of eight students on a three and a half day camp-like experience to a museum that promotes environmental stewardship and the integration of art, science, and technology use and learning. The author videotaped the students' conversations and experiences and interviewed students before, during, and after the trip. Analyses of the videotapes were supplemented with student documents, including comic books, journal notes, and reflective essays about the trip. Findings include that not all experiences are marked as science, art, and technology; technology use does not occur; art is presented in a more formalized manner than science, which is composed of observation and the acquisition of knowledge about plants and animals; and conversations and activities resemble traditional modes of learning in school settings.
Gravity and decoherence: the double slit experiment revisited
NASA Astrophysics Data System (ADS)
Samuel, Joseph
2018-02-01
The double slit experiment is iconic and widely used in classrooms to demonstrate the fundamental mystery of quantum physics. The puzzling feature is that the probability of an electron arriving at the detector when both slits are open is not the sum of the probabilities when the slits are open separately. The superposition principle of quantum mechanics tells us to add amplitudes rather than probabilities and this results in interference. This experiment defies our classical intuition that the probabilities of exclusive events add. In understanding the emergence of the classical world from the quantum one, there have been suggestions by Feynman, Diosi and Penrose that gravity is responsible for suppressing interference. This idea has been pursued in many different forms ever since, predominantly within Newtonian approaches to gravity. In this paper, we propose and theoretically analyse two ‘gedanken’ or thought experiments which lend strong support to the idea that gravity is responsible for decoherence. The first makes the point that thermal radiation can suppress interference. The second shows that in an accelerating frame, Unruh radiation does the same. Invoking the Einstein equivalence principle to relate acceleration to gravity, we support the view that gravity is responsible for decoherence.
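The superposition rule described in the abstract (add amplitudes, not probabilities) can be made concrete with two complex path amplitudes; the specific values below are arbitrary illustrations, not taken from the paper:

```python
import cmath

# Two illustrative path amplitudes, with a relative phase of pi between them
a1 = 0.6 + 0.0j
a2 = 0.6 * cmath.exp(1j * cmath.pi)   # path 2, phase-shifted by pi

p_classical = abs(a1) ** 2 + abs(a2) ** 2  # classical intuition: probabilities add
p_quantum = abs(a1 + a2) ** 2              # quantum rule: amplitudes add first
interference = p_quantum - p_classical     # cross term 2*Re(a1 * conj(a2))
```

With this phase the two paths cancel completely (destructive interference): the classical sum is 0.72 while the quantum probability is essentially zero. Decoherence, as discussed in the paper, is the suppression of exactly this cross term.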
Oishi, Yoshitaka
2013-06-01
Trail settings in national parks are essential management tools for improving both ecological conservation efforts and the quality of visitor experiences. This study proposes a plan for the appropriate maintenance of trails in Chubusangaku National Park, Japan, based on the recreation opportunity spectrum (ROS) approach. First, we distributed 452 questionnaires to determine park visitors' preferences for trail settings (response rate = 68 %). Respondents' preferences were then evaluated according to the following seven parameters: access, remoteness, naturalness, facilities and site management, social encounters, visitor impact, and visitor management. Using nonmetric multidimensional scaling and cluster analysis, the visitors were classified into seven groups. Last, we classified the actual trails according to the visitor questionnaire criteria to examine the discrepancy between visitors' preferences and actual trail settings. The actual trail classification indicated that while most developed trails were located in accessible places, primitive trails were located in remote areas. Interestingly, however, two visitor groups seemed to prefer a well-conserved natural environment and, simultaneously, easily accessible trails. This finding does not correspond to a premise of the ROS approach, which supposes that primitive trails should be located in remote areas without ready access. Based on this study's results, we propose that creating trails that afford visitors the opportunity to experience a well-conserved natural environment in accessible areas is a useful means of providing visitors with diverse recreation opportunities. The process of data collection and analysis in this study can serve as one approach to producing ROS maps that provide visitors with recreational opportunities of greater diversity and higher quality.
NASA Astrophysics Data System (ADS)
Ruiz Ruiz, Juan; Guttenfelder, Walter; Loureiro, Nuno; Ren, Yang; White, Anne; MIT/PPPL Collaboration
2017-10-01
Turbulent fluctuations on the electron gyro-radius length scale are thought to cause anomalous transport of electron energy in spherical tokamaks such as NSTX and MAST in some parametric regimes. In NSTX, electron-scale turbulence is studied through a combination of experimental measurements from a high-k scattering system and gyrokinetic simulations. Until now most comparisons between experiment and simulation of electron scale turbulence have been qualitative, with recent work expanding to more quantitative comparisons via synthetic diagnostic development. In this new work, we propose two alternate, complementary ways to perform a synthetic diagnostic using the gyrokinetic code GYRO. The first approach builds on previous work and is based on the traditional selection of wavenumbers using a wavenumber filter, for which a new wavenumber mapping was implemented for general axisymmetric geometry. A second alternate approach selects wavenumbers in real-space to compute the power spectra. These approaches are complementary, and recent results from both synthetic diagnostic approaches applied to NSTX plasmas will be presented. Work supported by U.S. DOE contracts DE-AC02-09CH11466 and DE-AC02-05CH11231.
GO-Bayes: Gene Ontology-based overrepresentation analysis using a Bayesian approach.
Zhang, Song; Cao, Jing; Kong, Y Megan; Scheuermann, Richard H
2010-04-01
A typical approach for the interpretation of high-throughput experiments, such as gene expression microarrays, is to produce groups of genes based on certain criteria (e.g. genes that are differentially expressed). To gain more mechanistic insights into the underlying biology, overrepresentation analysis (ORA) is often conducted to investigate whether gene sets associated with particular biological functions, for example, as represented by Gene Ontology (GO) annotations, are statistically overrepresented in the identified gene groups. However, the standard ORA, which is based on the hypergeometric test, analyzes each GO term in isolation and does not take into account the dependence structure of the GO-term hierarchy. We have developed a Bayesian approach (GO-Bayes) to measure overrepresentation of GO terms that incorporates the GO dependence structure by taking into account evidence not only from individual GO terms, but also from their related terms (i.e. parents, children, siblings, etc.). The Bayesian framework borrows information across related GO terms to strengthen the detection of overrepresentation signals. As a result, this method tends to identify sets of closely related GO terms rather than individual isolated GO terms. The advantage of the GO-Bayes approach is demonstrated with a simulation study and an application example.
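The standard ORA baseline that GO-Bayes improves upon is the hypergeometric (one-sided Fisher) test applied to each GO term in isolation. A minimal stdlib-only sketch of that baseline (the Bayesian borrowing across related terms is not reproduced here):

```python
from math import comb

def ora_pvalue(N, K, n, k):
    """One-sided hypergeometric ORA p-value: probability of observing at
    least k annotated genes in a selected group of n, when K of the N
    genes on the array carry the GO term."""
    denom = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / denom

# Toy example: 20 genes total, 5 annotated with a term, 10 differentially
# expressed, 4 of those annotated.
p = ora_pvalue(20, 5, 10, 4)
```

GO-Bayes replaces this per-term test with a Bayesian model over the GO graph, so that evidence from parents, children, and siblings shifts each term's posterior; the sketch only shows the isolated-term computation the abstract contrasts against.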
Complexity in Soil Systems: What Does It Mean and How Should We Proceed?
NASA Astrophysics Data System (ADS)
Faybishenko, B.; Molz, F. J.; Brodie, E.; Hubbard, S. S.
2015-12-01
The complex soil systems approach is needed fundamentally for the development of integrated, interdisciplinary methods to measure and quantify the physical, chemical and biological processes taking place in soil, and to determine the role of fine-scale heterogeneities. This presentation is aimed at a review of the concepts and observations concerning complexity and complex systems theory, including terminology, emergent complexity and simplicity, self-organization and a general approach to the study of complex systems using the Weaver (1948) concept of "organized complexity." These concepts are used to provide understanding of complex soil systems, and to develop experimental and mathematical approaches to soil microbiological processes. The results of numerical simulations, observations and experiments are presented that indicate the presence of deterministic chaotic dynamics in soil microbial systems. So what are the implications for the scientists who wish to develop mathematical models in the area of organized complexity or to perform experiments to help clarify an aspect of an organized complex system? The modelers have to deal with coupled systems having at least three dependent variables, and they have to forgo making linear approximations to nonlinear phenomena. The analogous rule for experimentalists is that they need to perform experiments that involve measurement of at least three interacting entities (variables depending on time, space, and each other). These entities could be microbes in soil penetrated by roots. If a process being studied in a soil affects the soil properties, like biofilm formation, then this effect has to be measured and included. The mathematical implications of this viewpoint are examined, and results of numerical solutions to a system of equations demonstrating deterministic chaotic behavior are also discussed using time series and the 3D strange attractors.
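The abstract's requirement of at least three coupled, nonlinearly interacting variables can be illustrated with the Lorenz system, the canonical minimal example of deterministic chaos and a 3-D strange attractor. This is a standard textbook system, not the soil-microbial model used in the study:

```python
def lorenz_step(x, y, z, dt=0.005, s=10.0, r=28.0, b=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations: three dependent
    variables, each coupled nonlinearly to the others."""
    return (x + dt * s * (y - x),
            y + dt * (x * (r - z) - y),
            z + dt * (x * y - b * z))

# Integrate from a generic initial condition; the trajectory remains
# bounded but never settles onto a fixed point or simple cycle.
state = (1.0, 1.0, 1.0)
trajectory = []
for _ in range(5000):
    state = lorenz_step(*state)
    trajectory.append(state)
```

Plotting `trajectory` in 3-D would show the familiar butterfly-shaped strange attractor; a production analysis would use a higher-order integrator (e.g. RK4) rather than forward Euler.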
Adaptation of a 3-D Quadrupole Ion Trap for Dipolar DC Collisional Activation
Prentice, Boone M.; Santini, Robert E.; McLuckey, Scott A.
2011-01-01
Means to allow for the application of a dipolar DC pulse to the end-cap electrodes of a three-dimensional (3-D) quadrupole ion trap for as short as a millisecond to as long as hundreds of milliseconds are described. The implementation of dipolar DC does not compromise the ability to apply AC waveforms to the end-cap electrodes at other times in the experiment. Dipolar DC provides a nonresonant means for ion acceleration by displacing ions from the center of the ion trap where they experience stronger rf electric fields, which increases the extent of micro-motion. The evolution of the product ion spectrum to higher generation products with time, as shown using protonated leucine enkephalin as a model protonated peptide, illustrates the broad-band nature of the activation. Dipolar DC activation is also shown to be effective as an ion heating approach in mimicking high amplitude short time excitation (HASTE)/pulsed Q dissociation (PQD) resonance excitation experiments that are intended to enhance the likelihood for observing low m/z products in ion trap tandem mass spectrometry. PMID:21953251
Cancer and beyond: the question of survivorship.
Breaden, K
1997-11-01
Today, more people are surviving cancer as a result of improved treatment and early diagnosis. In Australia, the 5-year survival rate for persons diagnosed with cancer is now approaching 50%. Although there is a growing population of cancer survivors, little is known about what surviving entails. Traditionally, a survivor has been defined as one who has been disease-free for more than 5 years. However, this definition takes into account neither the experience nor the process of survival, and the aim of this article is to document the process of surviving cancer as reflected in the experiences of cancer survivors. Using a method of hermeneutic phenomenology (as described by van Manen), the study draws on the stories of six women who, by their own definition, are surviving cancer. A discussion of themes has been structured according to the everyday experiences of living in a body and living in time. The women describe a survival process that includes: 'feeling whole again'; 'the body as the house of suspicion'; 'the future in question'; 'changes in time'; 'lucky to be alive'; and 'sharing the journey'.
Estimation of gene induction enables a relevance-based ranking of gene sets.
Bartholomé, Kilian; Kreutz, Clemens; Timmer, Jens
2009-07-01
In order to handle and interpret the vast amounts of data produced by microarray experiments, the analysis of sets of genes with a common biological functionality has been shown to be advantageous compared to single-gene analyses. Several statistical methods have been proposed to analyse the differential expression of gene sets in microarray experiments. However, most of these methods either require threshold values to be chosen for the analysis, or they need some reference set for the determination of significance. We present a method that estimates the number of differentially expressed genes in a gene set without requiring a threshold value for significance of genes. The method is self-contained (i.e., it does not require a reference set for comparison). In contrast to other methods, which are focused on significance, our approach emphasizes the relevance of the regulation of gene sets. The presented method measures the degree of regulation of a gene set and is a useful tool for comparing the induction of different gene sets and placing the results of microarray experiments into their biological context. An R package is available.
The road we travel: Māori experience of cancer.
Walker, Tai; Signal, Louise; Russell, Marie; Smiler, Kirsten; Tuhiwai-Ruru, Rawiri
2008-08-08
This research explores Maori experiences of cancer. It does so to shed light on the causes of cancer inequalities for Maori. The views of 44 Maori affected by cancer--including patients, survivors, and their whanau (extended families)--were gathered in five hui (focus groups) and eight interviews in the Horowhenua, Manawatu, and Tairawhiti districts of New Zealand. After initial analysis, a feedback hui was held to validate the findings. Maori identified effective providers of cancer services such as Maori health providers. They also identified positive and negative experiences with health professionals. The involvement of whanau in the cancer journey was viewed as highly significant as was a holistic approach to care. Participants had many suggestions for improvements to cancer services such as better resourcing of Maori providers, cultural competence training for all health workers, the use of systems 'navigators', and the inclusion of whanau in the cancer control continuum. The research identifies a range of health system, healthcare process, and patient level factors that contribute to inequalities in cancer for Maori. It also explores the role of racism as a root cause of these inequalities and calls for urgent action.
NASA Technical Reports Server (NTRS)
Bacon, Barton J.; Ostroff, Aaron J.
2000-01-01
This paper presents an approach to on-line control design for aircraft that have suffered either actuator failure, missing effector surfaces, surface damage, or any combination. The approach is based on a modified version of nonlinear dynamic inversion. The approach does not require a model of the baseline vehicle (effectors at zero deflection), but does require feedback of accelerations and effector positions. Implementation issues are addressed and the method is demonstrated on an advanced tailless aircraft. An experimental simulation analysis tool is used to directly evaluate the nonlinear system's stability robustness.
Non-Cartesian MRI Reconstruction With Automatic Regularization Via Monte-Carlo SURE
Weller, Daniel S.; Nielsen, Jon-Fredrik; Fessler, Jeffrey A.
2013-01-01
Magnetic resonance image (MRI) reconstruction from undersampled k-space data requires regularization to reduce noise and aliasing artifacts. Proper application of regularization, however, requires appropriate selection of the associated regularization parameters. In this work, we develop a data-driven regularization parameter adjustment scheme that minimizes an estimate (based on the principle of Stein’s unbiased risk estimate—SURE) of a suitable weighted squared-error measure in k-space. To compute this SURE-type estimate, we propose a Monte-Carlo scheme that extends our previous approach to inverse problems (e.g., MRI reconstruction) involving complex-valued images. Our approach depends only on the output of a given reconstruction algorithm and does not require knowledge of its internal workings, so it is capable of tackling a wide variety of reconstruction algorithms and nonquadratic regularizers, including total variation and those based on the ℓ1-norm. Experiments with simulated and real MR data indicate that the proposed approach is capable of providing near mean squared-error (MSE) optimal regularization parameters for single-coil undersampled non-Cartesian MRI reconstruction. PMID:23591478
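The Monte-Carlo divergence trick behind SURE-type parameter tuning can be sketched generically. This is not the authors' reconstruction code: the soft-threshold "denoiser", the sparse signal model, and all parameter values below are stand-ins, and a real MRI setting would replace them with a non-Cartesian reconstruction and its weighted k-space error measure.

```python
import numpy as np

def soft_threshold(y, lam):
    """Stand-in 'black box' reconstruction: a soft-thresholding denoiser."""
    return np.sign(y) * np.maximum(np.abs(y) - lam, 0.0)

def monte_carlo_sure(y, sigma, recon, eps=1e-3, rng=None):
    """SURE = ||f(y) - y||^2 - N*sigma^2 + 2*sigma^2 * div f(y),
    with the divergence estimated by one Monte-Carlo probe vector b;
    only the output of recon() is used, never its internals."""
    rng = rng or np.random.default_rng(0)       # same probe for every lam
    b = rng.standard_normal(y.size)
    fy = recon(y)
    div = b @ (recon(y + eps * b) - fy) / eps   # Monte-Carlo divergence
    return np.sum((fy - y) ** 2) - y.size * sigma ** 2 + 2 * sigma ** 2 * div

rng = np.random.default_rng(1)
x = np.zeros(2000); x[:100] = 5.0              # sparse ground truth
sigma = 1.0
y = x + sigma * rng.standard_normal(x.size)    # noisy observation
lams = np.linspace(0.0, 3.0, 31)
sure = [monte_carlo_sure(y, sigma, lambda z, l=l: soft_threshold(z, l)) for l in lams]
mse = [float(np.sum((soft_threshold(y, l) - x) ** 2)) for l in lams]
lam_sure = float(lams[int(np.argmin(sure))])   # data-driven parameter choice
```

Minimizing the SURE curve requires no access to the ground truth x, yet tracks the true-MSE minimizer.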
Optimal sensor placement for leak location in water distribution networks using genetic algorithms.
Casillas, Myrna V; Puig, Vicenç; Garza-Castañón, Luis E; Rosich, Albert
2013-11-04
This paper proposes a new sensor placement approach for leak location in water distribution networks (WDNs). The sensor placement problem is formulated as an integer optimization problem. The optimization criterion consists in minimizing the number of non-isolable leaks according to the isolability criteria introduced. Because of the large size and non-linear integer nature of the resulting optimization problem, genetic algorithms (GAs) are used as the solution approach. The obtained results are compared with a semi-exhaustive search method with higher computational effort, proving that the GA allows one to find near-optimal solutions with less computational load. Moreover, three ways of increasing the robustness of the GA-based sensor placement method have been proposed, using a time horizon analysis, distance-based scoring, and consideration of different leak sizes. A great advantage of the proposed methodology is that it does not depend on the isolation method chosen by the user, as long as it is based on leak sensitivity analysis. Experiments in two networks allow us to evaluate the performance of the proposed approach. PMID:24193099
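The paper's exact isolability criteria and GA configuration are not given in the abstract. The sketch below therefore assumes a simplified binary leak-sensitivity model, in which two leaks are isolable if their signatures differ on at least one chosen sensor, and a minimal GA (truncation selection, union crossover, single-gene mutation). All names and the toy network are invented for illustration.

```python
import random
from itertools import combinations

def non_isolable(signatures, chosen):
    """Count leak pairs whose sensitivity signatures coincide on the
    chosen sensors, i.e. pairs this placement cannot distinguish."""
    return sum(
        all(signatures[i][s] == signatures[j][s] for s in chosen)
        for i, j in combinations(range(len(signatures)), 2)
    )

def ga_placement(signatures, k, pop_size=40, gens=60, seed=0):
    """Choose k sensor positions minimizing the non-isolable pair count."""
    rng = random.Random(seed)
    n = len(signatures[0])
    pop = [tuple(sorted(rng.sample(range(n), k))) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: non_isolable(signatures, ind))
        survivors = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            child = rng.sample(sorted(set(a) | set(b)), k)   # crossover
            if rng.random() < 0.3:                           # mutation
                child[rng.randrange(k)] = rng.randrange(n)
            children.append(tuple(sorted(child)) if len(set(child)) == k else a)
        pop = survivors + children
    return min(pop, key=lambda ind: non_isolable(signatures, ind))

# Toy network: 4 leaks, 5 candidate sensors; sensors 2-4 respond to every
# leak and are uninformative, so {0, 1} is the only fully isolating pair.
signatures = [(0, 0, 1, 1, 1), (0, 1, 1, 1, 1), (1, 0, 1, 1, 1), (1, 1, 1, 1, 1)]
best = ga_placement(signatures, k=2)
```

On realistic networks the candidate set is far too large for exhaustive search, which is what motivates the GA in the paper.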
Millar, Ross
2013-01-01
The purpose of this paper is to present a study of how quality improvement tools and techniques are framed within healthcare settings. The paper employs an interpretive approach to understand how quality improvement tools and techniques are mobilised and legitimated. It does so using a case study of the NHS Modernisation Agency Improvement Leaders' Guides in England. Improvement Leaders' Guides were framed within a service improvement approach encouraging the use of quality improvement tools and techniques within healthcare settings. Their use formed part of enacting tools and techniques across different contexts. Whilst this enactment was believed to support the mobilisation of tools and techniques, the experience also illustrated the challenges in distributing such approaches. The paper provides an important contribution in furthering our understanding of framing the "social act" of quality improvement. Given the ongoing emphasis on quality improvement in health systems and the persistent challenges involved, it also provides important information for healthcare leaders globally in seeking to develop, implement or modify similar tools and distribute leadership within health and social care settings.
Verma, Prakash; Bartlett, Rodney J
2014-05-14
This paper's objective is to create a "consistent" mean-field-based Kohn-Sham (KS) density functional theory (DFT), meaning that the functional should not only provide good total energy properties, but the corresponding KS eigenvalues should also be accurate approximations to the vertical ionization potentials (VIPs) of the molecule, as the latter condition attests to the viability of the exchange-correlation potential (VXC). None of the prominently used DFT approaches shows these properties; the optimized-effective-potential-based ab initio dft does. A local, range-separated hybrid potential, cam-QTP-00, is introduced as the basis for a "consistent" KS DFT approach. The computed VIPs, as the negative of the KS eigenvalues, have a mean absolute error of 0.8 eV for an extensive set of molecular electron ionizations, including the core. Barrier heights, equilibrium geometries, and magnetic properties obtained from the potential are in good agreement with experiment. Similar accuracy with less computational effort can be achieved by using a non-variational global hybrid variant of the QTP-00 approach.
Taubert, Jessica; Parr, Lisa A
2011-01-01
All primates can recognize faces and do so by analyzing the subtle variation that exists between faces. Through a series of three experiments, we attempted to clarify the nature of second-order information processing in nonhuman primates. Experiment one showed that both chimpanzees (Pan troglodytes) and rhesus monkeys (Macaca mulatta) tolerate geometric distortions along the vertical axis, suggesting that information about absolute position of features does not contribute to accurate face recognition. Chimpanzees differed from monkeys, however, in that they were more sensitive to distortions along the horizontal axis, suggesting that when building a global representation of facial identity, horizontal relations between features are more diagnostic of identity than vertical relations. Two further experiments were performed to determine whether the monkeys were simply less sensitive to horizontal relations compared to chimpanzees or were instead relying on local features. The results of these experiments confirm that monkeys can utilize a holistic strategy when discriminating between faces regardless of familiarity. In contrast, our data show that chimpanzees, like humans, use a combination of holistic and local features when the faces are unfamiliar, but primarily holistic information when the faces become familiar. We argue that our comparative approach to the study of face recognition reveals the impact that individual experience and social organization has on visual cognition.
How does response inhibition influence decision making when gambling?
Stevens, Tobias; Brevers, Damien; Chambers, Christopher D; Lavric, Aureliu; McLaren, Ian P L; Mertens, Myriam; Noël, Xavier; Verbruggen, Frederick
2015-03-01
Recent research suggests that response inhibition training can alter impulsive and compulsive behavior. When stop signals are introduced in a gambling task, people not only become more cautious when executing their choice responses, they also prefer lower bets when gambling. Here, we examined how stopping motor responses influences gambling. Experiment 1 showed that the reduced betting in stop-signal blocks was not caused by changes in information sampling styles or changes in arousal. In Experiments 2a and 2b, people preferred lower bets when they occasionally had to stop their response in a secondary decision-making task but not when they were instructed to respond as accurately as possible. Experiment 3 showed that merely introducing trials on which subjects could not gamble did not influence gambling preferences. Experiment 4 demonstrated that the effect of stopping on gambling generalized to different populations. Further, 2 combined analyses suggested that the effect of stopping on gambling preferences was reliable but small. Finally, Experiment 5 showed that the effect of stopping on gambling generalized to a different task. On the basis of our findings and earlier research, we propose that the presence of stop signals influences gambling by reducing approach behavior and altering the motivational value of the gambling outcome. PMID:25559481
Relativity, entanglement and the physical reality of the photon
NASA Astrophysics Data System (ADS)
Tiwari, S. C.
2002-04-01
Recent experiments on the classic Einstein-Podolsky-Rosen (EPR) setting claim to test the compatibility between nonlocal quantum entanglement and the (special) theory of relativity. Confirmation of quantum theory has led to the interpretation that Einstein's image of physical reality for each photon in the EPR pair cannot be maintained. A detailed critique of two representative experiments is presented following the original EPR notion of local realism. It is argued that relativity does not enter into the picture; however, for the Bell-Bohm version of local realism in terms of hidden variables, such experiments are significant. Of the two alternatives, namely the incompleteness of quantum theory for describing an individual quantum system and the ensemble view, it is only the former that has been ruled out by the experiments. An alternative approach gives a statistical ensemble interpretation of the observed data, leading to the significant conclusion that these experiments do not deny the physical reality of the photon. After discussing the need for a photon model, a vortex structure is proposed based on the space-time invariant property of spin and pure gauge fields. To test the prime role of spin for photons and the angular-momentum interpretation of electromagnetic fields, experimental schemes feasible in modern laboratories are suggested.
Tobacco control, stigma, and public health: rethinking the relations.
Bayer, Ronald; Stuber, Jennifer
2006-01-01
The AIDS epidemic has borne witness to the terrible burdens imposed by stigmatization and to the way in which marginalization could subvert the goals of HIV prevention. Out of that experience, and propelled by the linkage of public health and human rights, came the commonplace assertion that stigmatization was a retrograde force. Yet, strikingly, the antitobacco movement has fostered a social transformation that involves the stigmatization of smokers. Does this transformation represent a troubling outcome of efforts to limit tobacco use and its associated morbidity and mortality; an ineffective, counterproductive, and moralizing approach that leads to a dead end; or a signal of public health achievement? If the latter is the case, are there unacknowledged costs?
A psychological model that integrates ethics in engineering education.
Magun-Jackson, Susan
2004-04-01
Ethics has become an increasingly important issue within engineering as the profession has become progressively more complex. The need to integrate ethics into an engineering curriculum is well documented, as education does not often sufficiently prepare engineers for the ethical conflicts they experience. Recent research indicates that there is great diversity in the way institutions approach the problem of teaching ethics to undergraduate engineering students; some schools require students to take general ethics courses from philosophical or religious perspectives, while others integrate ethics in existing engineering courses. The purpose of this paper is to propose a method to implement the integration of ethics in engineering education that is pedagogically based on Kohlberg's stage theory of moral development.
Weak correlations between local density and dynamics near the glass transition.
Conrad, J C; Starr, F W; Weitz, D A
2005-11-17
We perform experiments on two different dense colloidal suspensions with confocal microscopy to probe the relationship between local structure and dynamics near the glass transition. We calculate the Voronoi volume for our particles and show that this quantity is not a universal probe of glassy structure for all colloidal suspensions. We correlate the Voronoi volume to displacement and find that these quantities are only weakly correlated. We observe qualitatively similar results in a simulation of a polymer melt. These results suggest that the Voronoi volume does not predict dynamical behavior in experimental colloidal suspensions; a purely structural approach based on local single particle volume likely cannot describe the colloidal glass transition.
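Computing per-particle Voronoi volumes, the structural quantity tested in this study, is routine with SciPy. The sketch below is a 2-D toy (the study's confocal data are 3-D, where the same calls apply), with synthetic random points standing in for particle positions; it is not the authors' analysis pipeline.

```python
import numpy as np
from scipy.spatial import Voronoi, ConvexHull

# 200 random "particle" positions in a 10 x 10 box.
rng = np.random.default_rng(0)
points = rng.uniform(0.0, 10.0, size=(200, 2))
vor = Voronoi(points)

cell_volume = {}
for i, region_idx in enumerate(vor.point_region):
    region = vor.regions[region_idx]
    if not region or -1 in region:
        continue  # unbounded cell at the edge of the field of view: skip
    # ConvexHull.volume is the polygon area in 2-D (true volume in 3-D).
    cell_volume[i] = ConvexHull(vor.vertices[region]).volume
```

Correlating `cell_volume[i]` with each particle's subsequent displacement is then the weak-correlation test the abstract describes.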
Sharing stories: life history narratives in stuttering research.
Kathard, H
2001-01-01
The life experiences of people who stutter (PWS) have not featured prominently in research. Historically, the profession of speech and language therapy has amassed data and developed its theory of stuttering within a positivistic frame. As a consequence, the existing over-arching theory of research and practice does not engage holistically with the dynamic personal, socio-cultural and political contexts of the individual who stutters. Therefore a conceptual shift is required in ontology, epistemology and methodology underpinning the knowledge construction process. The use of the life history narratives as a research tool is promoted. An exploratory study of a single participant is presented to illuminate the methodological approach and emerging theoretical constructs.
Optimizing Mass Spectrometry Analyses: A Tailored Review on the Utility of Design of Experiments.
Hecht, Elizabeth S; Oberg, Ann L; Muddiman, David C
2016-05-01
Mass spectrometry (MS) has emerged as a tool that can analyze nearly all classes of molecules, with its scope rapidly expanding in the areas of post-translational modifications, MS instrumentation, and many others. Yet integration of novel analyte preparatory and purification methods with existing or novel mass spectrometers can introduce new challenges for MS sensitivity. The mechanisms that govern detection by MS are particularly complex and interdependent, including ionization efficiency, ion suppression, and transmission. Performance of both off-line and MS methods can be optimized separately or, when appropriate, simultaneously through statistical designs, broadly referred to as "design of experiments" (DOE). The following review provides a tutorial-like guide into the selection of DOE for MS experiments, the practices for modeling and optimization of response variables, and the available software tools that support DOE implementation in any laboratory. This review comes 3 years after the latest DOE review (Hibbert DB, 2012), which provided a comprehensive overview on the types of designs available and their statistical construction. Since that time, new classes of DOE, such as the definitive screening design, have emerged and new calls have been made for mass spectrometrists to adopt the practice. Rather than exhaustively cover all possible designs, we have highlighted the three most practical DOE classes available to mass spectrometrists. This review further differentiates itself by providing expert recommendations for experimental setup and defining DOE entirely in the context of three case-studies that highlight the utility of different designs to achieve different goals. A step-by-step tutorial is also provided.
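As a minimal concrete instance of a statistical design, the following sketch builds a two-level full factorial in three factors and reads main effects off as the difference between high-level and low-level response means. The factor names, response function, and coefficients are invented for illustration, not taken from the review's case studies.

```python
from itertools import product

# Coded levels (-1/+1) for a 2^3 full factorial in factors A, B, C.
design = [dict(zip("ABC", levels)) for levels in product((-1, 1), repeat=3)]

# Hypothetical noiseless response (e.g. signal intensity) for the 8 runs:
# y = 50 + 4A - 6B + 1.5AB, with factor C inert.
def response(run):
    return 50 + 4 * run["A"] - 6 * run["B"] + 1.5 * run["A"] * run["B"]

ys = [response(run) for run in design]

def main_effect(factor):
    """Mean response at the high level minus mean at the low level."""
    hi = [y for run, y in zip(design, ys) if run[factor] == 1]
    lo = [y for run, y in zip(design, ys) if run[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {f: main_effect(f) for f in "ABC"}
```

Because the design is balanced, the AB interaction averages out of each main effect, and the inert factor C is correctly estimated as having zero effect.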
Experience of the Paris Research Consortium Climate-Environment-Society
NASA Astrophysics Data System (ADS)
Joussaume, Sylvie; Pacteau, Chantal; Vanderlinden, Jean Paul
2016-04-01
It is now widely recognized that the complexity of climate change issues translates into a need for interdisciplinary approaches to science. Such approaches allow, first, a more comprehensive vision of climate change and, second, better-informed decision-making processes. However, willingness alone is rarely enough to implement interdisciplinarity. The purpose of this presentation is to mobilize reflexivity to revisit and analyze the experience of the Paris Consortium for Climate-Environment-Society. The French Consortium Climate-Environment-Society aims to develop, fund, and coordinate interdisciplinary research into climate change and its impacts on society and the environment. Launched in 2007, the consortium relies on the research expertise of 17 laboratories and federations in the Paris area working mainly in the fields of climatology, hydrology, ecology, health sciences, and the humanities and social sciences. For example, economists and climatologists have studied greenhouse gas emission scenarios compatible with climate stabilization goals. Historical records have provided knowledge about both past climate change and the vulnerability of societies. Some regions, such as the Mediterranean and the Sahel, are particularly vulnerable and already have to cope with water availability, agricultural production, and even health issues. One project showed that millet production in West Africa is expected to decline due to warming in a higher proportion than observed in recent decades. Climate change also raises many questions concerning health: the combined effects of warming and air quality, impacts on the production of pollens and allergies, and impacts on infectious diseases. All these issues call for approaches integrating different disciplines. Furthermore, climate change impacts many ecosystems which, in turn, affect its evolution.
Our experience shows that interdisciplinarity supposes, in order to take shape, the conjunction between programming choices, supporting this kind of approach, and autonomy given to experimenting with interdisciplinary practices. The interdisciplinary approach does not put itself in place and requires a collective reflection on the objectives and practices. Many tools exist to support this process, in particular to mature interdisciplinarity. This incubation period allows the various disciplines to learn to know each other and to build a common conceptual and methodological basis.
Issues for academic health centers to consider before implementing a balanced-scorecard effort.
Zelman, W N; Blazer, D; Gower, J M; Bumgarner, P O; Cancilla, L M
1999-12-01
Because of changes in the health care environment, it is likely that strategic planning and management will become much more important to academic health centers (AHCs) than in the past. One approach to strategic planning and management that is gaining considerable interest among health care organizations is the balanced scorecard. Based on a year's experience in examining this management tool, and on early implementation efforts, the authors critically evaluate the applicability of the balanced-scorecard approach at AHCs in relation to two fundamental questions: Does the decentralized nature of most AHCs mitigate the potential usefulness of the balanced-scorecard approach? Are the balanced scorecard's four perspectives (learning and growth; internal; customer; and financial) appropriate for AHCs, which are neither for-profit nor manufacturing organizations? The authors conclude that (1) the unique characteristics of AHCs may mitigate the full benefit of the balanced-scorecard approach, and (2) in cases where it is used, some key modifications must be made in the balanced-scorecard approach to account for those unique characteristics. For example, in a corporation, the key question from the financial perspective is "To succeed financially, how should we appear to our stockholders?" But in an AHC, this question must be revised to "What financial condition must we achieve to allow us to accomplish our mission?"
Verbal overshadowing of visual memories: some things are better left unsaid.
Schooler, J W; Engstler-Schooler, T Y
1990-01-01
It is widely believed that verbal processing generally improves memory performance. However, in a series of six experiments, verbalizing the appearance of previously seen visual stimuli impaired subsequent recognition performance. In Experiment 1, subjects viewed a videotape including a salient individual. Later, some subjects described the individual's face. Subjects who verbalized the face performed less well on a subsequent recognition test than control subjects who did not engage in memory verbalization. The results of Experiment 2 replicated those of Experiment 1 and further clarified the effect of memory verbalization by demonstrating that visualization does not impair face recognition. In Experiments 3 and 4 we explored the hypothesis that memory verbalization impairs memory for stimuli that are difficult to put into words. In Experiment 3 memory impairment followed the verbalization of a different visual stimulus: color. In Experiment 4 marginal memory improvement followed the verbalization of a verbal stimulus: a brief spoken statement. In Experiments 5 and 6 the source of verbally induced memory impairment was explored. The results of Experiment 5 suggested that the impairment does not reflect a temporary verbal set, but rather indicates relatively long-lasting memory interference. Finally, Experiment 6 demonstrated that limiting subjects' time to make recognition decisions alleviates the impairment, suggesting that memory verbalization overshadows but does not eradicate the original visual memory. This collection of results is consistent with a recoding interference hypothesis: verbalizing a visual memory may produce a verbally biased memory representation that can interfere with the application of the original visual memory.
NASA Astrophysics Data System (ADS)
Wardaya, P. D.; Noh, K. A. B. M.; Yusoff, W. I. B. W.; Ridha, S.; Nurhandoko, B. E. B.
2014-09-01
This paper discusses a new approach to investigating the seismic wave velocity of rock, specifically carbonates, as affected by their pore structures. While the conventional routine of seismic velocity measurement depends heavily on extensive laboratory experiments, the proposed approach adopts the digital rock physics view, which rests on numerical experiments. Thus, instead of using a core sample, we use a thin-section image of carbonate rock to measure the effective velocity of seismic waves travelling through it. In the numerical experiment, thin-section images act as the medium in which wave propagation is simulated. For the modeling, an advanced technique based on an artificial neural network was employed to build the velocity and density profiles, replacing the image's RGB pixel values with the seismic velocity and density of each rock constituent. Then, an ultrasonic wave was simulated propagating through the thin-section image using the finite-difference time-domain method, under the assumption of an acoustic, isotropic medium. Effective velocities were drawn from the recorded signal and compared to velocity modeling from the Wyllie time-average model and the Kuster-Toksoz rock physics model. To perform the modeling, image analysis routines were undertaken to quantify the pore aspect ratio, which is assumed to represent the rock's pore structure. In addition, the porosity and mineral fractions required for velocity modeling were also quantified using an integrated neural network and image analysis technique. It was found that Kuster-Toksoz gives a closer prediction to the measured velocity than the Wyllie time-average model. We also conclude that the Wyllie time-average model, which does not incorporate the pore structure parameter, deviates significantly for samples having more than 40% porosity.
Utilizing this approach we found a good agreement between numerical experiment and theoretically derived rock physics model for estimating the effective seismic wave velocity of rock.
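Of the two reference models compared above, the Wyllie time-average is simple enough to state in a few lines: 1/V = phi/V_fluid + (1 - phi)/V_matrix. Note that it contains no pore-aspect-ratio term, consistent with the observation that it degrades for high-porosity samples. The velocities below are generic illustrative values, not the paper's data.

```python
def wyllie_velocity(phi, v_fluid, v_matrix):
    """Wyllie time-average model: slownesses add in proportion to the
    volume fractions, 1/V = phi/V_fluid + (1 - phi)/V_matrix."""
    return 1.0 / (phi / v_fluid + (1.0 - phi) / v_matrix)

# Example: 20% porosity, water-filled pores (~1500 m/s) in a calcite
# matrix (~6500 m/s) -- illustrative end-member velocities.
v = wyllie_velocity(0.2, 1500.0, 6500.0)
```

Because only porosity enters the formula, two rocks with identical porosity but very different pore shapes get the same predicted velocity, which is exactly the limitation the pore-aspect-ratio analysis addresses.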
Retrieval attempts enhance learning, but retrieval success (versus failure) does not matter.
Kornell, Nate; Klein, Patricia Jacobs; Rawson, Katherine A
2015-01-01
Retrieving information from memory enhances learning. We propose a 2-stage framework to explain the benefits of retrieval. Stage 1 takes place as one attempts to retrieve an answer, which activates knowledge related to the retrieval cue. Stage 2 begins when the answer becomes available, at which point appropriate connections are strengthened and inappropriate connections may be weakened. This framework raises a basic question: Does it matter whether Stage 2 is initiated via successful retrieval or via an external presentation of the answer? To test this question, we asked participants to attempt retrieval and then randomly assigned items (which were equivalent otherwise) to be retrieved successfully or to be copied (i.e., not retrieved). Experiments 1, 2, 4, and 5 tested assumptions necessary for interpreting Experiments 3a, 3b, and 6. Experiments 3a, 3b, and 6 did not support the hypothesis that retrieval success produces more learning than does retrieval failure followed by feedback. It appears that retrieval attempts promote learning but retrieval success per se does not.
NASA Astrophysics Data System (ADS)
Jahn, J. M.; Denton, R. E.; Nose, M.; Bonnell, J. W.; Kurth, W. S.; Livadiotis, G.; Larsen, B.; Goldstein, J.
2016-12-01
Determining the total plasma density from ion data is essentially an impossible task for particle instruments. The lowest instrument energy threshold never includes the coldest particles (i.e., E_min > 0 eV), and any positive spacecraft charging—which is normal for a sunlit spacecraft—exacerbates the problem by shifting the detectable minimum energy to higher values. Traditionally, field line resonance measurements of ULF waves in the magnetosphere have been used to determine the mass loading of magnetic field lines. This approach delivers a reduced ion mass M that represents the mass ratio of all ions on a magnetic field line. For multi-species plasmas like the plasmasphere, this bounds the problem but does not provide a unique solution. To at least estimate partial densities using particle instruments, one traditionally performs fits to the measured particle distribution functions under the assumption that the underlying particle distributions are Maxwellian. Uncertainties in performing a fit aside, there is usually no possibility of detecting a possible bi-Maxwellian distribution in which one of the Maxwellians is very cold: the tail of such a distribution may fall completely below the low-energy threshold of the measurement. In this paper we present a different approach to determining the fractional temperatures Ti and densities ni in a multi-species plasma. First, we describe and demonstrate an approach to determining Ti and ni that does not require fitting but relies on the mathematical properties of the distribution functions. We apply our approach to Van Allen Probes measurements of the plasmaspheric H+, He+, and O+ distribution functions under the assumption that the particle distributions are Maxwellian. We compare our results to mass loading results from the Van Allen Probes field line resonance analyses (for composition) and to the total (electron) plasma density derived from the EFW and EMFISIS experiments. 
Then we expand our approach to allow for kappa distributions instead. While this introduces an additional degree of freedom and therefore requires fitting, our approach is still better constrained than the traditional Maxwell fitting and may hold the key to a better understanding of the true nature of plasmaspheric particle distributions.
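The fit-free, moment-based idea can be illustrated for a single-species Maxwellian in normalized units (m = k_B = 1). This is a generic sketch showing how density and temperature follow from integrals of the distribution function; it is not the authors' algorithm, and the numbers are invented.

```python
import numpy as np

# Maxwellian speed distribution in normalized units (m = k_B = 1):
# f(v) = n * (2*pi*T)**(-3/2) * exp(-v**2 / (2*T))
n_true, T_true = 5.0, 2.0
v = np.linspace(0.0, 50.0, 20001)
f = n_true * (2.0 * np.pi * T_true) ** -1.5 * np.exp(-v**2 / (2.0 * T_true))

# Moments of f recover density and temperature with no fitting at all.
dv = v[1] - v[0]
n_est = np.sum(4.0 * np.pi * v**2 * f) * dv              # zeroth moment: n
E_mean = np.sum(4.0 * np.pi * v**2 * f * 0.5 * v**2) * dv / n_est
T_est = 2.0 * E_mean / 3.0                               # <E> = (3/2) T
```

In practice the instrument only samples f above E_min, which is precisely why the low-energy part of these integrals is the hard part of the real problem.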
Workflow Management Systems for Molecular Dynamics on Leadership Computers
NASA Astrophysics Data System (ADS)
Wells, Jack; Panitkin, Sergey; Oleynik, Danila; Jha, Shantenu
Molecular Dynamics (MD) simulations play an important role in a range of disciplines from materials science to biophysical systems and account for a large fraction of cycles consumed on computing resources. Increasingly, science problems require the successful execution of "many" MD simulations as opposed to a single MD simulation. There is a need to provide scalable and flexible approaches to the execution of the workload. We present preliminary results on the Titan computer at the Oak Ridge Leadership Computing Facility that demonstrate a general capability to manage workload execution agnostic of a specific MD simulation kernel or execution pattern, and in a manner that integrates disparate grid-based and supercomputing resources. Our results build upon our extensive experience of distributed workload management in the high-energy physics ATLAS project using PanDA (the Production and Distributed Analysis system), coupled with recent conceptual advances in our understanding of workload management on heterogeneous resources. We discuss how we will generalize these initial capabilities towards a more production-level service on DOE leadership resources. This research is sponsored by US DOE/ASCR and used resources of the OLCF computing facility.
NASA Astrophysics Data System (ADS)
Minger, Mark Austin
Having fears and frustrations while studying science topics can lead to science anxiety for some individuals. For those who experience science learning anxiety, the reality is often poor performance, lowered self-esteem, anger, and avoidance of further science courses. Using an interpretive approach, this study captures the experiences of five self-reported science anxious students as they participate in an interdisciplinary science course at the University of Minnesota. A series of three in-depth interviews were conducted with five students who were enrolled in the "Our Changing Planet" course offered at the University of Minnesota. The interviews were transcribed verbatim, coded, and analyzed thematically. Four major themes emerged from the interviews. Two of the themes involve the realities of being a science anxious student. These focus on participants' experiences of feeling frustrated, anxious and incompetent when studying both math and science; and the experiences of trying to learn science content that does not seem relevant to them. The last two themes highlight the participants' perceptions of their experiences during the "Our Changing Planet" course, including how the course seemed different from previous science courses as well as their learning experiences in cooperative groups. After presenting the themes, with supporting quotations, each theme is linked to the related literature. The essence of the participants' science anxiety experiences is presented and practical implications regarding science anxious students are discussed. Finally, insights gained and suggestions for further research are provided.
2013-01-01
Background An inverse relationship between experience and risk of injury has been observed in many occupations. Due to statistical challenges, however, it has been difficult to characterize the role of experience on the hazard of injury. In particular, because the time observed up to injury is equivalent to the amount of experience accumulated, the baseline hazard of injury becomes the main parameter of interest, excluding Cox proportional hazards models as applicable methods for consideration. Methods Using a data set of 81,301 hourly production workers of a global aluminum company at 207 US facilities, we compared competing parametric models for the baseline hazard to assess whether experience affected the hazard of injury at hire and after later job changes. Specific models considered included the exponential, Weibull, and two (a hypothesis-driven and a data-driven) two-piece exponential models to formally test the null hypothesis that experience does not impact the hazard of injury. Results We highlighted the advantages of our comparative approach and the interpretability of our selected model: a two-piece exponential model that allowed the baseline hazard of injury to change with experience. Our findings suggested a 30% increase in the hazard in the first year after job initiation and/or change. Conclusions Piecewise exponential models may be particularly useful in modeling risk of injury as a function of experience and have the additional benefit of interpretability over other similarly flexible models. PMID:23841648
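A minimal numpy sketch of the selected model class, with illustrative hazards (30% higher in the first "year", echoing the abstract's finding) and no censoring; the closed-form MLE for a piecewise-constant hazard is simply events divided by person-time accumulated within each piece.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate time-to-injury with a higher hazard in the first year of
# experience (h1) than afterwards (h2); cut point c = 1 year.
h1, h2, c, n = 1.3, 1.0, 1.0, 200_000
t_early = rng.exponential(1.0 / h1, n)
t = np.where(t_early < c, t_early, c + rng.exponential(1.0 / h2, n))

# Closed-form MLE for a two-piece exponential model:
# hazard in each piece = events / person-time within that piece.
exp1 = np.minimum(t, c)            # person-time spent in [0, c)
d1 = np.sum(t < c)                 # events in [0, c)
exp2 = np.maximum(t - c, 0.0)      # person-time spent in [c, inf)
d2 = np.sum(t >= c)                # events in [c, inf)
h1_hat = d1 / exp1.sum()
h2_hat = d2 / exp2.sum()
print(h1_hat, h2_hat)              # close to 1.3 and 1.0
```

The interpretability claimed in the abstract is visible here: each fitted parameter is directly a hazard rate for an experience interval, so the first-year excess is just the ratio `h1_hat / h2_hat`.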
Selection as a learning experience: an exploratory study.
de Visser, Marieke; Laan, Roland F; Engbers, Rik; Cohen-Schotanus, Janke; Fluit, Cornelia
2018-01-01
Research on selection for medical school does not explore selection as a learning experience, despite growing attention to the learning effects of assessment in general. Insight into the learning effects allows us to take advantage of selection as an inclusive part of medical students' learning process to become competent professionals. The aims of this study at Radboud University Medical Center, the Netherlands, were 1) to determine whether students have learning experiences in the selection process, and, if so, what experiences; and 2) to understand what students need in order to utilize the learning effects of the selection process at the start of the formal curriculum. We used focus groups to interview 30 students admitted in 2016 about their learning experiences in the selection process. Thematic analysis was used to explore the outcomes of the interviews and to define relevant themes. In the selection process, students learned about the curriculum, themselves, their relation to others, and the profession they had been selected to enter, although this was not explicitly perceived as learning. Students needed a connection between selection and the curriculum as well as feedback to be able to really use their learning experiences for their further development. Medical school selection qualifies as a learning experience, and students as well as medical schools can take advantage of this. We recommend a careful design of the selection procedure, integrating relevant selection learning experiences into the formal curriculum, providing feedback and explicitly approaching the selection and the formal curriculum as interconnected contributors to students' development.
NASA Astrophysics Data System (ADS)
Azmi, Nur Iffah Mohamed; Arifin Mat Piah, Kamal; Yusoff, Wan Azhar Wan; Romlay, Fadhlur Rahman Mohd
2018-03-01
A controller that uses PID parameters requires a good tuning method in order to improve the control system performance. PID tuning methods fall into two categories: classical methods and artificial intelligence methods. The particle swarm optimization algorithm (PSO) is one of the artificial intelligence methods. Previously, researchers had integrated PSO algorithms into the PID parameter tuning process. This research aims to improve the PSO-PID tuning algorithms by integrating the tuning process with the Variable Weight Grey-Taguchi Design of Experiment (DOE) method. This is done by conducting the DOE on the two PSO optimizing parameters: the particle velocity limit and the weight distribution factor. Computer simulations and physical experiments were conducted using the proposed PSO-PID with the Variable Weight Grey-Taguchi DOE and the classical Ziegler-Nichols methods, both implemented on a hydraulic positioning system. Simulation results show that the proposed PSO-PID with the Variable Weight Grey-Taguchi DOE reduced the rise time by 48.13% and the settling time by 48.57% compared to the Ziegler-Nichols method. Furthermore, the physical experiment results also show that the proposed PSO-PID with the Variable Weight Grey-Taguchi DOE tuning method responds better than Ziegler-Nichols tuning. In conclusion, this research has improved the PSO-PID parameters by applying the PSO-PID algorithm together with the Variable Weight Grey-Taguchi DOE method as a tuning method in the hydraulic positioning system.
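A minimal PSO-PID sketch on a first-order plant (not the paper's hydraulic system): the two factors the Grey-Taguchi DOE would tune (the particle velocity limit and the weight factors) appear below as the constants `v_max`, `w`, `c1`, `c2`; the DOE layer itself is not implemented, and all numbers are illustrative assumptions.

```python
import numpy as np

def pid_cost(gains, dt=0.01, t_end=5.0, tau=1.0):
    """Integral of |error| for a unit-step response of a PID loop around
    a first-order plant tau*dy/dt = -y + u (explicit Euler)."""
    kp, ki, kd = gains
    y = integ = prev_e = 0.0
    cost = 0.0
    for _ in range(int(t_end / dt)):
        e = 1.0 - y
        integ += e * dt
        u = kp * e + ki * integ + kd * (e - prev_e) / dt
        prev_e = e
        y += dt * (-y + u) / tau
        if abs(y) > 1e6:                 # diverged: heavy penalty
            return 1e9
        cost += abs(e) * dt
    return cost

rng = np.random.default_rng(1)
lo, hi = np.array([0.0, 0.0, 0.0]), np.array([20.0, 20.0, 0.5])
v_max = 0.2 * (hi - lo)                  # particle velocity limit (a DOE factor)
w, c1, c2 = 0.7, 1.5, 1.5                # inertia and weight factors (DOE factors)

x = rng.uniform(lo, hi, (20, 3))         # particles = candidate (kp, ki, kd)
v = np.zeros_like(x)
pbest = x.copy()
pbest_cost = np.array([pid_cost(p) for p in x])
g = pbest[np.argmin(pbest_cost)]
g_cost0 = pbest_cost.min()               # best cost before optimization

for _ in range(40):
    r1, r2 = rng.random(x.shape), rng.random(x.shape)
    v = np.clip(w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x), -v_max, v_max)
    x = np.clip(x + v, lo, hi)
    cost = np.array([pid_cost(p) for p in x])
    better = cost < pbest_cost
    pbest[better], pbest_cost[better] = x[better], cost[better]
    g = pbest[np.argmin(pbest_cost)]

print("best gains:", g, "IAE:", pbest_cost.min())
```

A Grey-Taguchi DOE would wrap this whole loop, running it at designed combinations of `v_max` and the weight factors and picking the combination with the best (grey-relational) response.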
Systematic approach to verification and validation: High explosive burn models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Menikoff, Ralph; Scovel, Christina A.
2012-04-16
Most material models used in numerical simulations are based on heuristics and empirically calibrated to experimental data. For a specific model, key questions are determining its domain of applicability and assessing its relative merits compared to other models. Answering these questions should be a part of model verification and validation (V and V). Here, we focus on V and V of high explosive models. Typically, model developers implement their model in their own hydro code and use different sets of experiments to calibrate model parameters. Rarely can one find in the literature simulation results for different models of the same experiment. Consequently, it is difficult to assess objectively the relative merits of different models. This situation results in part from the fact that experimental data are scattered through the literature (articles in journals and conference proceedings) and that the printed literature does not allow the reader to obtain data from a figure in the electronic form needed to make detailed comparisons among experiments and simulations. In addition, it is very time consuming to set up and run simulations to compare different models over sufficiently many experiments to cover the range of phenomena of interest. The first difficulty could be overcome if the research community were to support an online web-based database. The second difficulty can be greatly reduced by automating procedures to set up and run simulations of similar types of experiments. Moreover, automated testing would be greatly facilitated if the data files obtained from a database were in a standard format that contained key experimental parameters as meta-data in a header to the data file. To illustrate our approach to V and V, we have developed a high explosive database (HED) at LANL. It now contains a large number of shock initiation experiments.
Utilizing the header information in a data file from HED, we have written scripts to generate an input file for a hydro code, run a simulation, and generate a comparison plot showing simulated and experimental velocity gauge data. These scripts are then applied to several series of experiments and to several HE burn models. The same systematic approach is applicable to other types of material models, for example, equation of state models and material strength models.
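A sketch of the header-driven automation idea. The header keys and the input-deck syntax below are hypothetical (the real HED format and hydro-code input are not specified here); the point is only that machine-readable meta-data in the header is enough to generate a simulation input automatically.

```python
# Hypothetical example data-file header; '#' lines carry key = value
# meta-data, and the data section follows.
HEADER = """\
# explosive = PBX-9502
# density   = 1.890    # g/cc
# impact_velocity = 0.95   # km/s
# gauge_positions = 0.0, 1.0, 2.0   # mm
data follows...
"""

def parse_header(text):
    """Collect key = value pairs from leading '#' comment lines."""
    meta = {}
    for line in text.splitlines():
        if not line.startswith("#"):
            break                        # header ends at first data line
        key, sep, rest = line.lstrip("# ").partition("=")
        if sep:
            meta[key.strip()] = rest.split("#")[0].strip()
    return meta

def make_input_deck(meta):
    """Fill a (made-up) hydro-code input template from header meta-data."""
    return (f"material {meta['explosive']} rho={meta['density']}\n"
            f"piston v={meta['impact_velocity']}\n"
            f"tracers {meta['gauge_positions']}\n")

meta = parse_header(HEADER)
print(make_input_deck(meta))
```

With such a parser, the same script can be looped over every file in a series of experiments, which is precisely what makes the automated comparison plots feasible.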
Displaying Sensed Tactile Cues with a Fingertip Haptic Device.
Pacchierotti, Claudio; Prattichizzo, Domenico; Kuchenbecker, Katherine J
2015-01-01
Telerobotic systems enable humans to explore and manipulate remote environments for applications such as surgery and disaster response, but few such systems provide the operator with cutaneous feedback. This article presents a novel approach to remote cutaneous interaction; our method is compatible with any fingertip tactile sensor and any mechanical tactile display device, and it does not require a position/force or skin deformation model. Instead, it directly maps the sensed stimuli to the best possible input commands for the device's motors using a data set recorded with the tactile sensor inside the device. As a proof of concept, we considered a haptic system composed of a BioTac tactile sensor, in charge of measuring contact deformations, and a custom 3-DoF cutaneous device with a flat contact platform, in charge of applying deformations to the user's fingertip. To validate the proposed approach and discover its inherent tradeoffs, we carried out two remote tactile interaction experiments. The first one evaluated the error between the tactile sensations registered by the BioTac in a remote environment and the sensations created by the cutaneous device for six representative tactile interactions and 27 variations of the display algorithm. The normalized average errors in the best condition were 3.0 percent of the BioTac's full 12-bit scale. The second experiment evaluated human subjects' experiences for the same six remote interactions and eight algorithm variations. The average subjective rating for the best algorithm variation was 8.2 out of 10, where 10 is best.
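A minimal sketch of the model-free, data-driven mapping: look up the recorded motor command whose in-device sensation best matches the remotely sensed stimulus. The dimensions and the linear synthetic "sensor" below are assumptions for illustration, not the BioTac's characteristics, and the real method's refinements (interpolation, algorithm variations) are omitted.

```python
import numpy as np

rng = np.random.default_rng(2)

# Offline: record pairs (motor command sent to the device, sensation the
# tactile sensor measured while mounted inside the device).
commands = rng.uniform(-1.0, 1.0, (500, 3))      # 3-DoF platform inputs
sensations = commands @ rng.normal(size=(3, 19)) \
             + 0.01 * rng.normal(size=(500, 19)) # synthetic sensor readings

def best_command(remote_reading):
    """Map a remotely sensed tactile vector directly to the recorded
    motor command whose in-device sensation is closest; no force or
    skin-deformation model is involved."""
    d = np.linalg.norm(sensations - remote_reading, axis=1)
    return commands[np.argmin(d)]

# Querying with a recorded sensation recovers its own command exactly.
print(best_command(sensations[42]))
```

The tradeoff probed in the paper's experiments lives in this lookup: denser recordings and smarter matching reduce the rendering error but cost memory and time.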
Maximization, learning, and economic behavior
Erev, Ido; Roth, Alvin E.
2014-01-01
The rationality assumption that underlies mainstream economic theory has proved to be a useful approximation, despite the fact that systematic violations to its predictions can be found. That is, the assumption of rational behavior is useful in understanding the ways in which many successful economic institutions function, although it is also true that actual human behavior falls systematically short of perfect rationality. We consider a possible explanation of this apparent inconsistency, suggesting that mechanisms that rest on the rationality assumption are likely to be successful when they create an environment in which the behavior they try to facilitate leads to the best payoff for all agents on average, and most of the time. Review of basic learning research suggests that, under these conditions, people quickly learn to maximize expected return. This review also shows that there are many situations in which experience does not increase maximization. In many cases, experience leads people to underweight rare events. In addition, the current paper suggests that it is convenient to distinguish between two behavioral approaches to improve economic analyses. The first, and more conventional approach among behavioral economists and psychologists interested in judgment and decision making, highlights violations of the rational model and proposes descriptive models that capture these violations. The second approach studies human learning to clarify the conditions under which people quickly learn to maximize expected return. The current review highlights one set of conditions of this type and shows how the understanding of these conditions can facilitate market design. PMID:25024182
Imitative and Direct Learning as Interacting Factors in Life History Evolution.
Bullinaria, John A
2017-01-01
The idea that lifetime learning can have a significant effect on life history evolution has recently been explored using a series of artificial life simulations. These involved populations of competing individuals evolving by natural selection to learn to perform well on simplified abstract tasks, with the learning consisting of identifying regularities in their environment. In reality, there is more to learning than that type of direct individual experience, because it often includes a substantial degree of social learning that involves various forms of imitation of what other individuals have learned before them. This article rectifies that omission by incorporating memes and imitative learning into revised versions of the previous approach. To do this reliably requires formulating and testing a general framework for meme-based simulations that will enable more complete investigations of learning as a factor in any life history evolution scenarios. It does that by simulating imitative information transfer in terms of memes being passed between individuals, and developing a process for merging that information with the (possibly inconsistent) information acquired by direct experience, leading to a consistent overall body of learning. The proposed framework is tested on a range of learning variations and a representative set of life history factors to confirm the robustness of the approach. The simulations presented illustrate the types of interactions and tradeoffs that can emerge, and indicate the kinds of species-specific models that could be developed with this approach in the future.
Multi-level deep supervised networks for retinal vessel segmentation.
Mo, Juan; Zhang, Lei
2017-12-01
Changes in the appearance of retinal blood vessels are an important indicator for various ophthalmologic and cardiovascular diseases, including diabetes, hypertension, arteriosclerosis, and choroidal neovascularization. Vessel segmentation from retinal images is very challenging because of low blood vessel contrast, intricate vessel topology, and the presence of pathologies such as microaneurysms and hemorrhages. To overcome these challenges, we propose a neural network-based method for vessel segmentation. A deep supervised fully convolutional network is developed by leveraging multi-level hierarchical features of the deep networks. To improve the discriminative capability of features in lower layers of the deep network and guide the gradient back propagation to overcome gradient vanishing, deep supervision with auxiliary classifiers is incorporated in some intermediate layers of the network. Moreover, the transferred knowledge learned from other domains is used to alleviate the issue of insufficient medical training data. The proposed approach does not rely on hand-crafted features and needs no problem-specific preprocessing or postprocessing, which reduces the impact of subjective factors. We evaluate the proposed method on three publicly available databases, the DRIVE, STARE, and CHASE_DB1 databases. Extensive experiments demonstrate that our approach achieves better or comparable performance to state-of-the-art methods with a much faster processing speed, making it suitable for real-world clinical applications. The results of cross-training experiments demonstrate its robustness with respect to the training set. The proposed approach segments retinal vessels accurately with a much faster processing speed and can be easily applied to other biomedical segmentation tasks.
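A toy numpy sketch of the deep-supervision mechanism, assuming a two-layer sigmoid network on XOR rather than the paper's retinal architecture: an auxiliary classifier reads the hidden layer directly, so its loss gradient reaches the first-layer weights without passing through the top layer, which is how deep supervision counteracts vanishing gradients.

```python
import numpy as np

rng = np.random.default_rng(3)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

# XOR toy data standing in for a segmentation task.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0.0, 1.0, 1.0, 0.0])

H = 8
W1, b1 = rng.normal(0, 1, (2, H)), np.zeros(H)
w2, b2 = rng.normal(0, 1, H), 0.0
wa, ba = rng.normal(0, 1, H), 0.0      # auxiliary head on the hidden layer
lam, lr = 0.3, 0.5                     # auxiliary loss weight, step size

def total_loss():
    h = sigmoid(X @ W1 + b1)
    bce = lambda p: -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))
    return bce(sigmoid(h @ w2 + b2)) + lam * bce(sigmoid(h @ wa + ba))

loss0 = total_loss()
for _ in range(3000):
    h = sigmoid(X @ W1 + b1)
    p_main = sigmoid(h @ w2 + b2)
    p_aux = sigmoid(h @ wa + ba)
    d_main = (p_main - y) / len(y)     # dL/dz for sigmoid + BCE
    d_aux = lam * (p_aux - y) / len(y)
    # Hidden-layer gradient gets a direct contribution from BOTH heads.
    dh = np.outer(d_main, w2) + np.outer(d_aux, wa)
    dz1 = dh * h * (1 - h)
    w2 -= lr * h.T @ d_main
    b2 -= lr * d_main.sum()
    wa -= lr * h.T @ d_aux
    ba -= lr * d_aux.sum()
    W1 -= lr * X.T @ dz1
    b1 -= lr * dz1.sum(axis=0)

loss1 = total_loss()
print(loss0, "->", loss1)
```

In a deep fully convolutional network the same wiring is repeated at several intermediate layers, with each auxiliary classifier discarded at inference time.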
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, T.N.
The Global Environmental Technology Enterprise (GETE) was conceived to develop and implement strategies to facilitate the commercialization of innovative, cost-effective Department of Energy (DOE)-developed environmental technologies. These strategies are needed to aid DOE's clean-up mission; to break down barriers to commercialization; and to build partnerships between the federal government and private industry in order to facilitate the development and use of innovative environmental technologies.
Improvement of Automated POST Case Success Rate Using Support Vector Machines
NASA Technical Reports Server (NTRS)
Zwack, Matthew R.; Dees, Patrick D.
2017-01-01
During early conceptual design of complex systems, concept down selection can have a large impact upon program life-cycle cost. Therefore, any concepts selected during early design will inherently commit program costs and affect the overall probability of program success. For this reason it is important to consider as large a design space as possible in order to better inform the down selection process. For conceptual design of launch vehicles, trajectory analysis and optimization often presents the largest obstacle to evaluating large trade spaces. This is due to the sensitivity of the trajectory discipline to changes in all other aspects of the vehicle design. Small deltas in the performance of other subsystems can result in relatively large fluctuations in the ascent trajectory because the solution space is non-linear and multi-modal [1]. In order to help capture large design spaces for new launch vehicles, the authors have performed previous work seeking to automate the execution of the industry standard tool, Program to Optimize Simulated Trajectories (POST). This work initially focused on implementation of analyst heuristics to enable closure of cases in an automated fashion, with the goal of applying the concepts of design of experiments (DOE) and surrogate modeling to enable near instantaneous throughput of vehicle cases [2]. Additional work was then completed to improve the DOE process by utilizing a graph theory based approach to connect similar design points [3]. The conclusion of the previous work illustrated the utility of the graph theory approach for completing a DOE through POST. However, this approach was still dependent upon the use of random repetitions to generate seed points for the graph. As noted in [3], only 8% of these random repetitions resulted in converged trajectories. This ultimately affects the ability of the random reps method to confidently approach the global optima for a given vehicle case in a reasonable amount of time. 
With only an 8% pass rate, tens or hundreds of thousands of reps may be needed to be confident that the best repetition is at least close to the global optimum. However, typical design study time constraints require that fewer repetitions be attempted, sometimes resulting in seed points that have only a handful of successful completions. If a small number of successful repetitions are used to generate a seed point, the graph method may inherit some inaccuracies as it chains DOE cases from the non-global-optimal seed points. This creates inherent noise in the graph data, which can limit the accuracy of the resulting surrogate models. For this reason, the goal of this work is to improve the seed point generation method and ultimately the accuracy of the resulting POST surrogate model. The work focuses on increasing the case pass rate for seed point generation.
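The screening idea can be sketched on synthetic data: train a classifier on past repetitions, then run only the candidates it predicts will converge, raising the pass rate among attempted runs. For a dependency-light sketch a logistic classifier stands in for the SVM used in the study, and the "convergence" rule, features, and rates below are invented.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic stand-in: 3 normalized trajectory/vehicle inputs; a repetition
# "converges" when a hidden linear score is positive (unknown to the tool).
w_true, b_true = np.array([2.0, -1.5, 1.0]), -1.2
X = rng.uniform(-1, 1, (2000, 3))
converged = (X @ w_true + b_true > 0).astype(float)
base_rate = converged.mean()           # analogue of the observed low pass rate

# Train a logistic classifier on the past repetitions (an SVM plays this
# role in the actual study).
w, b = np.zeros(3), 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    g = (p - converged) / len(X)
    w -= 5.0 * X.T @ g
    b -= 5.0 * g.sum()

# Screen fresh candidate repetitions: only "run" those predicted to pass.
cand = rng.uniform(-1, 1, (2000, 3))
keep = 1 / (1 + np.exp(-(cand @ w + b))) > 0.5
screened_rate = ((cand @ w_true + b_true > 0)[keep]).mean()
print(base_rate, screened_rate)        # screened rate is much higher
```

The real problem is of course not linearly separable, which is one motivation for a kernel method such as an SVM over this plain linear sketch.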
Presynaptic Inputs to Any CNS Projection Neuron Identified by Dual Recombinant Virus Infection
Bráz, João M.; Wang, Fan; Basbaum, Allan I.
2015-01-01
Although neuroanatomical tracing studies have defined the origin and targets of major projection neurons (PN) of the central nervous system (CNS), there is much less information about the circuits that influence these neurons. Recently, genetic approaches that use Cre recombinase-dependent viral vectors have greatly facilitated such circuit analysis, but these tracing approaches are limited by the availability of Cre-expressing mouse lines and the difficulty in restricting Cre expression to discrete regions of the CNS. Here, we illustrate an alternative approach to drive Cre expression specifically in defined subsets of CNS projection neurons, so as to map both direct and indirect presynaptic inputs to these cells. The method involves a combination of Cre-dependent transneuronal viral tracers that can be used in the adult and that does not require genetically modified mice. To trigger Cre-expression we inject a Cre-expressing adenovirus that is retrogradely transported to the projection neurons of interest. The region containing the retrogradely labeled projection neurons is next injected with Cre-dependent pseudorabies or rabies vectors, which results in labeling of poly- and monosynaptic neuronal inputs, respectively. In proof-of-concept experiments, we used this novel tracing system to study the circuits that engage projection neurons of the superficial dorsal horn of the spinal cord and trigeminal nucleus caudalis, neurons of the parabrachial nucleus of the dorsolateral pons that project to the amygdala and cortically-projecting neurons of the lateral geniculate nucleus. Importantly, because this dual viral tracing method does not require genetically derived Cre-expressing mouse lines, inputs to almost any projection system can be studied and the analysis can be performed in larger animals, such as the rat. PMID:26470056
Excited helium under high pressures in the bulk and in nanobubbles
NASA Astrophysics Data System (ADS)
Pyper, N. C.; Naginey, T. C.; Nellist, P. D.; Whelan, Colm T.
2017-08-01
We systematically investigate the effects of intense pressures on the excitation energies of helium trapped in bubbles in order to deepen our understanding of the fundamental physics of atoms in extreme conditions. The ? excitation energy of a confined helium atom is known to differ from that of a free atom: it is greater both in the bulk liquid or solid state and in a bubble confined in a metallic matrix. We compare calculations for the energy shift with both laboratory experiments for bulk systems and results derived from scanning transmission electron microscope (STEM) studies of helium nanobubbles embedded in different matrices. We find excellent agreement between our calculations and the latest extensive measurements in the bulk. However, we find significant discrepancies when we compare with results deduced using the 'standard' approach for analysing STEM data. Here, we show that the scattering matrix element determining the intensity of this excitation in a STEM experiment is significantly affected by the same environmental factors that shift the excitation energy. Consequently, there is a serious theoretical inconsistency in the way the STEM results are calculated, in that the 'standard' approach depends on a supposedly known ? scattering cross section, whereas we show here that this cross section is itself dependent on the environment. Correcting for this inconsistency does not, in itself, improve agreement.
Gupta, Shweta; Kesarla, Rajesh; Chotai, Narendra; Misra, Ambikanandan
2017-01-01
The nonnucleoside reverse transcriptase inhibitors, used for the treatment of HIV infections, are reported to have low bioavailability pertaining to high first-pass metabolism, high protein binding, and enzymatic metabolism. They also show low permeability across blood brain barrier. The CNS is reported to be the most important HIV reservoir site. In the present study, solid lipid nanoparticles of efavirenz were prepared with the objective of providing increased permeability and protection of drug due to biocompatible lipidic content and nanoscale size and thus developing formulation having potential for enhanced bioavailability and brain targeting. Solid lipid nanoparticles were prepared by high pressure homogenization technique using a systematic approach of design of experiments (DoE) and evaluated for particle size, polydispersity index, zeta potential, and entrapment efficiency. Particles of average size 108.5 nm having PDI of 0.172 with 64.9% entrapment efficiency were produced. Zeta potential was found to be −21.2 mV and the formulation was found stable. The in-vivo pharmacokinetic studies revealed increased concentration of the drug in brain, as desired, when administered through intranasal route indicating its potential for an attempt towards complete eradication of HIV and cure of HIV-infected patients. PMID:28243600
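The DoE formulation step can be sketched as generating a factorial run matrix over process factors. The factors and levels below are hypothetical illustrations; the abstract does not list the study's actual DoE variables.

```python
from itertools import product

# Hypothetical factors/levels for a solid-lipid-nanoparticle process;
# a real study would choose these from screening experiments.
factors = {
    "lipid_pct":      [2.5, 5.0, 7.5],
    "surfactant_pct": [1.0, 2.0],
    "pressure_bar":   [500, 1000, 1500],
}

# Full-factorial design: every combination of levels becomes one run.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))          # 3 * 2 * 3 = 18 runs
for run in runs[:3]:
    print(run)
```

Each run would then be executed and its responses (particle size, PDI, zeta potential, entrapment efficiency) fitted against the factor levels to locate the optimum.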
Llewellyn, Henry; Low, Joe; Smith, Glenn; Hopkins, Katherine; Burns, Aine; Jones, Louise
2014-08-01
Chronic and life-threatening conditions are widely thought to shatter the lives of those affected. In this article, we examine the accounts of 19 older people diagnosed with late stage chronic kidney disease who declined dialysis. Accounts were collected through in-depth interview in the United Kingdom (March-November, 2010). Drawing on a phenomenological approach, we focus particularly on the embodied and lived experience of the condition and on how participants constructed treatment modalities and approached treatment choice. We look toward contemporary elaborations of the conceptual framework of biographical disruption to illustrate how participants managed to contain the intrusion of illness and maintain continuity in their lives. We argue that three interactive phenomena mitigated the potential for disruption and allowed participants to maintain continuity: (a) the framing of illness as "old age"; (b) the prior experience of serious illness; and (c) the choice of the treatment with the least potential for disruption. We conclude that a diagnosis of chronic illness in late life does not inevitably shatter lives or engender biographical disruption. Instead, people are able to construct continuity owing to complex narrative interpretations of diagnosis, sensation and treatment choices. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Laun, Frederik B.; Demberg, Kerstin; Nagel, Armin M.; Uder, Micheal; Kuder, Tristan A.
2017-11-01
Nuclear magnetic resonance (NMR) diffusion measurements can be used to probe porous structures or biological tissues by means of the random motion of water molecules. The short-time expansion of the diffusion coefficient in powers of sqrt(t), where t is the diffusion time related to the duration of the diffusion-weighting magnetic field gradient profile, is universally connected to structural parameters of the boundaries restricting the diffusive motion. The sqrt(t)-term is proportional to the surface to volume ratio. The t-term is related to permeability and curvature. The short time expansion can be measured with two approaches in NMR-based diffusion experiments: First, by the use of diffusion encodings of short total duration and, second, by application of oscillating gradients of long total duration. For oscillating gradients, the inverse of the oscillation frequency becomes the relevant time scale. The purpose of this manuscript is to show that the oscillating gradient approach is blind to the t-term. On the one hand, this prevents fitting of permeability and curvature measures from this term. On the other hand, the t-term does not bias the determination of the sqrt(t)-term in experiments.
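For reference, the short-time expansion discussed above is commonly quoted in three dimensions in the following form (the 4/(9 sqrt(pi)) coefficient is the standard Mitra result; the linear coefficient is collected here into a generic constant):

```latex
\frac{D(t)}{D_0} \;=\; 1 \;-\; \frac{4}{9\sqrt{\pi}}\,\frac{S}{V}\,\sqrt{D_0 t}
\;+\; c_1\, t \;+\; \mathcal{O}\!\left(t^{3/2}\right),
```

where $S/V$ is the surface-to-volume ratio of the restricting geometry and $c_1$ collects the permeability and curvature contributions. It is precisely this $c_1 t$ term that, per the abstract, the oscillating-gradient approach cannot see.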
Rollins, Brandi Y.; Loken, Eric; Savage, Jennifer S.; Birch, Leann L.
2014-01-01
Parents’ use of restrictive feeding practices is counterproductive, increasing children’s intake of restricted foods and risk for excessive weight gain. The aims of this research were to replicate Fisher and Birch’s (1999b) original findings that short-term restriction increases preschool children’s (3–5 y) selection, intake, and behavioral response to restricted foods, and to identify characteristics of children who were more susceptible to the negative effects of restriction. The experiment used a within-subjects design; 37 children completed the food reinforcement task and heights/weights were measured. Parents reported on their use of restrictive feeding practices and their child’s inhibitory control and approach. Overall, the findings replicated those of Fisher and Birch (1999b) and revealed that the effects of restriction differed by children’s regulatory and appetitive tendencies. Greater increases in intake in response to restriction were observed among children lower in inhibitory control, higher in approach, who found the restricted food highly reinforcing, and who had previous experience with parental use of restriction. Results confirm that parents’ use of restriction does not moderate children’s consumption of these foods, particularly among children with lower regulatory or higher appetitive tendencies. PMID:24511616
Carnevale, John T; Kagan, Raanan; Murphy, Patrick J; Esrick, Josh
2017-04-01
Despite the federal prohibition against marijuana, state-level recreational use appears to be moving forward. Public opinion is shifting. Following well-publicized state-legalization in Washington and Colorado, states across the US have begun considering similar measures. Since the 2016 election, over 21% of Americans now live in places where recreational marijuana is state-legal, and over 63% of the country permits medical or recreational use at the state level. This paper does not consider whether states should legalize marijuana nor does it weigh all regulatory options available to states. Instead, it considers how states can create a practical framework to regulate recreational marijuana, particularly in a climate of federal uncertainty where marijuana remains illegal. We draw lessons from Colorado and Washington-assuming that other states will adopt similar models and employ commercial, for-profit systems. Considering both the variety of goals that states could adopt and how they interact, we offer recommendations in five areas: cultivation, production, and processing; sale, consumption, and possession; taxes and finance; public health and safety; and governance. We recommend that states implement a relatively restrictive regulatory approach, with a single market for recreational and medical marijuana, if appropriate. This should make marijuana laws easier to enforce, help reduce diversion, and satisfy federal guidance. Moreover, drawing from Colorado and Washington's experience, we suggest a flexible system with robust data collection and performance monitoring that supports a thorough evaluation. This should allow states to "learn as they go"-a must, given the uncertainty surrounding such policy shifts. Of course, a tightly regulated approach will have drawbacks-including a significant illegal market. But political experience teaches that states will be better off loosening a tight market than attempting to tighten a loose one. 
We also consider a potential role for the federal government under the status quo. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Hossain, F.; Iqbal, N.; Lee, H.; Muhammad, A.
2015-12-01
When it comes to building durable capacity for implementing state-of-the-art technology and earth observation (EO) data for improved decision making, it has long been recognized that a unidirectional approach (from research to application) often does not work. Co-design of the capacity building effort has recently been recommended as a better alternative. This approach is a two-way street in which scientists and stakeholders engage intimately along the entire chain of actions, from the design of research experiments to the packaging of decision making tools, with each party providing an equal amount of input. Scientists execute research experiments based on boundary conditions and outputs that stakeholders define as tangible for decision making. On the other hand, decision making tools are packaged by stakeholders, with scientists ensuring that the application-specific science is relevant. In this talk, we will present one such iterative capacity building approach that we have implemented for gravimetry-based satellite (GRACE) EO data for improved groundwater management in Pakistan. We call our approach a hybrid approach, where the initial step is a forward model involving a conventional short-term (3 day) capacity building workshop in the stakeholder environment addressing a very large audience. In this forward model, the net is cast wide to 'shortlist' a set of highly motivated stakeholder agency staff, who are then engaged more directly in 1-1 training. In the next step (the backward model), these shortlisted staff are brought back into the research environment of the scientists (supply) for 1-1 and long-term (6 months) intense brainstorming, training, and design of decision making tools.
The advantage of this backward model is that it gives scientists a much better understanding of the ground conditions and hurdles involved in making an EO-based scientific innovation work for a specific decision making problem, an understanding that is fundamentally impossible to achieve in conventional training workshops. We demonstrate here our experience of implementing this hybrid model to build capacity for groundwater management at the Pakistan Council of Research in Water Resources (PCRWR), with the ultimate goal of empowering national agencies to monitor groundwater storage changes independently from satellites.
The Concept of Experience by John Dewey Revisited: Conceiving, Feeling and "Enliving"
ERIC Educational Resources Information Center
Hohr, Hansjorg
2013-01-01
"The concept of experience by John Dewey revisited: conceiving, feeling and 'enliving'." Dewey takes a few steps towards a differentiation of the concept of experience, such as the distinction between primary and secondary experience, or between ordinary (partial, raw, primitive) experience and complete, aesthetic experience. However, he does not…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Richard R. Schultz; Paul D. Bayless; Richard W. Johnson
2010-09-01
The Oregon State University (OSU) High Temperature Test Facility (HTTF) is an integral experimental facility that will be constructed on the OSU campus in Corvallis, Oregon. The HTTF project was initiated by the U.S. Nuclear Regulatory Commission (NRC) on September 5, 2008 as Task 4 of the 5-year High Temperature Gas Reactor Cooperative Agreement via NRC Contract 04-08-138. Until August 2010, when a DOE contract was initiated to fund additional capabilities for the HTTF project, all of the funding support for the HTTF was provided by the NRC via their cooperative agreement. The U.S. Department of Energy (DOE) began their involvement with the HTTF project in late 2009 via the Next Generation Nuclear Plant (NGNP) project. Because the NRC interests in HTTF experiments were centered only on the depressurized conduction cooldown (DCC) scenario, NGNP involvement focused on expanding the experimental envelope of the HTTF to include steady-state operations and also the pressurized conduction cooldown (PCC). Since DOE has incorporated the HTTF as an ingredient in the NGNP thermal-fluids validation program, several important outcomes should be noted: 1. The reference prismatic reactor design that serves as the basis for scaling the HTTF became the modular high temperature gas-cooled reactor (MHTGR). The MHTGR has also been chosen as the reference design for all of the other NGNP thermal-fluid experiments. 2. The NGNP validation matrix is being planned using the same scaling strategy that has been implemented to design the HTTF, i.e., the hierarchical two-tiered scaling methodology developed by Zuber in 1991. Using this approach, a preliminary validation matrix has been designed that integrates the HTTF experiments with the other experiments planned for the NGNP thermal-fluids verification and validation project. 3.
Initial analyses showed that the inherent power capability of the OSU infrastructure, which only allowed a total operational facility power capability of 0.6 MW, is inadequate to permit steady-state operation at reasonable conditions. 4. To enable the HTTF to operate at more representative steady-state conditions, DOE recently allocated funding via a DOE subcontract to HTTF to permit an OSU infrastructure upgrade such that 2.2 MW will become available for HTTF experiments. 5. Analyses have been performed to study the relationship between the HTTF and the MHTGR via the hierarchical two-tiered scaling methodology, which has been used successfully in the past, e.g., in scaling the APEX facility to the Westinghouse AP600 plant. These analyses have focused on the relationship between key variables that will be measured in the HTTF and the counterpart variables in the MHTGR, with a focus on natural circulation, using nitrogen as a working fluid, and core heat transfer. 6. Both RELAP5-3D and computational fluid dynamics (CD-Adapco's STAR-CCM+) numerical models of the MHTGR and the HTTF have been constructed, and analyses are underway to study the relationship between the reference reactor and the HTTF. The HTTF is presently being designed. It has a ¼-scale relationship to the MHTGR in both height and diameter. Decisions have been made to design the reactor cavity cooling system (RCCS) simulation as a boundary condition for the HTTF to ensure that (a) the boundary condition is well defined and (b) the boundary condition can be modified easily to achieve the desired heat transfer sink for HTTF experimental operations.
A Safer, Discovery-Based Nucleophilic Substitution Experiment
ERIC Educational Resources Information Center
Horowitz, Gail
2009-01-01
A discovery-based nucleophilic substitution experiment is described in which students compare the reactivity of chloride and iodide ions in an S[subscript N]2 reaction. This experiment improves upon the well-known "Competing Nucleophiles" experiment in that it does not involve the generation of hydrogen halide gas. The experiment also introduces…
A simple approach to lifetime learning in genetic programming-based symbolic regression.
Azad, Raja Muhammad Atif; Ryan, Conor
2014-01-01
Genetic programming (GP) coarsely models natural evolution to evolve computer programs. Unlike in nature, where individuals can often improve their fitness through lifetime experience, the fitness of GP individuals generally does not change during their lifetime, and there is usually no opportunity to pass on acquired knowledge. This paper introduces the Chameleon system to address this discrepancy and augment GP with lifetime learning by adding a simple local search that operates by tuning the internal nodes of individuals. Although not the first attempt to combine local search with GP, its simplicity means that it is easy to understand and cheap to implement. A simple cache is added which leverages the local search to reduce the tuning cost to a small fraction of the expected cost, and we provide a theoretical upper limit on the maximum tuning expense given the average tree size of the population and show that this limit grows very conservatively as the average tree size of the population increases. We show that Chameleon uses available genetic material more efficiently by exploring more actively than with standard GP, and demonstrate that not only does Chameleon outperform standard GP (on both training and test data) over a number of symbolic regression type problems, it does so by producing smaller individuals and it works harmoniously with two other well-known extensions to GP, namely, linear scaling and a diversity-promoting tournament selection method.
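The lifetime-learning idea described above, tuning the internal nodes of an otherwise fixed individual, can be illustrated with a minimal greedy local search over the operators at internal nodes of an expression tree. The tree encoding, operator set, and target data below are illustrative assumptions, not Chameleon's actual implementation.

```python
import operator

# Operator set available at internal nodes (an assumption for this sketch).
OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}

def evaluate(node, x):
    """Evaluate a tree given as nested tuples; 'x' is the input variable."""
    if node == 'x':
        return x
    if isinstance(node, (int, float)):
        return node
    op, left, right = node
    return OPS[op](evaluate(left, x), evaluate(right, x))

def error(tree, data):
    """Sum of squared errors over (x, y) training pairs."""
    return sum((evaluate(tree, x) - y) ** 2 for x, y in data)

def tune(tree, data):
    """Lifetime learning as greedy local search: bottom-up, replace each
    internal operator with whichever alternative minimises training error."""
    if tree == 'x' or isinstance(tree, (int, float)):
        return tree
    _, left, right = tree
    left, right = tune(left, data), tune(right, data)
    best = min(OPS, key=lambda o: error((o, left, right), data))
    return (best, left, right)

data = [(x, x * x + 1) for x in range(-3, 4)]   # target function: x^2 + 1
start = ('+', ('+', 'x', 'x'), 1)               # "evolved" tree: 2x + 1
tuned = tune(start, data)                       # operators re-tuned in place
```

With this toy setup the inner `+` is swapped for `*`, turning `2x + 1` into `x^2 + 1` without changing the tree's shape, which mirrors how tuning lets GP exploit existing genetic material rather than waiting for crossover to find it.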
DOE Research and Development Accomplishments Videos
The integration of Human Factors (HF) in the SAR process training course text
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryan, T.G.
1995-03-01
This text provides the technical basis for a two-day course on human factors (HF), as applied to the Safety Analysis Report (SAR) process. The overall objective of this text and course is to provide the participant with a working knowledge of human factors-related requirements, suggestions for doing a human safety analysis applying a graded approach, and an ability to demonstrate, using the results of the human safety analysis, that human factors elements as defined by DOE (human factors engineering, procedures, training, oversight, staffing, qualifications) can support, wherever necessary, nuclear safety commitments in the SAR. More specifically, the objectives of the text and course are: (1) to provide the SAR preparer with general guidelines for doing HF within the context of a graded approach for the SAR; (2) to sensitize DOE facility managers and staff, safety analysts and SAR preparers, independent reviewers, and DOE reviewers and regulators to DOE Order 5480.23 requirements for HF in the SAR; (3) to provide managers, analysts, reviewers and regulators with a working knowledge of HF concepts and techniques within the context of a graded approach for the SAR; and (4) to provide SAR managers and DOE reviewers and regulators with general guidelines for monitoring and coordinating the work of preparers of HF inputs throughout the SAR process, and for making decisions regarding the safety relevance of HF inputs to the SAR. As a ready reference for implementing the human factors requirements of DOE Order 5480.22 and DOE Standard 3009-94, this course text and the accompanying two-day course are intended for all persons who are involved in the SAR.
3D Visual Data-Driven Spatiotemporal Deformations for Non-Rigid Object Grasping Using Robot Hands
Mateo, Carlos M.; Gil, Pablo; Torres, Fernando
2016-01-01
Sensing techniques are important for solving problems of uncertainty inherent to intelligent grasping tasks. The main goal here is to present a visual sensing system based on range imaging technology for robot manipulation of non-rigid objects. Our proposal provides a suitable visual perception system for complex grasping tasks, supporting a robot controller when other sensor systems, such as tactile and force, are not able to obtain useful data relevant to the grasping manipulation task. In particular, a new visual approach based on RGBD data was implemented to help a robot controller carry out intelligent manipulation tasks with flexible objects. The proposed method supervises the interaction between the grasped object and the robot hand in order to avoid poor contact between the fingertips and an object when there is neither force nor pressure data. This new approach is also used to measure changes to the shape of an object's surfaces, allowing us to find deformations caused by inappropriate pressure applied by the hand's fingers. Tests were carried out for grasping tasks involving several flexible household objects with a multi-fingered robot hand working in real time. Our approach generates pulses from the deformation detection method and sends an event message to the robot controller when surface deformation is detected. In comparison with other methods, the obtained results reveal that our visual pipeline does not require deformation models of objects and materials, and that the approach works well with both planar and 3D household objects in real time. In addition, our method does not depend on the pose of the robot hand, because the location of the reference system is computed from a recognition process of a pattern located at the robot forearm. The presented experiments demonstrate that the proposed method accomplishes good monitoring of grasping tasks with several objects and different grasping configurations in indoor environments. PMID:27164102
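The event-driven deformation check can be illustrated with a toy depth-differencing sketch: compare the current depth image of the grasped surface against a reference and raise an event when the change exceeds a threshold. The function name, threshold value, and array sizes are assumptions for illustration, not the authors' pipeline, which works on full RGBD surface reconstructions.

```python
import numpy as np

def deformation_event(depth_ref, depth_now, threshold=0.005):
    """Return True when the mean absolute depth change (in metres)
    between the reference and current surface exceeds `threshold`."""
    return float(np.mean(np.abs(depth_now - depth_ref))) > threshold

ref = np.full((4, 4), 0.30)                      # flat reference surface at 30 cm
flat = ref + 0.001                               # small sensor noise, no deformation
dented = ref.copy()
dented[1:3, 1:3] -= 0.04                         # pressed-in region under a fingertip
```

Calling `deformation_event(ref, flat)` stays below the threshold, while `deformation_event(ref, dented)` fires, which is the point at which the real system would send its event message to the robot controller.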
Personalized medicine in multiple sclerosis.
Giovannoni, Gavin
2017-11-01
The therapeutic approach in multiple sclerosis (MS) requires a personalized medicine frame beyond the precision medicine concept, which is not currently implementable due to the lack of robust biomarkers and detailed understanding of MS pathogenesis. Personalized medicine demands a patient-focused approach, with disease taxonomy informed by characterization of pathophysiological processes. Important questions concerning MS taxonomy are: when does MS begin? When does the progressive phase begin? Is MS really two or three diseases? Does a therapeutic window truly exist? Newer evidence points to a disease spectrum and a therapeutic lag of several years for benefits to be observed from disease-modifying therapy. For personalized treatment, it is important to ascertain disease stage and any worsening of focal inflammatory lesions over time.
Subliminal access to abstract face representations does not rely on attention.
Harry, Bronson; Davis, Chris; Kim, Jeesun
2012-03-01
The present study used masked repetition priming to examine whether face representations can be accessed without attention. Two experiments using a face recognition task (fame judgement) presented masked repetition and control primes in spatially unattended locations prior to target onset. Experiment 1 (n=20) used the same images as primes and as targets and Experiment 2 (n=17) used different images of the same individual as primes and targets. Repetition priming was observed across both experiments regardless of whether spatial attention was cued to the location of the prime. Priming occurred for both famous and non-famous targets in Experiment 1 but was only reliable for famous targets in Experiment 2, suggesting that priming in Experiment 1 indexed access to view-specific representations whereas priming in Experiment 2 indexed access to view-invariant, abstract representations. Overall, the results indicate that subliminal access to abstract face representations does not rely on attention. Copyright © 2011 Elsevier Inc. All rights reserved.
Does power corrupt or enable? When and why power facilitates self-interested behavior.
DeCelles, Katherine A; DeRue, D Scott; Margolis, Joshua D; Ceranic, Tara L
2012-05-01
Does power corrupt a moral identity, or does it enable a moral identity to emerge? Drawing from the power literature, we propose that the psychological experience of power, although often associated with promoting self-interest, is associated with greater self-interest only in the presence of a weak moral identity. Furthermore, we propose that the psychological experience of power is associated with less self-interest in the presence of a strong moral identity. Across a field survey of working adults and in a lab experiment, individuals with a strong moral identity were less likely to act in self-interest, yet individuals with a weak moral identity were more likely to act in self-interest, when subjectively experiencing power. Finally, we predict and demonstrate an explanatory mechanism behind this effect: The psychological experience of power enhances moral awareness among those with a strong moral identity, yet decreases the moral awareness among those with a weak moral identity. In turn, individuals' moral awareness affects how they behave in relation to their self-interest. (PsycINFO Database Record (c) 2012 APA, all rights reserved).
Simulating Astrophysical Jets with Inertial Confinement Fusion Machines
NASA Astrophysics Data System (ADS)
Blue, Brent
2005-10-01
Large-scale directional outflows of supersonic plasma, also known as `jets', are ubiquitous phenomena in astrophysics. The traditional approach to understanding such phenomena is through theoretical analysis and numerical simulations. However, theoretical analysis might not capture all the relevant physics and numerical simulations have limited resolution and fail to scale correctly in Reynolds number and perhaps other key dimensionless parameters. Recent advances in high energy density physics using large inertial confinement fusion devices now allow controlled laboratory experiments on macroscopic volumes of plasma of direct relevance to astrophysics. This talk will present an overview of these facilities as well as results from current laboratory astrophysics experiments designed to study hydrodynamic jets and Rayleigh-Taylor mixing. This work is performed under the auspices of the U. S. DOE by Lawrence Livermore National Laboratory under Contract No. W-7405-ENG-48, Los Alamos National Laboratory under Contract No. W-7405-ENG-36, and the Laboratory for Laser Energetics under Contract No. DE-FC03-92SF19460.
Modeling Supernova Shocks with Intense Lasers.
NASA Astrophysics Data System (ADS)
Blue, Brent
2006-04-01
Large-scale directional outflows of supersonic plasma are ubiquitous phenomena in astrophysics, with specific application to supernovae. The traditional approach to understanding such phenomena is through theoretical analysis and numerical simulations. However, theoretical analysis might not capture all the relevant physics and numerical simulations have limited resolution and fail to scale correctly in Reynolds number and perhaps other key dimensionless parameters. Recent advances in high energy density physics using large inertial confinement fusion devices now allow controlled laboratory experiments on macroscopic volumes of plasma of direct relevance to astrophysics. This talk will present an overview of these facilities as well as results from current laboratory astrophysics experiments designed to study hydrodynamic jets and Rayleigh-Taylor mixing. This work is performed under the auspices of the U. S. DOE by Lawrence Livermore National Laboratory under Contract No. W-7405-ENG-48, Los Alamos National Laboratory under Contract No. W-7405-ENG-36, and the Laboratory for Laser Energetics under Contract No. DE-FC03-92SF19460.
Design of large format commercial display holograms
NASA Astrophysics Data System (ADS)
Perry, John F. W.
1989-05-01
Commercial display holography is approaching a critical stage where the ability to compete with other graphic media will dictate its future. Factors involved will be cost, technical quality and, in particular, design. The tenuous commercial success of display holography has relied heavily on its appeal to an audience with little or no previous experience in the medium. Well designed images were scarce, leading many commercial designers to avoid holography. As the public became more accustomed to holograms, the excitement dissipated, leaving a need for strong visual design if the medium is to survive in this marketplace. Drawing on the vast experience of TV, rock music and magazine advertising, competitive techniques such as video walls, mural duratrans, laser light shows and interactive videos attract a professional support structure far greater than does holography. This paper will address design principles developed at Holographics North for large format commercial holography. Examples will be drawn from a number of foreign and domestic corporate trade exhibitions. Recommendations will also be made on how to develop greater awareness of a holographic design.
A variant of the anomaly initialisation approach for global climate forecast models
NASA Astrophysics Data System (ADS)
Volpi, Danila; Guemas, Virginie; Doblas-Reyes, Francisco; Hawkins, Ed; Nichols, Nancy; Carrassi, Alberto
2014-05-01
This work presents a refined method of anomaly initialisation (AI) applied to the ocean and sea ice components of the global climate forecast model EC-Earth, with the following particularities: - the use of a weight on the anomalies, in order to avoid introducing overly large anomalies from the observed state, whose amplitude falls outside the range of the internal variability generated by the model; - the AI of the temperature and density ocean state variables instead of the temperature and salinity. Results show that the use of such refinements improves the skill over the Arctic region, parts of the North and South Atlantic, parts of the North and South Pacific, and the Mediterranean Sea. In the Tropical Pacific, the full-field initialised experiment performs better. This is probably due to a displacement of the observed anomalies caused by the use of the AI technique. Furthermore, preliminary results of an anomaly nudging experiment are discussed.
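In its simplest form, the weighted-anomaly idea amounts to initialising each field with the model climatology plus a damped observed anomaly: x_init = x_model_clim + w·(x_obs − x_obs_clim), with w = 1 recovering standard AI. The field values and weight below are illustrative only, not EC-Earth data.

```python
import numpy as np

def anomaly_init(obs, obs_clim, model_clim, w=1.0):
    """Initial state = model climatology plus a weighted observed anomaly.
    w < 1 damps anomalies that would exceed the model's internal variability."""
    return model_clim + w * (obs - obs_clim)

obs        = np.array([14.2, 15.1, 13.8])   # observed state (e.g. SST, degC)
obs_clim   = np.array([14.0, 14.8, 14.0])   # observed climatology
model_clim = np.array([13.5, 14.5, 13.7])   # model climatology

full   = anomaly_init(obs, obs_clim, model_clim, w=1.0)   # standard AI
damped = anomaly_init(obs, obs_clim, model_clim, w=0.5)   # weighted variant
```

The damped initial state stays closer to the model's own attractor, which is the motivation the abstract gives for weighting the anomalies.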
Nikolopoulou, Marialena
2011-06-01
A review of the various approaches to understanding outdoor thermal comfort is presented. The emphasis on field surveys from around the world, particularly across Europe, enables us to understand thermal perception and evaluate outdoor thermal comfort conditions. The consistently low correlations, internationally, between objective microclimatic variables and subjective thermal sensation and comfort outdoors suggest that thermophysiology alone does not adequately describe these relationships. Focusing on the concept of adaptation, the review explains how adaptation influences outdoor comfort, enabling us to inhabit and derive satisfaction from outdoor spaces throughout the year. Beyond acclimatization and behavioral adaptation, through adjustments in clothing and changes in metabolic heat, psychological adaptation plays a critical role in ensuring thermal comfort and satisfaction with the outdoor environment. Such parameters include recent experiences and expectations; personal choice and perceived control, which matter more than whether that control is actually exercised; and the need for positive environmental stimulation, suggesting that thermal neutrality is not a prerequisite for thermal comfort. Ultimately, enhancing environmental diversity can influence thermal perception and the experience of open spaces.
Behavioral economics. Avoiding overhead aversion in charity.
Gneezy, Uri; Keenan, Elizabeth A; Gneezy, Ayelet
2014-10-31
Donors tend to avoid charities that dedicate a high percentage of expenses to administrative and fundraising costs, limiting the ability of nonprofits to be effective. We propose a solution to this problem: Use donations from major philanthropists to cover overhead expenses and offer potential donors an overhead-free donation opportunity. A laboratory experiment testing this solution confirms that donations decrease when overhead increases, but only when donors pay for overhead themselves. In a field experiment with 40,000 potential donors, we compared the overhead-free solution with other common uses of initial donations. Consistent with prior research, informing donors that seed money has already been raised increases donations, as does a $1:$1 matching campaign. Our main result, however, clearly shows that informing potential donors that overhead costs are covered by an initial donation significantly increases the donation rate by 80% (or 94%) and total donations by 75% (or 89%) compared with the seed (or matching) approach. Copyright © 2014, American Association for the Advancement of Science.
Script-independent text line segmentation in freestyle handwritten documents.
Li, Yi; Zheng, Yefeng; Doermann, David; Jaeger, Stefan; Li, Yi
2008-08-01
Text line segmentation in freestyle handwritten documents remains an open document analysis problem. Curvilinear text lines and small gaps between neighboring text lines present a challenge to algorithms developed for machine printed or hand-printed documents. In this paper, we propose a novel approach based on density estimation and a state-of-the-art image segmentation technique, the level set method. From an input document image, we estimate a probability map, where each element represents the probability that the underlying pixel belongs to a text line. The level set method is then exploited to determine the boundary of neighboring text lines by evolving an initial estimate. Unlike connected component based methods ( [1], [2] for example), the proposed algorithm does not use any script-specific knowledge. Extensive quantitative experiments on freestyle handwritten documents with diverse scripts, such as Arabic, Chinese, Korean, and Hindi, demonstrate that our algorithm consistently outperforms previous methods [1]-[3]. Further experiments show the proposed algorithm is robust to scale change, rotation, and noise.
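The first stage of the approach, estimating a probability map in which each pixel scores how likely it is to lie on a text line, can be sketched as local smoothing of a binarized ink image. This toy version stands in for the paper's density estimator and omits the level set evolution entirely; the window size and normalisation are assumptions.

```python
import numpy as np

def line_probability_map(binary_img, window=3):
    """Smooth binarized ink density with a separable box filter so that
    pixels near dense ink (i.e. on a text line) receive high scores."""
    kernel = np.ones(window) / window
    img = binary_img.astype(float)
    # smooth along each row, then along each column
    smoothed = np.apply_along_axis(lambda r: np.convolve(r, kernel, 'same'), 1, img)
    smoothed = np.apply_along_axis(lambda c: np.convolve(c, kernel, 'same'), 0, smoothed)
    peak = smoothed.max()
    return smoothed / peak if peak > 0 else smoothed

# Tiny synthetic "document": two horizontal ink strokes separated by a gap.
img = np.zeros((7, 9), dtype=int)
img[1, 1:8] = 1
img[5, 1:8] = 1
prob = line_probability_map(img)
```

Rows carrying ink score high while the inter-line gap scores near zero; in the full method this map seeds the level set contours that then evolve to separate curvilinear, touching lines without script-specific knowledge.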
Now or later? Understanding the etiologic period of suicide
Vandoros, Sotiris; Kavetsos, Georgios
2015-01-01
Previous research shows that the announcement of austerity measures leads to an immediate and short-lived increase in behaviour that demonstrates anxiety, stress, frustration and other mental effects. This paper uses evidence from the same natural experiment to study whether, for a given decision to commit suicide (as documented by the overall increase over the study period), suicides follow immediately after the announcement of austerity measures in Greece, or whether this is an effect that matures in people's minds before being transformed into action. We use evidence from a natural experiment and follow an econometric approach. Our findings show that, despite an overall sharp increase in suicides over the study period, the increase does not follow immediately in the first few days after each such negative event. This suggests that suicides are not spontaneous; they are, rather, decisions that take time to mature. This time lag implies that suicides arguably attributed to recessions are, in principle, preventable, and underlines the importance of mental health services. PMID:26844154
Dorandeu, Fr; Lallement, G
2003-11-01
Toxicity assessment and demonstration of the innocuousness of chemical compounds have been part of research studies conducted in the fields of pharmacy, agriculture and the chemical industry for years. Acute systemic toxicity studies are an important element of safety evaluation. They remain compulsory for regulatory purposes and important for a public opinion that no longer accepts such risks. Evolving ethics in animal experimentation call for a necessary reduction in the number of animals involved in this type of experiment, following the well-known three Rs principle of Russell and Burch (1959) (reduction, refinement and replacement). These two views seem contradictory. Using the example of acute toxicity testing, and focusing on the now much-criticized lethal dose 50 (LD50) parameter, we present approaches, including statistical ones, that a toxicologist can use, when free to choose, to keep conducting the indispensable in vivo studies while abiding by ethical recommendations.
Identification and assessment of intimate partner violence in nurse home visitation.
Jack, Susan M; Ford-Gilboe, Marilyn; Davidov, Danielle; MacMillan, Harriet L
2017-08-01
To develop strategies for the identification and assessment of intimate partner violence in a nurse home visitation programme. Nurse home visitation programmes have been identified as an intervention for preventing child abuse and neglect. Recently, there has been an increased focus on the role these programmes have in addressing intimate partner violence. Given the unique context of the home environment, strategies for assessments are required that maintain the therapeutic alliance and minimise client attrition. A qualitative case study. A total of four Nurse-Family Partnership agencies were engaged in this study. Purposeful samples of nurses (n = 32), pregnant or parenting mothers who had self-disclosed experiences of abuse (n = 26) and supervisors (n = 5) participated in this study. A total of 10 focus groups were completed with nurses, along with 42 interviews with clients and 10 interviews with supervisors. The principles of conventional content analysis guided data analysis. Data were categorised using the practice-problem-needs analysis model for integrating qualitative findings in the development of nursing interventions. Multiple opportunities to ask about intimate partner violence are valued. The use of structured screening tools at enrolment does not promote disclosure or in-depth exploration of women's experiences of abuse. Women are more likely to discuss experiences of violence when nurses initiate nonstructured discussions focused on parenting, safety or healthy relationships. Nurses require knowledge and skills to initiate indicator-based assessments when exposure to abuse is suspected, as well as strategies for responding to client-initiated disclosures. A tailored approach to intimate partner violence assessment in home visiting is required. Multiple opportunities for exploring women's experiences of violence are required. A clinical pathway outlining a three-pronged approach to identification and assessment was developed. © 2016 John Wiley & Sons Ltd.
ERIC Educational Resources Information Center
Lourenço, Fernando; Sappleton, Natalie; Cheng, Ranis
2015-01-01
The authors examined the following questions: Does gender influence the ethicality of enterprise students to a greater extent than it does nascent entrepreneurs? If this is the case, then is it due to factors associated with adulthood such as age, work experience, marital status, and parental status? Sex-role socialization theory and moral…
Van Bockstal, Pieter-Jan; Mortier, Séverine Thérèse F C; Corver, Jos; Nopens, Ingmar; Gernaey, Krist V; De Beer, Thomas
2018-02-01
Pharmaceutical batch freeze-drying is commonly used to improve the stability of biological therapeutics. The primary drying step is regulated by the dynamic settings of the adaptable process variables, shelf temperature T_s and chamber pressure P_c. Mechanistic modelling of the primary drying step yields the optimal dynamic combination of these adaptable process variables as a function of time. According to Good Modelling Practices, a Global Sensitivity Analysis (GSA) is essential for appropriate model building. In this study, both a regression-based and a variance-based GSA were conducted on a validated mechanistic primary drying model to estimate the impact of several model input parameters on two output variables: the product temperature at the sublimation front T_i and the sublimation rate ṁ_sub. T_s was identified as the most influential parameter on both T_i and ṁ_sub, followed by P_c and the dried product mass transfer resistance R_p for T_i and ṁ_sub, respectively. The GSA findings were experimentally validated for ṁ_sub via a Design of Experiments (DoE) approach. The results indicated that GSA is a very useful tool for evaluating the impact of different process variables on the model outcome, leading to essential process knowledge without the need for time-consuming experiments (e.g., DoE). Copyright © 2017 Elsevier B.V. All rights reserved.
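The regression-based GSA mentioned in the abstract can be sketched as follows. This is a minimal illustration of the general technique (standardized regression coefficients), not the study's validated mechanistic model: the toy response below, its inputs, ranges, and coefficients are all hypothetical.

```python
import numpy as np

# Hedged sketch of a regression-based GSA on a made-up stand-in for a
# primary-drying model. Inputs: shelf temperature Ts, chamber pressure
# Pc, and product resistance Rp; output: a toy "sublimation rate".
rng = np.random.default_rng(0)
n = 5000
Ts = rng.uniform(-40.0, 0.0, n)   # shelf temperature [degC], hypothetical range
Pc = rng.uniform(5.0, 30.0, n)    # chamber pressure [Pa], hypothetical range
Rp = rng.uniform(1.0, 5.0, n)     # product resistance [arbitrary units]

# Toy response, NOT the mechanistic model from the study.
m_sub = 0.8 * Ts - 0.3 * Pc - 5.0 * Rp + rng.normal(0.0, 0.5, n)

# Standardize inputs and output, then fit a linear model; the
# coefficients (SRCs) rank the inputs by influence on the output.
X = np.column_stack([Ts, Pc, Rp])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
ys = (m_sub - m_sub.mean()) / m_sub.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)

ranking = sorted(zip(["Ts", "Pc", "Rp"], np.abs(src)), key=lambda t: -t[1])
print(ranking)  # inputs ordered from most to least influential
```

In this toy setup the shelf temperature dominates the ranking, mirroring the qualitative finding reported in the abstract; a variance-based GSA (e.g., Sobol indices) would additionally capture interaction effects that a linear regression misses.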
Basu, Anindya; Leong, Susanna Su Jan
2012-02-03
The Hepatitis B Virus X (HBx) protein is a potential therapeutic target for the treatment of hepatocellular carcinoma. However, consistent expression of the protein as insoluble inclusion bodies in bacterial host systems has largely hindered HBx manufacturing via economical biosynthesis routes, thereby impeding the development of anti-HBx therapeutic strategies. To eliminate this roadblock, this work reports the development of the first 'chromatography refolding'-based bioprocess for HBx using immobilised metal affinity chromatography (IMAC). This process enabled production of HBx at quantities and purity that allow its direct use in structural and molecular characterization studies. In line with the principles of quality by design (QbD), we used a statistical design of experiments (DoE) methodology to design the optimum process, which delivered bioactive HBx at a productivity of 0.21 mg/ml/h and a refolding yield of 54% (at 10 mg/ml refolding concentration), 4.4-fold higher than that achieved by dilution refolding. The systematic DoE methodology adopted for this study yielded important insights into the effects of different bioprocess parameters, such as buffer-exchange gradients, on HBx productivity and quality. Such a bioprocess design approach can play a pivotal role in developing intensified processes for other novel proteins, helping to resolve the validation and speed-to-market challenges faced by the biopharmaceutical industry today. Copyright © 2011 Elsevier B.V. All rights reserved.
Ghinet, Alina; Zehani, Yasmine; Lipka, Emmanuelle
2017-10-25
Two routes to the manufacture of an unprecedented stereoisomeric combretastatin A-4 analogue are described: flash chromatography versus supercritical fluid chromatography. The latter has many advantages over liquid chromatography and was therefore chosen for the small-scale separation of methyl 1-[(3-hydroxy-4-methoxyphenyl)(3,4,5-trimethoxyphenyl)methyl]-5-oxo-l-prolinate 5, which has potential antitumoral activity. After a screening of six different polysaccharide-based chiral stationary phases and four co-solvents, the percentage of co-solvent, the flow rate and the outlet pressure were optimized through a design of experiments (DoE). The preparation of 50 mg of each stereoisomer was achieved successfully on a Chiralpak AD-H column with isopropanol as the co-solvent. Productivity (kkd), solvent usage and environmental factor (E Factor) were calculated. The flash chromatography and supercritical fluid chromatography approaches were compared in terms of the yield and purity of each stereoisomer manufactured. Copyright © 2017 Elsevier B.V. All rights reserved.
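The kind of factor optimization described in this abstract can be illustrated with a minimal two-level full-factorial design over the three factors it names (co-solvent percentage, flow rate, outlet pressure). Everything numeric below is hypothetical: the factor levels and the response values are invented for illustration and do not come from the study.

```python
import itertools
import numpy as np

# Hedged sketch: 2^3 full-factorial DoE for three chromatography
# factors. Levels and responses are hypothetical placeholders.
factors = {
    "cosolvent_pct": (10.0, 30.0),   # low/high % co-solvent (assumed)
    "flow_mL_min":   (2.0, 4.0),     # low/high flow rate (assumed)
    "outlet_bar":    (100.0, 150.0), # low/high outlet pressure (assumed)
}

# Coded design matrix: every combination of low (-1) and high (+1).
coded = np.array(list(itertools.product((-1, 1), repeat=len(factors))))

# Hypothetical measured response (e.g., resolution) for the 8 runs.
y = np.array([1.1, 1.4, 0.9, 1.3, 1.6, 2.0, 1.5, 1.9])

# Main effect of each factor: mean response at +1 minus mean at -1.
effects = {name: float(y[coded[:, j] == 1].mean()
                       - y[coded[:, j] == -1].mean())
           for j, name in enumerate(factors)}
print(effects)
```

With only eight runs, a design like this estimates all three main effects simultaneously, which is why DoE approaches need far fewer experiments than varying one factor at a time.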
Photoassociation of ultracold LiRb molecules with short pulses near a Feshbach resonance
NASA Astrophysics Data System (ADS)
Gacesa, Marko; Ghosal, Subhas; Byrd, Jason; Côté, Robin
2014-05-01
Ultracold diatomic molecules prepared in the lowest ro-vibrational state are a required first step in many experimental studies aimed at investigating the properties of cold quantum matter. We propose a novel approach to producing such molecules in a two-color photoassociation experiment with short pulses performed near a Feshbach resonance. Specifically, we report the results of a theoretical investigation of the formation of ^6Li^87Rb molecules in a magnetic field. We show that the molecular formation rate can be significantly increased if the pump step is performed near a magnetic Feshbach resonance, owing to the strong coupling between the energetically open and closed hyperfine states. In addition, the dependence of the nodal structure of the total wave function on the magnetic field allows enhanced control over the shape and position of the wave packet. The proposed approach is applicable to other systems that have accessible Feshbach resonances. Partially supported by ARO (MG), DOE (SG), AFOSR (JB), NSF (RC).